Boris Dayma πŸ–οΈ (@borisdayma) 's Twitter Profile
Boris Dayma πŸ–οΈ

@borisdayma

πŸ–οΈ Founder of Craiyon
πŸ₯‘ Author of dalle-mini

ID: 506284230

Link: https://www.craiyon.com/ Joined: 27-02-2012 19:26:54

2.2K Tweets

14.14K Followers

352 Following

Boris Dayma πŸ–οΈ (@borisdayma) 's Twitter Profile Photo

Blog posts / Articles are now divided between:
- boilerplate written by ChatGPT
- great content not known yet by ChatGPT (but maybe assisted by it)

ChatGPT will lead to both better content and a massive flood of trash. The challenge for search/AI is to differentiate them.

Evan Walters (@evaninwords):

Erfanzar's JAX deep learning library EasyDel just got a huge upgrade! Flax NNX, base trainer, custom kernels, quantization, vision model training, and more πŸ”₯ Can't wait to try it out! github.com/erfanzar/EasyD…

Boris Dayma πŸ–οΈ (@borisdayma) 's Twitter Profile Photo

Caught up on the new SigLIP 2 paper πŸ€“ Cool things I learnt from it:
- finally using a patch size of 16 with image sizes that are multiples of 256! I was annoyed with the weird patch 14 / 224 sizes, which felt like an old ImageNet training-augmentation artifact
- reference of LocCa for pretraining which
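The patch-size gripe above is just grid arithmetic: with a patch size of 16 and image sizes that are multiples of 256, the patch grid comes out clean, whereas the older 14/224 convention only works for that one size. A minimal sketch (function name is my own, for illustration):

```python
# Compute the number of ViT patches for a square image.
# 224/14 and 256/16 both give a 16x16 grid of 256 patches;
# larger multiples of 256 with patch 16 keep dividing evenly.
def num_patches(image_size: int, patch_size: int) -> int:
    assert image_size % patch_size == 0, "image size must be divisible by patch size"
    side = image_size // patch_size
    return side * side

print(num_patches(224, 14))  # 256 (old ImageNet-style setup)
print(num_patches(256, 16))  # 256
print(num_patches(512, 16))  # 1024
```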

Boris Dayma πŸ–οΈ (@borisdayma) 's Twitter Profile Photo

From OpenAI's image model pricing we know the number of tokens per image quality setting:
- small -> 32x32 tokens (same as DALL-E 1)
- medium -> 64x64 tokens
- large -> 128x128 tokens

Doing some tests on the small setting, we can see they have a pretty good image encoder. Also quite

Boris Dayma πŸ–οΈ (@borisdayma) 's Twitter Profile Photo

It's tricky to gauge the level of "intelligence" of an AI model. Sometimes you get pleasantly surprised by it fixing bugs, doing research, etc. Other times it fails at tasks that are simple for humans (count the "r"s in strawberry, make me a comparison table based on these 2 web pages).
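The strawberry task mentioned above is one line of code, which is part of why tokenized LLMs stumbling on it is so striking: the model sees subword tokens, not individual letters.

```python
# Counting the "r"s in "strawberry" is trivial for a program,
# yet hard for models that never see raw characters.
word = "strawberry"
print(word.count("r"))  # 3
```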