Angéline Pouget (@angelinepouget)'s Twitter Profile
Angéline Pouget

@angelinepouget

Research Engineer at Google DeepMind

ID: 1295362055621443586

Link: https://angelinepouget.github.io/ · Joined: 17-08-2020 14:09:28

22 Tweets

131 Followers

174 Following

Lucas Beyer (bl16) (@giffmana)'s Twitter Profile Photo

PSA: Stop pretraining your VLMs on EN-filtered data, even if it improves ImageNet and COCO‼️ Doing so impairs the model's understanding of non-English cultures❗️ I've argued this for years; now we finally publish concrete results for this (imo) intuitively obvious recommendation. 🧾🧶
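For concreteness, a toy sketch of the kind of "EN-filtering" the tweet warns against: keeping only English image–text pairs from a multilingual pretraining set. The `lang` field here is hypothetical metadata, not a real dataset schema.

```python
# Toy illustration of EN-filtering a multilingual caption set.
# The "lang" tags are hypothetical metadata for this sketch only.
pairs = [
    {"caption": "a red bus", "lang": "en"},
    {"caption": "ein roter Bus", "lang": "de"},
    {"caption": "un bus rouge", "lang": "fr"},
]

# EN-filtering: drop every non-English pair before pretraining.
en_only = [p for p in pairs if p["lang"] == "en"]
print(len(en_only))  # → 1  (2 of 3 pairs, and their cultural context, are lost)
```

The tweet's point is that this filter, while it can help English-centric benchmarks like ImageNet and COCO, discards exactly the data that teaches the model about non-English cultures.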

Lucas Beyer (bl16) (@giffmana)'s Twitter Profile Photo

This work was carried out by Angéline Pouget. She's a hidden gem. She's doing her masters now and will be on the PhD/Industry research market in ~6mo. I hope she'll join us :) All authors made significant contributions. It was a long, intense, fun project. arxiv.org/abs/2405.13777

Andreas Steiner (@andreaspsteiner)'s Twitter Profile Photo

🚀🚀PaliGemma 2 is our updated and improved PaliGemma release using the Gemma 2 models and providing new pre-trained checkpoints for the full cross product of {224px,448px,896px} resolutions and {3B,10B,28B} model sizes. 1/7

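The {224px, 448px, 896px} × {3B, 10B, 28B} grid described above is a plain cross product of nine checkpoints. A quick sketch (the checkpoint name format here is illustrative, not the official Hub identifiers):

```python
from itertools import product

# Enumerate the full resolution x size grid from the PaliGemma 2 announcement.
resolutions = ["224px", "448px", "896px"]
sizes = ["3B", "10B", "28B"]

# Hypothetical naming scheme, for illustration only.
checkpoints = [f"paligemma2-{size}-{res}" for res, size in product(resolutions, sizes)]
print(len(checkpoints))  # → 9 pre-trained checkpoints in the full cross product
```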
Ibrahim Alabdulmohsin | إبراهيم العبدالمحسن (@ibomohsin)'s Twitter Profile Photo

Attending #NeurIPS2024? If you're interested in multimodal systems, building inclusive & culturally aware models, and how fractals relate to LLMs, we've 3 posters for you. I look forward to presenting them on behalf of our GDM team @ Zurich & collaborators. Details below (1/4)

Ibrahim Alabdulmohsin | إبراهيم العبدالمحسن (@ibomohsin)'s Twitter Profile Photo

🔥Excited to introduce RINS - a technique that boosts model performance by recursively applying early layers during inference without increasing model size or training compute flops! Not only does it significantly improve LMs, but also multimodal systems like SigLIP. (1/N)

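A minimal sketch of the idea as the tweet describes it: reapply the early part of the layer stack at inference time, adding compute depth without adding parameters. Toy functions stand in for layers here; this is not the paper's implementation.

```python
# Toy sketch of recursive reuse of early layers (RINS-style idea):
# run the first k "layers" r times, then the rest of the stack once.
# No new parameters are introduced; only inference compute grows.
def run_with_recursion(layers, x, k=2, recursions=2):
    for _ in range(recursions):       # reuse the early layers r times
        for f in layers[:k]:
            x = f(x)
    for f in layers[k:]:              # remaining layers applied once
        x = f(x)
    return x

layers = [lambda v: v + 1, lambda v: v * 2, lambda v: v - 3]
# early pass 1: (0+1)*2 = 2; early pass 2: (2+1)*2 = 6; tail: 6-3 = 3
print(run_with_recursion(layers, 0))  # → 3
```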
Michael Tschannen (@mtschannen)'s Twitter Profile Photo

📢2⃣ Yesterday we released SigLIP 2! TL;DR: Improved high-level semantics, localization, dense features, and multilingual capabilities via drop-in replacement for v1. Bonus: Variants supporting native aspect and variable sequence length. A thread with interesting resources👇

Google DeepMind (@googledeepmind)'s Twitter Profile Photo

Think you know Gemini? 🤔 Think again. Meet Gemini 2.5: our most intelligent model 💡 The first release is Pro Experimental, which is state-of-the-art across many benchmarks - meaning it can handle complex problems and give more accurate responses. Try it now →