Alex Holub (@alex_holub) 's Twitter Profile
Alex Holub

@alex_holub

CEO Vidora. Sold to mParticle.

ID: 24138844

Joined: 13-03-2009 06:27:19

948 Tweets

342 Followers

550 Following

Ascend.io (@ascend_io) 's Twitter Profile Photo

"It’s not enough to say, 'Can I create the model on a small set of data?'...It’s often underestimated how big a step it is to go from that sort of local small train model to one that’s deployed and ongoing."~@ShawnAzman from Vidora hubs.li/H0SVVjK0

Yann LeCun (@ylecun) 's Twitter Profile Photo

A clip from our Davos panel in which Daphne Koller and I talk about the saturation of LLM performance, the exhaustion of text data, and the necessity of using sensory data to get AI systems to understand the world. I make the point that training AI systems to understand the world

Yao Fu (@francis_yao_) 's Twitter Profile Photo

Over the last two days after my claim "long context will replace RAG", I have received quite a few criticisms (thanks, really appreciated!) and many of them make a reasonable point. Here I have gathered the major counterarguments and try to address them one by one (feels like

Bojan Tunguz (@tunguz) 's Twitter Profile Photo

Some exciting news - Google has just announced Gemma, a family of lightweight open source LLM models based on the same research and technology as Gemini. The 2B and 7B weights have been released today. 1/4

Min Choi (@minchoi) 's Twitter Profile Photo

Google Gemini 1.5 Pro just changed the game with its massive 1,000,000+ token context size. It doesn't just understand a few pages or images; it can understand several papers, long videos, and an entire codebase 🤯 10 mind-blowing examples:

Jason ✨👾SaaStr 2025 is May 13-15✨ Lemkin (@jasonlk) 's Twitter Profile Photo

If you have:
👉 2 truly great founders, a great CEO and a great CTO
👉 100% committed and get to just
👉 10 paying customers and
👉 you never quit
You will have some sort of success

Peter Yang (@petergyang) 's Twitter Profile Photo

Jensen Huang built Nvidia to $2.2 trillion. I watched his interviews and here's what stands out: Pure Asian dad energy. "I wish upon you ample doses of pain and suffering." Going to give this pep talk to my 5-year-old this weekend. Will let you all know how it goes.

Jason ✨👾SaaStr 2025 is May 13-15✨ Lemkin (@jasonlk) 's Twitter Profile Photo

When the “exit” comes
The money rarely buys happiness
When the IPO comes, it’s great, but
It’s back to work the next day
The happiness you are looking for,
Find it on the journey, too