Luca Garello (@lucagarelloit)'s Twitter Profile
Luca Garello

@lucagarelloit

Postdoctoral Researcher at the Italian Institute of Technology.
Passionate about AI and cognitive robotics.
bento.me/lucagarello

ID: 82874350

Joined: 16-10-2009 13:11:36

432 Tweets

167 Followers

354 Following

Yann LeCun (@ylecun)

Current LLMs are trained on text data that would take 20,000 years for a human to read. And still, they haven't learned that if A is the same as B, then B is the same as A. Humans get a lot smarter than that with comparatively little training data. Even corvids, parrots, dogs, …
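The symmetry failure is easy to state as a test. Below is a minimal, hedged sketch of such a probe; `query_model` is a hypothetical stand-in for whatever chat-completion call you use, and the fact pairs are only illustrative, not a benchmark.

```python
# Minimal sketch of a symmetry ("reversal") probe for an LLM.
# `query_model` is a hypothetical stand-in for any chat API call;
# the fact pairs below are illustrative only.
def query_model(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM API call here")

pairs = [
    ("the capital of Australia", "Canberra"),
    ("H2O", "water"),
]

for a, b in pairs:
    # Forward direction: "A is ..." should yield B.
    forward = query_model(f"Complete the sentence: {a} is")
    # Backward direction: "B is ..." should recover A, if the model
    # treats "A is B" as the symmetric relation it logically is.
    backward = query_model(f"Complete the sentence: {b} is")
    print(f"{a} -> {forward!r}")
    print(f"{b} -> {backward!r}")
```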

Andrej Karpathy (@karpathy)

People have too inflated a sense of what it means to "ask an AI" about something. The AIs are language models trained basically by imitation on data from human labelers. Instead of the mysticism of "asking an AI", think of it more as "asking the average data labeler" on the …

@levelsio (@levelsio)

Self-driving cars can react 150x faster than a human driver.

Human driver:
- reaction time: 0.3 seconds
- execution time: 1-2 seconds

Self-driving car:
- reaction time: 0.001 seconds
- execution time: 0.1 seconds
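Plugging the tweet's own numbers into a quick sanity check (a sketch; every figure comes from the tweet, not from measurements) shows where a headline "150x" can land: reaction time alone gives 300x, while the full react-and-execute loop gives roughly 13-23x.

```python
# All figures below are taken directly from the tweet; none are measured data.
human_reaction = 0.3          # seconds
human_execution = (1.0, 2.0)  # seconds, the 1-2 s range in the tweet
car_reaction = 0.001          # seconds
car_execution = 0.1           # seconds

# Reaction time alone: 0.3 / 0.001 = 300x
print(f"reaction-only speedup: {human_reaction / car_reaction:.0f}x")

# Full react-and-execute loop: (0.3 + 1..2) / (0.001 + 0.1) is about 13x-23x
car_total = car_reaction + car_execution
for exec_time in human_execution:
    human_total = human_reaction + exec_time
    print(f"full-loop speedup ({exec_time:.0f} s execution): {human_total / car_total:.0f}x")
```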

Luca Garello (@lucagarelloit)

The biggest breakthroughs in robotics will not come from adding more sensors, motors or complexity. They will come from reimagining what's possible with existing technology.

Harrison Kinsley (@sentdex)

Imagine for a moment that DeepSeek did train on OpenAI outputs. Now what? Does OpenAI have a real IP or copyright claim to model outputs? If so, does OpenAI have a real IP or copyright claim to all the code you've written with it? Actions taken? Businesses built?

Andrej Karpathy (@karpathy)

We're missing (at least one) major paradigm for LLM learning. Not sure what to call it; possibly it has a name - system prompt learning? Pretraining is for knowledge. Finetuning (SL/RL) is for habitual behavior. Both of these involve a change in parameters, but a lot of human …
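For concreteness, here is one hedged sketch of what such a paradigm could look like: learning that lands in an editable system prompt rather than in the weights. `call_llm`, `solve_and_learn`, and the prompts are assumptions for illustration, not anything from the tweet.

```python
# Hedged sketch of one possible reading of "system prompt learning":
# weights stay frozen, and experience accumulates as edits to an explicit,
# human-readable system prompt. `call_llm` is a hypothetical stand-in for
# any chat API; the prompts are illustrative only.
def call_llm(system: str, user: str) -> str:
    raise NotImplementedError("plug in your LLM API call here")

system_prompt = "You are a helpful assistant."

def solve_and_learn(task: str, feedback: str) -> str:
    """Answer a task, then fold the feedback back into the system prompt."""
    global system_prompt
    answer = call_llm(system_prompt, task)
    # Instead of a gradient update, distill the feedback into a short,
    # reusable instruction and append it to the model's own "notes".
    lesson = call_llm(
        "Rewrite the feedback below as one short, general instruction.",
        f"Task: {task}\nAnswer: {answer}\nFeedback: {feedback}",
    )
    system_prompt += f"\n- {lesson}"
    return answer
```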

RBCS & CONTACT IIT (@rbcs_contact)

Look ma, we are on TV 📺 During #RomeCup 2025, Luca Garello was interviewed by the national TV channel Rai2 to describe how infants' development inspires #iCub's learning 👶🤖 Full video here 🔗 raiplay.it/video/2025/05/… or find the excerpt below 🎞️ (🇮🇹 only) IIT

Devanshi (@bavariadevanshi)

LLMs are a type of AI model, but not all AI models are LLMs. Here are eight cutting-edge architectures that extend traditional AI, enhancing understanding, reasoning, and generation across domains and modalities.

Andrej Karpathy (@karpathy)

The race for the LLM "cognitive core": a few-billion-param model that maximally sacrifices encyclopedic knowledge for capability. It lives always-on and by default on every computer as the kernel of LLM personal computing. Its features are slowly crystallizing:
- Natively multimodal …