
Sebastian Goldt
@sebastiangoldt
Theory of neural networks @Sissaschool
ID: 746823020
https://datascience.sissa.it/research-unit/12/theory-of-neural-networks
09-08-2012 08:19:53
293 Tweets
973 Followers
464 Following



At the NeurIPS Conference until Sunday. Come to my poster Wed, 4:30 PM (neurips.cc/virtual/2024/p…). Also reach out if you want to chat about hierarchical data structures and the theory of deep learning!


I just landed in Vancouver to present the findings of our new work at the NeurIPS Conference! Few-shot learning and fine-tuning change the hidden layers of LLMs in dramatically different ways, even when they perform equally well on multiple-choice question-answering tasks. 🧵1/6


New paper with Leon and Erin Grant! Why do we see localized receptive fields so often, even in models without sparsity regularization? We present a theory in the minimal setting of Alessandro Ingrosso and Sebastian Goldt.

Really cool work by Leon, Andrew Saxe and Erin Grant!



🤖🚀 Announcing 🚀🤖
2025 Princeton ML Theory Summer School
August 12-21, 2025 (in person)
Lecturers:
Florent Krzakala (EPFL)
Jianfeng Lu (Duke)
Theodor Misiakiewicz (Yale)
Yue Lu (Harvard)
Yury Polyanskiy (MIT)
Apply by March 31: mlschool.princeton.edu


Really happy to see this paper out, led by Nishil Patel in collaboration with Stefano Sarao Mannelli and Andrew Saxe: we apply the statistical physics toolbox to analyse a simple model of reinforcement learning, and find some cool effects, like a speed-accuracy trade-off for generalisation 🚀