
Robert Bamler
@robamler
Professor of Data Science and Machine Learning at @uni_tue, member of @ml4science and Tübingen AI Center.
ID: 397152500
https://robamler.github.io
Joined: 24-10-2011 09:36:42
34 Tweets
230 Followers
25 Following

How do LLMs connect to modern computers, both in their zero-shot problem-solving abilities and in their histories? Our latest blog post provides a fresh perspective on understanding LLMs and the prompting paradigm. Check it out! timx.me/blog/2023/comp… Weiyang Liu Robert Bamler #ChatGPT

The 2nd iteration of the "Neural Compression: From Information Theory to Applications" workshop will take place at the ICML Conference in Hawaii this year! Submissions are due May 27th. For more details: neuralcompression.github.io/workshop23 Berivan Isik Yibo Yang Daniel Severo Dr. Karen Ullrich Robert Bamler @s_mandt

My student Johannes Zenn found a useful fact about differentiable sequential Monte Carlo samplers: you can ignore any gradients due to resampling because they vanish in expectation. Check out his accepted ICLR DEI paper and meet us at the poster on Friday. arxiv.org/abs/2304.14390

There is no need for gradients due to resampling in Differentiable Sequential Monte Carlo Samplers! Check out our recent work (arxiv.org/abs/2304.14390) with Robert Bamler and meet us at the poster on Friday!
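
In code, the trick the paper proves correct can look as simple as the minimal PyTorch sketch below (hypothetical helper names `transition` and `log_potential`; this is not the code from the paper): the sampled ancestor indices are discrete and carry no gradient, and no score-function correction term is added for the resampling distribution.

import torch

def smc_step(particles, log_weights, transition, log_potential):
    # One sequential Monte Carlo step with multinomial resampling.
    # `transition(x)` proposes new particles and `log_potential(x)` scores
    # them; both are assumed callables for this sketch, not names from the paper.
    probs = torch.softmax(log_weights, dim=0)
    idx = torch.multinomial(probs, num_samples=particles.shape[0], replacement=True)
    # `idx` is a discrete LongTensor, so indexing with it contributes no
    # gradient; per the paper's result, the resampling gradients that a
    # score-function estimator would otherwise add vanish in expectation.
    resampled = particles[idx]
    # Gradients flow through the proposal and reweighting steps as usual.
    new_particles = transition(resampled)
    new_log_weights = log_potential(new_particles)
    return new_particles, new_log_weights

# Example: Gaussian random-walk proposal, standard-normal potential.
N, d = 128, 2
x = torch.randn(N, d, requires_grad=True)
lw = torch.zeros(N)
x, lw = smc_step(x, lw,
                 lambda x: x + 0.1 * torch.randn_like(x),
                 lambda x: -0.5 * (x ** 2).sum(dim=1))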

If you're at ICLR, join my student Johannes Zenn at the Tiny Paper poster session today from 1.15 to 3.15 pm in room MH4. You'll be surprised how many insights can fit in a 2-page paper! arxiv.org/abs/2304.14390

🚀Discover how VML enhances interpretability! This is joint work with Tim Xiao, Robert Bamler and Bernhard Schölkopf. 🧵 4/4

How could online learning apps adapt to learners and improve over time? Even if you're not a machine learning expert, Hanqi Zhou's blog post on our latest ICLR paper explains the new approaches in simple terms (joint work with 🐘 Álvaro Tejero-Cantero @[email protected] and Charley Wu (@thecharleywu.bsky.social), supported by Theresa Authaler).
