
Weinan Sun
@sunw37
Neuroscience, Artificial Intelligence, and Beyond.
Assistant professor, Neurobiology and Behavior @CornellNBB
ID: 702739245195096065
25-02-2016 06:17:47
874 Tweets
779 Followers
612 Following

Want to procedurally generate large-scale relational reasoning experiments in natural language to study human psychology 🧠 or eval LLMs 🤖? We have a tool for that! github.com/google-deepmin… Check out Kenneth Marino's thread for some stuff you can do:
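The repo link above is truncated, so as a flavor of what procedurally generated relational reasoning problems look like, here is a minimal, hypothetical Python sketch (my own toy example, not the linked tool's actual API) that builds kinship chains and asks for the composed relation:

```python
import random

# Hypothetical sketch of procedurally generated relational reasoning
# questions (illustrative only; not the linked tool's API). We build a
# random chain of parent-of facts and ask for the composed relation.

NAMES = ["Alice", "Bob", "Carol", "Dave", "Erin", "Frank", "Grace"]
ANCESTOR = {1: "parent", 2: "grandparent", 3: "great-grandparent"}

def generate_question(depth: int, rng: random.Random):
    people = rng.sample(NAMES, depth + 1)
    # Premises form a chain; shuffling them forces multi-hop reasoning.
    facts = [f"{people[i]} is the parent of {people[i + 1]}."
             for i in range(depth)]
    rng.shuffle(facts)
    question = f"What is {people[0]} to {people[-1]}?"
    return " ".join(facts), question, ANCESTOR[depth]

premises, question, answer = generate_question(depth=2, rng=random.Random(0))
print(premises)  # two shuffled parent-of facts
print(question)  # e.g. "What is X to Z?"
print(answer)    # "grandparent"
```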

Chongxi Lai works at the intersection of neuroscience, AI, and brain-machine interfaces. His research focuses on building brain-like models in a simulated environment to test whether cognition can be enhanced through novel AI-assisted closed-loop BMI stimulation algorithms.

Thrilled to announce I've joined Astera Institute's first residency cohort! Excited to collaborate with this amazing team to build technology for a brighter future! I will focus on building and testing brain-like models in large-scale simulations and using AI to enhance them!
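As a rough illustration of what an AI-assisted closed-loop stimulation algorithm involves, here is a minimal, hypothetical Python sketch (a toy example, not the actual system referenced above): decode a latent state from recorded activity and stimulate only when it drifts off target.

```python
import numpy as np

# Hypothetical sketch of a closed-loop BMI stimulation policy
# (a toy example, not the actual algorithm referenced above):
# decode a 2-D latent state from spike counts, then stimulate
# only when the decoded state drifts outside a target region.

rng = np.random.default_rng(0)
decoder = rng.normal(size=(2, 64))   # stand-in linear decoder
target = np.zeros(2)                 # desired latent state
threshold = 1.0                      # allowed deviation before stimulating

def control_step(spike_counts):
    """Return a stimulation vector, or None when no stimulation is needed."""
    error = decoder @ spike_counts - target
    if np.linalg.norm(error) < threshold:
        return None
    return -0.5 * error              # proportional corrective stimulation

for t in range(3):
    spikes = rng.poisson(2.0, size=64).astype(float)  # fake recording
    stim = control_step(spikes)
    print(t, "stimulate" if stim is not None else "no-op")
```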

Our latest study identifies a specific cell type and receptor essential for psilocybin's long-lasting neural and behavioral effects 🍄🔬🧠🐁 Led by Ling-Xiao Shao and Clara Liao. Funded by the NIH National Institute of Mental Health (NIMH). 📄 Read in Nature: nature.com/articles/s4158… 1/12

This preprint is now published in Nature. With current and former DeepMinders Yuval Tassa, Josh Merel, and Matt Botvinick, and my HHMI | Janelia colleagues Roman Vaxenburg, Igor Siwanowicz, Kristin Branson, Michael Reiser, Gwyneth Card, and more.

Cool work from HHMI | Janelia: "cognitive graphs of latent structure." Looks like even more evidence for CSCG-like representations and schemas. (science.org/doi/10.1126/sc…, arxiv.org/abs/2302.07350) biorxiv.org/content/10.110…
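For context, CSCGs (clone-structured cognitive graphs, from the Science and arXiv links above) resolve aliased observations by assigning each observation symbol a block of hidden "clone" states, so the same observation can occupy different latent states depending on context. A minimal Python sketch of that state layout, with sizes and transitions chosen purely for illustration:

```python
import numpy as np

# Minimal sketch of the clone-structured idea behind CSCGs (illustrative,
# not the papers' implementation): each observation symbol owns a block of
# hidden "clone" states. Emissions are fixed by the block structure; only
# transitions between clones would be learned.

n_obs = 4          # distinct observation symbols
n_clones = 3       # clones per symbol
n_states = n_obs * n_clones

# Emission is deterministic: every clone of symbol o always emits o.
state_to_obs = np.repeat(np.arange(n_obs), n_clones)

# Random transition matrix over clone states (rows sum to 1).
rng = np.random.default_rng(0)
T = rng.random((n_states, n_states))
T /= T.sum(axis=1, keepdims=True)

def filter_step(belief, obs):
    """One forward-filtering step: propagate, then mask to obs's clones."""
    belief = belief @ T
    belief = np.where(state_to_obs == obs, belief, 0.0)
    return belief / belief.sum()

belief = np.full(n_states, 1.0 / n_states)
for obs in [0, 2, 0]:       # the same symbol 0 appears in two contexts
    belief = filter_step(belief, obs)
print(belief.round(3))      # mass concentrates on context-specific clones
```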

How does in-context learning emerge in attention models during gradient descent training? Sharing our new ICML Spotlight paper: Training Dynamics of In-Context Learning in Linear Attention (arxiv.org/abs/2501.16265). Led by Yedi Zhang, with Aaditya Singh and Peter Latham.
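For reference, "linear attention" here means attention without the softmax. A minimal Python sketch of a single-head linear attention forward pass (shapes, initialization, and the 1/seq_len normalization are illustrative assumptions; conventions vary, and causal masking is omitted):

```python
import numpy as np

# Minimal sketch of a single-head linear attention forward pass, i.e.
# attention with raw dot-product scores and no softmax. Shapes and
# initialization are illustrative assumptions, not the paper's setup.

rng = np.random.default_rng(0)
seq_len, d_model = 8, 16
X = rng.normal(size=(seq_len, d_model))          # token embeddings

W_q = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
W_k = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
W_v = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)

Q, K, V = X @ W_q, X @ W_k, X @ W_v
# Linear attention: scores are raw dot products; no softmax is applied.
out = (Q @ K.T) @ V / seq_len
print(out.shape)   # (8, 16)
```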

New paper: World models + program synthesis, by Wasu Top Piriyakulkij
1. World modeling on the fly by synthesizing programs with 4000+ lines of code
2. Learns new environments from minutes of experience
3. Positive score on Montezuma's Revenge
4. Compositional generalization to new environments
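As a toy illustration of the "world model as a synthesized program" idea (purely hypothetical, not the paper's system), here the agent's candidate world models are ordinary Python transition functions scored against observed transitions:

```python
# Toy sketch of "world model as a synthesized program" (hypothetical;
# not the paper's system): candidate transition functions are ordinary
# Python programs, scored by how well they reproduce observed transitions.

def candidate_wraparound(state, action):
    # Hypothesis 1: the world is a 5-cell ring.
    return (state + action) % 5

def candidate_walls(state, action):
    # Hypothesis 2: the world is a 5-cell corridor with walls at the ends.
    return min(max(state + action, 0), 4)

# Observed transitions: ((state, action), next_state)
observed = [((0, -1), 4), ((4, 1), 0), ((2, 1), 3)]

def score(model):
    return sum(model(s, a) == nxt for (s, a), nxt in observed)

best = max([candidate_wraparound, candidate_walls], key=score)
print(best.__name__)   # candidate_wraparound fits all the data
print(best(3, 1))      # predict the next state from cell 3
```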