Adeesh Kolluru (@adeeshkolluru) 's Twitter Profile
Adeesh Kolluru

@adeeshkolluru

AI for Materials @radicalai_inc; Prev: PhD @CarnegieMellon; Intern @SamsungSemiUS @OrbMaterials @MetaAI (@OpenCatalyst), UG @IITDelhi

ID: 561766981

Link: http://adeeshkolluru.github.io | Joined: 24-04-2012 05:56:56

176 Tweets

400 Followers

211 Following

Aditi Krishnapriyan (@ask1729) 's Twitter Profile Photo

1/ What are key design principles for scaling neural network interatomic potentials? Our exploration leads us to top results on the Open Catalyst Project (OC20, OC22), SPICE, and MPTrj, with vastly improved efficiency! Accepted at #NeurIPS2024: arxiv.org/abs/2410.24169

Anuroop Sriram (@anuroopsriram) 's Twitter Profile Photo

I’m excited to share our latest work on generative models for materials called FlowLLM. FlowLLM combines Large Language Models and Riemannian Flow Matching in a simple, yet surprisingly effective way for generating materials. arxiv.org/abs/2410.23405 Benjamin Kurt Miller Ricky T. Q. Chen Brandon Wood

Learning on Graphs Conference 2024 (@logconference) 's Twitter Profile Photo

✨EXCITING NEWS! Registration for the 3rd Learning on Graphs conference is now open 😊 It is virtual, free to attend, livestreamed, and recorded 📹 Sign up today! forms.gle/eYiDCopJGUc8px…

Luis Müller (@luis_pupuis) 's Twitter Profile Photo

Interested in the expressivity of higher-order attention? In our NeurIPS 2024 paper, we show that the Edge Transformer, originally designed for systematic generalization, has 3-WL expressivity and achieves SOTA performance on molecular regression and algorithmic reasoning tasks.

Learning on Graphs Conference 2024 (@logconference) 's Twitter Profile Photo

LoG 2024 paper decision notifications will be delayed until Saturday, 16 November. The notifications were originally meant to go out today AoE, but the committee needs a few extra days to prepare the best program. Apologies to LoG 2024 authors.

Jehad Abed (@jehad__abed) 's Twitter Profile Photo

Excited to unveil OCx24, a two-year effort with University of Toronto and @VSParticle! We've synthesized and tested in the lab hundreds of metal alloys for catalysis. With 685 million AI-accelerated simulations, we analyzed 20,000 materials to try and bridge simulation and reality. Paper:

Radical AI (@radicalai_inc) 's Twitter Profile Photo

Discovering and developing a novel material used to take 15-25 years and over $100M for a single materials system. Not anymore. Radical AI – the AI and Autonomy driven materials discovery company built for breakthrough technological innovations for the world’s most important

Thomas Wolf (@thom_wolf) 's Twitter Profile Photo

I shared a controversial take the other day at an event and I decided to write it down in a longer format: I’m afraid AI won't give us a "compressed 21st century". The "compressed 21st century" comes from Dario's "Machine of Loving Grace" and if you haven’t read it, you probably

Chaitanya K. Joshi @ICLR2025 🇸🇬 (@chaitjo) 's Twitter Profile Photo

Introducing All-atom Diffusion Transformers — towards Foundation Models for generative chemistry, from my internship with the FAIR Chemistry team FAIR Chemistry AI at Meta There are a couple ML ideas which I think are new and exciting in here 👇

Aditi Krishnapriyan (@ask1729) 's Twitter Profile Photo

1/ Machine learning force fields are hot right now 🔥: models are getting bigger + being trained on more data. But how do we balance size, speed, and specificity? We introduce a method for doing model distillation on large-scale MLFFs into fast, specialized MLFFs!
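The distillation idea in the tweet can be sketched in plain PyTorch. Everything below is a hypothetical stand-in, not the paper's actual method: a larger "teacher" force field supervises a smaller "student" on energies and forces (the `TinyMLFF` model, the single random configuration, and the loss weighting are all my own illustrative choices).

```python
import torch
import torch.nn as nn

class TinyMLFF(nn.Module):
    """Toy per-atom energy model (hypothetical stand-in for a real MLFF).
    E = sum over atoms of f(x_i); forces come from -dE/dx via autograd."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, pos):  # pos: (n_atoms, 3)
        return self.net(pos).sum()

def energy_and_forces(model, pos):
    pos = pos.detach().requires_grad_(True)
    energy = model(pos)
    # create_graph=True so the force loss can be backpropagated through
    (grad,) = torch.autograd.grad(energy, pos, create_graph=True)
    return energy, -grad  # F = -dE/dx

teacher = TinyMLFF(hidden=64)   # pretend this is the large pretrained MLFF
student = TinyMLFF(hidden=8)    # small, fast model being distilled
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
pos = torch.randn(16, 3)        # one toy configuration of 16 atoms

for _ in range(50):
    e_t, f_t = energy_and_forces(teacher, pos)
    e_s, f_s = energy_and_forces(student, pos)
    # match the teacher's energy and forces (targets are detached)
    loss = (e_s - e_t.detach()).pow(2) + (f_s - f_t.detach()).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The point of the sketch: the student never sees DFT labels, only the teacher's outputs, which is what lets a small specialized model be trained cheaply on configurations of interest.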

Adeesh Kolluru (@adeeshkolluru) 's Twitter Profile Photo

🚨 Check out our new MD simulation package!
✅ Completely written in PyTorch
✅ Handles batched simulations
✅ Easy to add new MLIPs
Led by Abhijeet Gangan and Orion Archer Cohen
Spoiler alert: New MLIP coming soon!
Code (MIT License): github.com/Radical-AI/tor…
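A batched MD step of the kind the package advertises can be sketched as a velocity-Verlet integrator over a `(batch, n_atoms, 3)` position tensor. This is a generic illustration in PyTorch, not the package's actual API — the function names, shapes, and toy force field here are hypothetical:

```python
import torch

def velocity_verlet(pos, vel, force_fn, mass=1.0, dt=0.01):
    """One velocity-Verlet step for a whole batch of systems at once.
    pos, vel: (batch, n_atoms, 3) tensors."""
    f = force_fn(pos)
    vel_half = vel + 0.5 * dt * f / mass       # half-step velocity update
    pos_new = pos + dt * vel_half              # full-step position update
    vel_new = vel_half + 0.5 * dt * force_fn(pos_new) / mass
    return pos_new, vel_new

def harmonic_forces(pos, k=1.0):
    # toy force field; a learned MLIP would slot in here instead
    return -k * pos

batch, n_atoms = 4, 10
pos = torch.randn(batch, n_atoms, 3)
vel = torch.zeros(batch, n_atoms, 3)
e0 = 0.5 * pos.pow(2).sum()  # initial total energy (kinetic term is zero)

for _ in range(100):
    pos, vel = velocity_verlet(pos, vel, harmonic_forces)
```

Because every operation is a plain tensor op, the same loop runs many independent simulations in parallel on a GPU, which is the main draw of writing the integrator in PyTorch.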

Joseph Krause (@josephfkrause) 's Twitter Profile Photo

ML-first materials are here: we demonstrate that machine learning approaches to materials prediction achieve state-of-the-art accuracy and best-in-class efficiency and performance. Adeesh Kolluru is leading the charge at Radical AI

Muhammed Shuaibi (@mshuaibii) 's Twitter Profile Photo

Excited to share our latest releases to the FAIR Chemistry’s family of open datasets and models: OMol25 and UMA! AI at Meta FAIR Chemistry OMol25: huggingface.co/facebook/OMol25 UMA: huggingface.co/facebook/UMA Blog: ai.meta.com/blog/meta-fair… Demo: huggingface.co/spaces/faceboo…