Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile
Guan-Horng Liu

@guanhorng_liu

Research Scientist @MetaAI (FAIR NY) • Schrödinger Bridge, diffusion, flow, stochastic optimal control • prev ML PhD @GeorgiaTech 🚀

ID: 908526953061392385

Link: ghliu.github.io · Joined: 15-09-2017 03:04:40

234 Tweets

1.1K Followers

362 Following

Xiang Fu (@xiangfu_ml) 's Twitter Profile Photo

We have released an eSEN model that is the current SOTA on Matbench-Discovery. Code/checkpoints are available for both non-commercial and commercial use: code: github.com/facebookresear… checkpoint: huggingface.co/facebook/OMAT24 paper (updated): arxiv.org/abs/2502.12147

Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile Photo

📢 #Adjoint #Sampling is a new diffusion sampler for Boltzmann distributions that:
- is grounded in stochastic control
- enjoys a scalable matching objective
- is extremely efficient in energy NFEs
- does NOT require or estimate target data
Check out Aaron Havens' talk on Monday at the #FPI workshop!
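To ground the terminology in the tweet above: the target objects are Boltzmann densities p(x) ∝ exp(−E(x)) defined by an energy E alone, with no data. The sketch below is NOT Adjoint Sampling; it is a plain unadjusted Langevin sampler for a 1-D toy energy, included only to illustrate what "sampling from an energy" means and why the number of energy/gradient evaluations (NFE) is the cost being measured.

```python
import math
import random

def grad_E(x):
    # Gradient of the toy energy E(x) = x**2 / 2, so the Boltzmann
    # density p(x) ∝ exp(-E(x)) is a standard Gaussian.
    return x

def langevin(steps=20_000, dt=0.01, seed=0):
    """Unadjusted Langevin dynamics targeting p(x) ∝ exp(-E(x))."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(steps):
        # Euler-Maruyama step of dX = -∇E(X) dt + √2 dW.
        # Each step costs one energy-gradient evaluation -- the NFE
        # budget that more efficient samplers try to shrink.
        x += -grad_E(x) * dt + math.sqrt(2 * dt) * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples

xs = langevin()
mean = sum(xs) / len(xs)
var = sum(v * v for v in xs) / len(xs)
```

For this quadratic energy the samples should be approximately standard Gaussian (mean ≈ 0, variance ≈ 1), at the cost of one gradient evaluation per step.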

Carles Domingo-Enrich (@cdomingoenrich) 's Twitter Profile Photo

🚀Excited to open source the code for Adjoint Matching --- as part of a new repo centered around reward fine-tuning via stochastic optimal control! github.com/microsoft/soc-…

Ricky T. Q. Chen (@rickytqchen) 's Twitter Profile Photo

Against conventional wisdom, I will be giving a talk with particular focus on the "how" and the various intricacies of applying stochastic control for generative modeling. Mon 9:50am Hall 1 Apex #ICLR2025 Also check out the other talks at delta-workshop.github.io!

AI at Meta (@aiatmeta) 's Twitter Profile Photo

Announcing the newest releases from Meta FAIR. We’re releasing new groundbreaking models, benchmarks, and datasets that will transform the way researchers approach molecular property prediction, language processing, and neuroscience. 1️⃣ Open Molecules 2025 (OMol25): A dataset

Ricky T. Q. Chen (@rickytqchen) 's Twitter Profile Photo

Padding in our non-AR sequence models? Yuck. 🙅 👉 Instead of unmasking, our new work *Edit Flows* performs iterative refinements via position-relative inserts and deletes, operations naturally suited for variable-length sequence generation. Easily better than using mask tokens.
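The core move described above is that a refinement step edits the sequence rather than unmasking fixed slots, so length can grow and shrink freely. A minimal sketch, assuming a simple illustrative op format (the tuples and function name here are hypothetical, not the paper's API):

```python
# Minimal sketch of applying position-relative edit operations to a
# token sequence, in the spirit of Edit Flows. The op encoding
# ('insert', pos, tok) / ('delete', pos) is illustrative only.

def apply_edits(tokens, edits):
    """Apply insert/delete ops left to right.

    Positions are interpreted against the *current* sequence, so
    each op shifts later positions -- the sequence length is free
    to change during refinement, with no padding or mask tokens.
    """
    out = list(tokens)
    for op in edits:
        if op[0] == "insert":
            _, pos, tok = op
            out.insert(pos, tok)
        elif op[0] == "delete":
            _, pos = op
            del out[pos]
    return out

# One refinement step that both grows and shrinks the sequence.
seq = ["the", "cat", "sat"]
seq = apply_edits(seq, [("insert", 2, "black"), ("delete", 0)])
```

In the actual method a learned model proposes the edits; this sketch only shows why insert/delete is a natural action space for variable-length generation.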

Rianne van den Berg (@vdbergrianne) 's Twitter Profile Photo

🚀 After two+ years of intense research, we’re thrilled to introduce Skala — a scalable deep learning density functional that hits chemical accuracy on atomization energies and matches hybrid-level accuracy on main group chemistry — all at the cost of semi-local DFT. ⚛️🔥🧪🧬

Itai Gat (@itai_gat) 's Twitter Profile Photo

Excited to share our recent work on corrector sampling in language models! A new sampling method that mitigates error accumulation by iteratively revisiting tokens in a window of previously generated text. With: Neta Shaul Uriel Singer Yaron Lipman Link: arxiv.org/abs/2506.06215

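The mechanism described above can be sketched as a generate-then-revisit loop: after each new token, the sampler rescores a short window of recent tokens and resamples any it now finds unlikely. This toy sketch uses stand-in `propose`/`score` functions in place of a real language model; all names and the threshold rule are illustrative, not the paper's algorithm.

```python
import random

random.seed(0)

def propose(prefix):
    # Stand-in for sampling the next token from a language model.
    return random.choice(["a", "b", "c"])

def score(prefix, token):
    # Stand-in for the model's confidence in `token` given `prefix`.
    return random.random()

def generate(n_tokens, window=3, threshold=0.2):
    """Generate tokens, revisiting a trailing window after each step."""
    seq = []
    for _ in range(n_tokens):
        seq.append(propose(seq))
        # Corrector pass: rescore the last `window` tokens in their
        # current context and resample the ones that look unlikely,
        # mitigating error accumulation from earlier sampling steps.
        start = max(0, len(seq) - window)
        for i in range(start, len(seq)):
            if score(seq[:i], seq[i]) < threshold:
                seq[i] = propose(seq[:i])
    return seq

out = generate(8)
```

The point of the sketch is the control flow: earlier tokens stay revisable inside the window instead of being frozen the moment they are sampled.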
Yaron Lipman (@lipmanya) 's Twitter Profile Photo

A new paper: We finetune an LLM to rethink and resample previously generated tokens, reducing sampling errors and improving performance.

Ricky T. Q. Chen (@rickytqchen) 's Twitter Profile Photo

This new work generalizes the recent Adjoint Sampling approach from stochastic control to Schrödinger Bridges, enabling measure transport between data and unnormalized densities. Achieves SOTA on large-scale energy-driven conformer generation. See thread by Guan-Horng Liu

Benjamin Kurt Miller (@bkmi13) 's Twitter Profile Photo

Cool work led by Guan-Horng Liu! Removing the restriction on memoryless SDEs enables a lot of relevant cases in chemistry and more... also better results! Take advantage of the freedom of flow & bridge matching to choose a base dist & learn from energy alone! No more data!

Yaron Lipman (@lipmanya) 's Twitter Profile Photo

**Transition Matching** is a new iterative generative paradigm using Flow Matching or AR models to transition between intermediate generation states, leading to improved generation quality and speed!

Kirill Neklyudov (@k_neklyudov) 's Twitter Profile Photo

1/ Where do Probabilistic Models, Sampling, Deep Learning, and Natural Sciences meet? 🤔 The workshop we're organizing at #NeurIPS2025! 📢 FPI@NeurIPS 2025: Frontiers in Probabilistic Inference – Learning meets Sampling. Learn more and submit → fpiworkshop.org

Alex Tong (@alexandertong7) 's Twitter Profile Photo

Thrilled to be co-organizing FPI at #NeurIPS2025! I'm particularly excited about our new 'Call for Open Problems' track. If you have a tough, cross-disciplinary challenge, we want you to share it and inspire new collaborations. A unique opportunity! Learn more below.

Joey Bose (@bose_joey) 's Twitter Profile Photo

🚨 Our workshop on Frontiers of Probabilistic Inference: Learning meets Sampling got accepted to #NeurIPS2025!! After the incredible success of the first edition, the second edition aims to be bolder, bigger, and more ambitious in outlining key challenges in the natural

Guan-Horng Liu (@guanhorng_liu) 's Twitter Profile Photo

📢 We're organizing a #NeurIPS2025 workshop on generative modeling, learning to sample, and optimal transport / control. Two submission tracks this year! (deadline #Aug22) 📰 Call for 4-page paper on research / dataset 💡 Call for #Open #Question: 2-page proposal on open

Benjamin Kurt Miller (@bkmi13) 's Twitter Profile Photo

The FAIR Chemistry team at Meta is hiring an RS. We're looking for experience in generative modeling, sampling, or representation learning. Also a real interest in tackling practical computational chemistry challenges! Apply: metacareers.com/jobs/134955460… Questions: [email protected]

Ricky T. Q. Chen (@rickytqchen) 's Twitter Profile Photo

The FAIR Chemistry team has been open sourcing a lot of great stuff recently (data & foundation models), and they are looking to go more generative! This is a rare job opportunity for exploring the frontiers of generative modeling and chemistry. Reach out to Ben if interested!