yingzhen (@liyzhen2) 's Twitter Profile
yingzhen

@liyzhen2

teaching machines 🤖 to learn 🔍 and fantasise 🪄
now @ImperialCollege @ICComputing
ex @MSFTResearch @CambridgeMLG
currently helping @aistats_conf

ID: 559572885

Website: http://yingzhenli.net · Joined: 21-04-2012 12:26:18

535 Tweets

3.3K Followers

155 Following

Transactions on Machine Learning Research (@tmlrorg) 's Twitter Profile Photo

🎉Announcing... the 2024 TMLR Outstanding Certifications! (aka, our "best paper" awards!) Are you bursting with anticipation to see what they are? Well Twitter link down-weighting requires us to keep you in suspense, so read down-thread to see what they are!! 🎉🧵👇1/n

yingzhen (@liyzhen2) 's Twitter Profile Photo

Now people are getting interested in "inference-time compute". "Inference" -- computing samples & statistics of an unnormalised prob density -- has always been a key research topic in prob ML😉 No data needed, it's always about balancing comp. efficiency & accuracy. Good times ahead in 2025
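The tweet's framing can be made concrete with a tiny sketch (mine, not the author's): self-normalized importance sampling estimates expectations under an unnormalised density, and the sample budget `n_samples` is exactly the compute/accuracy dial being described. The target density and proposal below are illustrative choices.

```python
import numpy as np

# Unnormalised target: p_tilde(x) = exp(-x^2/2) * (1 + sin(3x)^2).
# It is symmetric around 0, so its true mean is 0.
def log_p_tilde(x):
    return -0.5 * x**2 + np.log(1.0 + np.sin(3 * x) ** 2)

def snis_mean(n_samples, seed=0):
    """Self-normalized importance sampling estimate of E[x] under p_tilde.

    More samples = more compute = a more accurate estimate.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 2.0, size=n_samples)     # proposal q = N(0, 2^2)
    log_q = -0.5 * (x / 2.0) ** 2                # log q up to a constant
    log_w = log_p_tilde(x) - log_q               # unnormalised log-weights
    w = np.exp(log_w - log_w.max())              # stabilize before exp
    w /= w.sum()                                 # self-normalize
    return np.sum(w * x)

for n in [10, 1_000, 100_000]:
    print(n, snis_mean(n))   # estimate drifts toward 0 as n grows
```

Note the normalising constant of `p_tilde` is never needed: it cancels in the self-normalized weights, which is what makes this workable for unnormalised densities.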

yingzhen (@liyzhen2) 's Twitter Profile Photo

Tons of papers re diffusion/flow matching at ML confs these days, but to my surprise very few of them consider learning the prior🤔 Am I missing any important work here? 🙏 for suggestions

James Allingham (@jamesallingham) 's Twitter Profile Photo

Eagerly waiting for #ICLR2025 or #AISTATS2025 results? Are you working on probabilistic #ML, #GenAI, approximate methods, or #UQ? Then check out #AABI held on Apr 29 co-located with #ICLR2025 + Fast-Track for accepted ICLR/AISTATS submissions -> approximateinference.org

yingzhen (@liyzhen2) 's Twitter Profile Photo

Very exciting new initiative led by two great statisticians, Pierre and Kamelia! I've always hoped that the ML and traditional stats communities could talk to each other more. Starting from #AISTATS2025, #AISTATS will become a pioneering ML conf for this👍

yingzhen (@liyzhen2) 's Twitter Profile Photo

RNN memory (HiPPO 🦛 style, predecessor to S4/Mamba 🐍) for posteriors over functions. When my awesome students told me you can build a memory for random functions that you don't even observe, I was like 🤯 Preliminary but exciting, feedback welcome 🤗

yingzhen (@liyzhen2) 's Twitter Profile Photo

A little chapter that we (Ruqi Zhang, awesome students, and yours truly) wrote a while ago to give a brief intro to this nice field for statisticians 😊

yingzhen (@liyzhen2) 's Twitter Profile Photo

Filtering/smoothing methods are classic for stochastic SSMs. But any suggestions 🙏 regarding existing work on applying them to (latent) ODEs?

yingzhen (@liyzhen2) 's Twitter Profile Photo

For generating tabular data with mixed features, Jacob Si found that it's easier to embed discrete variables into a suitable continuous space and perform joint diffusion with the continuous features. Data pre-processing really matters for tabular data generation 🔍
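A minimal sketch of the idea as I read it (my illustration, not the paper's code): map each categorical column to a continuous code (a scaled one-hot here, as one simple choice), concatenate with the continuous features, and run a single Gaussian forward diffusion process over the joint vector. All names below are hypothetical.

```python
import numpy as np

def embed_mixed_row(cat_vals, cat_sizes, cont_vals, scale=2.0):
    """Embed a mixed-type row into one continuous vector.

    Each categorical value becomes a scaled one-hot block (an
    analog-bits-style continuous code); continuous features are
    assumed already standardized and are appended as-is.
    """
    parts = []
    for v, k in zip(cat_vals, cat_sizes):
        one_hot = np.zeros(k)
        one_hot[v] = scale
        parts.append(one_hot)
    parts.append(np.asarray(cont_vals, dtype=float))
    return np.concatenate(parts)

def forward_diffuse(x0, alpha_bar, rng):
    """DDPM-style forward step: x_t = sqrt(ab)*x0 + sqrt(1-ab)*eps,
    applied jointly to embedded-discrete and continuous dimensions."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
# Two categoricals (3 and 2 classes) plus two continuous features.
x0 = embed_mixed_row(cat_vals=[2, 0], cat_sizes=[3, 2], cont_vals=[0.5, -1.2])
xt = forward_diffuse(x0, alpha_bar=0.5, rng=rng)
# After reverse-denoising, each categorical block would be decoded
# by argmax over its one-hot slice.
```

The payoff is that one denoiser handles all features in a single continuous space, instead of mixing Gaussian and discrete diffusion processes.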

yingzhen (@liyzhen2) 's Twitter Profile Photo

This paper took Carles Balsells Rodas ~5 years in the making:
2020/21: MSc proj, toy exp🐣
2022: added more exp, rejected due to weak theory 😥
2023/24: invented a new proof tech in another proj 🤔
2024/25: revisited, applied the new proof tech, resubmitted -> accepted 🥳
Persistence pays off indeed👍

AISTATS Conference (@aistats_conf) 's Twitter Profile Photo

#AISTATS2025 is off to a strong start! First keynote: Chris Holmes rethinks Bayesian inference through the lens of predictive distributions—introducing tools like martingale posteriors. 🌴🌴🤖🎓

UCL CSML (@uclcsml) 's Twitter Profile Photo

The next seminar is this Friday (May 23rd) and starts at 12pm midday UK time! Professor Yingzhen Li from Imperial College London is going to talk about "On Modernizing Sparse Gaussian Processes"! ucl.zoom.us/j/99748820264 This seminar is hybrid. More info

yingzhen (@liyzhen2) 's Twitter Profile Photo

Update: now we can deal with multi-dim inputs!😆 The trick is just like how you build token sequences for SOTA RNNs applied to, e.g., vision problems. If I were to shamelessly brag to deep learning people 😉: this could potentially become a Bayesian/kernel version of S4/Mamba