Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)'s Twitter Profile
Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬

@faccioai

Senior Research Scientist @GoogleDeepMind Discovery Team. Previously PhD with @SchmidhuberAI. Working on Reinforcement Learning.

ID: 1037009714406715392

Link: https://faccio.ai/

Joined: 04-09-2018 16:09:17

37 Tweets

901 Followers

307 Following

Anand Gopalakrishnan (@agopal42)

Excited to present β€œContrastive Training of Complex-valued Autoencoders for Object Discovery” at #NeurIPS2023. TL;DR -- We introduce architecture changes and a new contrastive training objective that greatly improve the state-of-the-art synchrony-based model. Explainer thread πŸ‘‡:

Amrith Setlur (@setlur_amrith)

Excited to announce the 𝐁𝐞𝐬𝐭 𝐏𝐚𝐩𝐞𝐫 π€π°πšπ«ππ¬ (2 papers) and π‡π¨π§π¨π«πšπ›π₯𝐞 𝐌𝐞𝐧𝐭𝐒𝐨𝐧𝐬 (2 more papers) for our NeurIPS workshop R0-FoMo: Robustness of Few-shot & Zero-shot Learning in Foundation Models πŸŽ‰ Please join us in congratulating the authors πŸ‘

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

Recently, I had the pleasure of giving a talk at Microsoft Research Asia in Beijing on 'Learning to Extract Information from Neural Networks.' Met lots of brilliant minds there! #MSFTResearch

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

Can neural networks with 5000 layers improve long-term planning? πŸ€– Check out our latest research with JΓΌrgen Schmidhuber, Dylan R. Ashley, and team: arxiv.org/abs/2406.08404 #AI #DeepLearning #RL

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

πŸš€Want to cut inference times by up to 50% and save money when using Transformer/CNN/Consistency-based diffusion models? Check out our latest work on Faster Diffusion Through Temporal Attention Decomposition led by Haozhe Liu, featuring JΓΌrgen Schmidhuber. Paper:

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

Heading to #ICML2024 for a busy week with 3 posters and 2 oral presentations. If you’re interested in discussing collaborations, visiting, or hiring opportunities at Artificial Intelligence @ KAUST with JΓΌrgen Schmidhuber, feel free to connect!

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

Our paper, "Scaling Value Iteration Networks to 5000 Layers for Extreme Long-Term Planning," was accepted at #EWRL. Congratulations to Yuhui Wang and the team! Paper: arxiv.org/abs/2406.08404 #AI #DeepLearning #RL

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

πŸš€ Overdue launch of my personal website faccio.ai! Check out my latest AI projects. I would like to thank my ancestors for giving me a last name that means "I make" in Italian. So, yes, I make #AI πŸ€–

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

I'm thrilled to announce that I'm joining the Discovery team at Google DeepMind in London as a Senior Research Scientist starting this January! It's incredible what the team has achieved in the past decade, and I am so looking forward to more scientific discoveries with AI.

Francesco Faccio @ ICLR2025 πŸ‡ΈπŸ‡¬ (@faccioai)

I'm in Singapore for #ICLR2025! DM me if you’d like to meet and chat about Creativity and Curiosity in AI, AGI, Agents, or exciting opportunities at Google DeepMind. You might even get a free Italian coffee β˜•οΈ :)
