Charlotte Caucheteux @ICML24 (@c_caucheteux) 's Twitter Profile

Research Scientist @MetaAI | Deep Learning | Large Language Modelling | Cognitive Neuroscience

ID: 1267831603289444355

Website: https://charlottecaucheteux.github.io/ | Joined: 02-06-2020 14:53:12

47 Tweets

1.1K Followers

251 Following

Guillaume Lample @ NeurIPS 2024 (@guillaumelample) 's Twitter Profile Photo

Today we release LLaMA, 4 foundation models ranging from 7B to 65B parameters. LLaMA-13B outperforms OPT and GPT-3 175B on most benchmarks. LLaMA-65B is competitive with Chinchilla 70B and PaLM 540B. The weights for all models are open and available at research.facebook.com/publications/l… 1/n

Gautier Izacard (@gizacard) 's Twitter Profile Photo

Happy to release a collection of LLaMA 🦙, large language models ranging from 7B to 65B parameters and trained on publicly available datasets. LLaMA-65B is competitive with Chinchilla and PaLM. Paper: tinyurl.com/ycxr2mvj

Alexandre Défossez (@honualx) 's Twitter Profile Photo

We release stereo models for all MusicGen variants (plus a new large melody model, in both mono and stereo): 6 new models available on HuggingFace (thanks Vaibhav (VB) Srivastav). We show how a simple fine-tuning procedure with codebook interleaving takes us from boring mono to immersive stereo 🎧👇
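The codebook interleaving mentioned in the tweet can be pictured with a small sketch. This is an illustrative assumption, not the official MusicGen implementation: suppose each audio channel is encoded as K parallel codebook streams of T tokens (shape `(K, T)`); interleaving stacks the left- and right-channel streams alternately so a single decoder models both channels jointly. The function name and shapes here are hypothetical.

```python
import numpy as np

def interleave_stereo_codebooks(left, right):
    """Interleave per-channel codebook streams: two (K, T) arrays -> one (2K, T) array.

    Even rows hold the left-channel codebooks, odd rows the right-channel
    codebooks, so codebook k of each channel stays adjacent.
    """
    assert left.shape == right.shape, "both channels need the same (K, T) shape"
    k, t = left.shape
    out = np.empty((2 * k, t), dtype=left.dtype)
    out[0::2] = left   # even rows: left channel
    out[1::2] = right  # odd rows: right channel
    return out

# Toy example: K=2 codebooks, T=3 timesteps of token ids per channel.
left = np.array([[1, 2, 3], [4, 5, 6]])
right = np.array([[7, 8, 9], [10, 11, 12]])
print(interleave_stereo_codebooks(left, right))
# [[ 1  2  3]
#  [ 7  8  9]
#  [ 4  5  6]
#  [10 11 12]]
```

The point of this layout is that a mono model's codebook dimension simply doubles, which is why a fine-tuning pass (rather than training from scratch) can suffice to adapt it to stereo.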