
Maite Melero
@maitemelero1
ID: 809736524887683072
16-12-2016 12:26:46
636 Tweets
186 Followers
232 Following

[2/7] Along with the paper, we release PLUME, a family of three 2B-parameter #LLMs based on the Gemma architecture. Each model uses a different vocabulary size, from 32k up to 256k tokens. PLUME 32k: huggingface.co/projecte-aina/… PLUME 128k: huggingface.co/projecte-aina/… PLUME 256k: huggingface.co/projecte-aina/…

[7/7] This work was conducted at BSC-CNS thanks to funding from Aina and Proyecto Ilenia. Thanks also to my co-authors Javier García Gilabert, Aleix Sant Savall, Francesca De Luca Fornaciari, Audrey Mash, Xixian Liao, and Maite Melero.

L'Alternativa 2024 - ALTO EL FUEGO alternativa.cccb.org/2024/es/fest/c… via L'Alternativa Fest
