
Alexander Long
@_alexanderlong
Founder @PluralisHQ | prev. Applied Scientist at Amazon | ML PhD |
Protocol Learning: Multi-participant, low-bandwidth model parallel.
ID: 1684127558563151873
http://Pluralis.ai 26-07-2023 09:05:01
289 Tweets
1.1K Followers
715 Following

For those of you who haven't been keeping up: Prime Intellect completed training a 32B AI model (100% decentralized), then Gensyn announced the next day that they're training a 72B model, more than 2X-ing the bar. But not before Zuckerberg went on the Dwarkesh Patel pod espousing how


A handful of centralized AI models will soon steer what you know, think, and choose. Alexander Long walked out of Amazon and pulled eight PhDs with him to stop that future by betting on decentralized AI and founding Pluralis Research.


Saw a tweet the other day that said AI research is a max-performance game. I think that's right. Either you have the ability to solve the problem when you're firing at your peak, or you can't. You either clear the wall or you don't. This was Sameera Ramasinghe's wall to clear. I



Here's an accessible breakdown of Pluralis Research's incredible paper. When we train large models on decentralized networks, the idea is to break the model into pieces and have different nodes process different pieces. There are a few ways to do this. One way is low hanging
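The splitting idea above can be sketched in a few lines. This is a hypothetical, plain-Python toy (the `Node` and `split_model` names are illustrative, not from the paper): the model's layers are partitioned into contiguous stages, each stage is assigned to a different participant, and activations flow from one node to the next, the way model-parallel training passes them over the network.

```python
# Minimal sketch of splitting a model across nodes (pipeline-style
# model parallelism). Names here are illustrative assumptions, not
# the paper's actual implementation.

class Node:
    """One participant holding a contiguous slice of the model's layers."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        # Run this node's slice of the model on the incoming activation.
        for layer in self.layers:
            x = layer(x)
        return x

def split_model(layers, num_nodes):
    """Partition layers into num_nodes roughly equal contiguous stages."""
    k, r = divmod(len(layers), num_nodes)
    stages, start = [], 0
    for i in range(num_nodes):
        end = start + k + (1 if i < r else 0)
        stages.append(Node(layers[start:end]))
        start = end
    return stages

# Toy "model": four affine layers represented as plain functions.
layers = [lambda x, a=a: a * x + 1 for a in (2, 3, 4, 5)]
nodes = split_model(layers, num_nodes=2)  # two participants

x = 1.0
for node in nodes:  # in a real system, this hop crosses the network
    x = node.forward(x)
# x is now the full model's output, computed across both nodes.
```

In a real decentralized setting, each `node.forward` call would happen on a different machine, with the activation (and, on the backward pass, its gradient) sent over the network between stages; that communication cost is exactly what low-bandwidth model-parallel methods try to shrink.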

