ollama (@ollama)'s Twitter Profile
ollama

@ollama

ollama.com

ID: 1688410127378829312

https://github.com/ollama/ollama · Joined 07-08-2023 04:42:08

5.5K Tweets

91.91K Followers

8 Following

Avanika Narayan (@avanika15)


[6/25] minions daily ship 🚢

🔨 ollama ❤️ mcp! we updated our ollama client so that you can integrate ollama tool calling with any Anthropic mcp server

and….celebrating 1k ⭐’s on github! ty for all the support 😊
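
For reference, a minimal sketch of the Ollama side of tool calling with the official Python client (not the minions MCP bridge itself). It assumes `ollama serve` is running, a tool-capable model such as llama3.1 has been pulled, and the weather tool is a hypothetical stand-in for what an MCP server might expose:

```python
# Sketch: Ollama tool calling via the Python client. Assumptions: a local
# Ollama server, a tool-capable model (llama3.1), and a hypothetical tool.
import ollama

def get_weather(city: str) -> str:
    """Hypothetical local tool, standing in for one exposed by an MCP server."""
    return f"Sunny in {city}"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=tools,
)

# If the model requested a tool call, dispatch it locally and print the result.
for call in response.message.tool_calls or []:
    if call.function.name == "get_weather":
        print(get_weather(**call.function.arguments))
```
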
Steren (@steren)


Serverless Gemma 3n with @Ollama and Cloud Run in two commands:

$ gcloud run deploy --image ollama/ollama --port 11434 --gpu 1 

$ OLLAMA_HOST=[...]  ollama run gemma3n
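
The same deployment can be reached from the Python client by pointing it at the Cloud Run service URL instead of exporting OLLAMA_HOST; a minimal sketch, with a placeholder URL and assuming gemma3n has already been pulled on the service (the tweet's `ollama run gemma3n` does that on first use):

```python
# Sketch: calling a Cloud Run-hosted Ollama from Python.
# The host URL is a placeholder for the service URL printed by `gcloud run deploy`.
from ollama import Client

client = Client(host="https://ollama-xxxxx-uc.a.run.app")
reply = client.chat(
    model="gemma3n",
    messages=[{"role": "user", "content": "Say hello from Cloud Run."}],
)
print(reply.message.content)
```
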
Avanika Narayan (@avanika15)

[6/29] minions daily ship 🚢

🧑🏽‍🦰 character chat app! saw that role-play apps are taking off on OpenRouter — so we made it local-first. we 🚢’d a character chat app: ☁️ lms spin up the persona, local gemma3n (via ollama) drives the conversation on-device.

give it a whirl →
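
A minimal sketch of the local half of such a character chat, with a hard-coded persona standing in for the one the cloud LM would generate, and gemma3n served by a local Ollama instance:

```python
# Sketch: persona-steered chat loop against a local Ollama model.
# The persona is hypothetical; in the app a cloud LM writes it.
import ollama

persona = (
    "You are Captain Wren, a retired airship pilot who speaks in short, "
    "weathered sentences and never breaks character."
)
history = [{"role": "system", "content": persona}]

while True:
    user = input("you> ")
    if not user:
        break
    history.append({"role": "user", "content": user})
    reply = ollama.chat(model="gemma3n", messages=history)
    history.append({"role": "assistant", "content": reply.message.content})
    print("wren>", reply.message.content)
```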

Kasey Zhang (@_weexiao)

It’s easy to fine-tune small models w/ RL to outperform foundation models on vertical tasks. We’re open sourcing Osmosis-Apply-1.7B: a small model that merges code (similar to Cursor’s instant apply) better than foundation models. Links to download and try out the model below!

Avanika Narayan (@avanika15)


[7/4] minions daily ship 🚢

📖 story time! happy 4th 🇺🇸 — shipping a fun one today: an app where a remote lm and local ollama lm team up to write children’s stories! image gen powered by the <12b param flux model from Black Forest Labs

check it out → apps/minion-story-teller
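
A minimal sketch of the local, text-only side of such an app: a one-line outline stands in for the remote LM's contribution, and the local model streams the story token by token (image generation with FLUX is not shown):

```python
# Sketch: streaming a story from a local Ollama model.
# The outline is a hypothetical stand-in for the remote LM's output.
import ollama

outline = "A shy firefly learns to share its light at the town's summer fair."
stream = ollama.chat(
    model="gemma3n",
    messages=[{
        "role": "user",
        "content": f"Write a short children's story from this outline: {outline}",
    }],
    stream=True,
)
for chunk in stream:
    print(chunk.message.content, end="", flush=True)
```
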
Wes Bos (@wesbos)

Hot tip for anyone doing AI dev: use Ollama to easily run models like Deepseek-r1 or Gemma locally on your machine. It downloads them and spins up a server with an OpenAI SDK-compatible API. The smaller models are fast and good enough to work on new features or debug streaming.
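
A minimal sketch of that workflow, using the OpenAI Python SDK against Ollama's OpenAI-compatible endpoint; it assumes `ollama serve` is running and the model has been pulled, and the api_key value is a dummy that the SDK requires but Ollama ignores:

```python
# Sketch: OpenAI SDK pointed at a local Ollama server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
resp = client.chat.completions.create(
    model="deepseek-r1",
    messages=[{"role": "user", "content": "Explain SSE streaming in one paragraph."}],
)
print(resp.choices[0].message.content)
```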

Dmitry Lyalin (@lyalindotcom)

Ladies and gentlemen, a fully functional Next.js app you can test locally, powered by the Genkit framework + ollama and local models. Works with various vision models. I tested it with LLaVA and Gemma 3 already, good results. Source code to follow.

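
For the vision side, a minimal sketch with the Ollama Python client rather than the Genkit/Next.js stack from the tweet; it assumes a vision model such as llava has been pulled locally, and photo.jpg is a hypothetical local file:

```python
# Sketch: asking a local vision model about an image via Ollama.
import ollama

response = ollama.chat(
    model="llava",
    messages=[{
        "role": "user",
        "content": "Describe what is in this image.",
        "images": ["photo.jpg"],  # hypothetical local image path
    }],
)
print(response.message.content)
```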