Andrew Feldman (@andrewdfeldman)'s Twitter Profile
Andrew Feldman

@andrewdfeldman

CEO and Founder @CerebrasSystems. I like building teams that solve hard problems, dancing tango and Vizslas

ID: 4443830716

http://www.cerebras.net · Joined 11-12-2015 02:24:18

568 Tweets

2.2K Followers

199 Following


At Cerebras our solutions are designed, manufactured and deployed in the USA. Here we are bringing up our fifth US data center to ensure that everyone has access to the fastest inference in the industry.


Building Data Centers....Cerebras has 5 data centers in the US...We power the fastest inference in the industry enabling all sorts of cool applications--Reasoning, Agentic, Voice to name a few. More to come...


Water-cooled data centers are more energy-efficient than air-cooled data centers. At Cerebras, all our data centers are water-cooled, which is one of the reasons our systems consume so much less power per token than the competition.


All the top models are reasoning models. And reasoning models use more tokens in inference. If your inference API is slow, customers complain or leave, even if your results are accurate. That's why at Cerebras we build the fastest AI inference. Smart models. Fast inference.


"Technology is what you learn after you are 18." At a conference today, I heard this line. What a thoughtful observation. If you watch high school students with ChatGPT, you realize that to them it's not a technology. It just is the way the world works. There was nothing before it.


"Mom, I don't want to do AI when I grow up. You think AI is cool because you didn't have AI when you were my age. When I grow up I will find something cooler, and that's what I will do," said a 7-year-old to her mother… (as reported to me)


Building phenomenal data centers in the US... here is the water-cooling infrastructure at Cerebras' new facility. We deliver the fastest AI inference, from the most efficient data centers...


building building building....Cerebras deploying massive capacity in US Data Centers. The fastest AI. Made in the USA. Served from US data centers.


Delivering the fastest AI through partners around the world. Cerebras is proud to partner with IBM and to deliver the industry's fastest inference for their watsonx.ai gateway.