Cerebras just announced 6 new AI datacenters that process 40M tokens per second — and it could be bad news for Nvidia
Cerebras Systems is challenging Nvidia with six new AI data centers across North America, promising 10x faster inference speeds and 7x cost reduction for companies using advanced AI models like Llama 3.
The AlphaSense partnership represents a significant enterprise customer win, with the financial intelligence platform switching from what Wang described as a “global, top three closed-source AI model vendor” to Cerebras.

With 85% of its inference capacity located in the United States, Cerebras is also positioning itself as a key player in advancing domestic AI infrastructure at a time when technological sovereignty has become a national priority.

For technical decision makers evaluating AI infrastructure options, Cerebras’ expansion represents a significant new alternative to GPU-based solutions, particularly for applications where response time is critical to user experience.
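To make the speed claim concrete, here is a rough back-of-the-envelope sketch of how per-user decode rate translates into response latency. The 1,000-token response length and the 100 tokens/second baseline are illustrative assumptions, not figures from the article; only the 10x multiplier reflects Cerebras' stated claim.

```python
# Back-of-the-envelope: how per-user decode speed translates into response latency.
# The tokens/sec figures below are illustrative assumptions, not vendor-published numbers.

def response_latency(tokens: int, tokens_per_second: float) -> float:
    """Seconds to stream a response of `tokens` length at a given decode rate."""
    return tokens / tokens_per_second

RESPONSE_TOKENS = 1_000      # a long-ish chat or summarization answer (assumed)

baseline_tps = 100.0         # hypothetical per-user rate for a typical GPU deployment
faster_tps = baseline_tps * 10  # applying the article's "10x faster inference" claim

print(f"Baseline:   {response_latency(RESPONSE_TOKENS, baseline_tps):.1f} s")  # 10.0 s
print(f"10x faster: {response_latency(RESPONSE_TOKENS, faster_tps):.1f} s")    #  1.0 s
```

Under these assumptions, a response that would take about ten seconds to stream drops to roughly one second, which is the kind of difference the article means by response time being critical to user experience.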
Or read this on VentureBeat