
Groq secures $640M to supercharge AI inference with next-gen LPUs


Groq raises $640M in Series D funding to transform AI inference with next-gen Language Processing Units (LPUs), aiming to become the largest non-hyperscaler AI compute provider by 2025.

Groq, a leader in AI inference technology, has raised $640 million in a Series D funding round, signaling a major shift in the artificial intelligence infrastructure landscape. The investment values the company at $2.8 billion and was led by BlackRock Private Equity Partners, with participation from Neuberger Berman, Type One Ventures, and strategic investors such as Cisco, KDDI, and Samsung Catalyst Fund. “We already have the orders in place with our suppliers, we are developing a robust rack manufacturing approach with ODM partners, and we have procured the necessary data center space and power to build out our cloud,” said Stuart Pann, Groq’s chief operating officer.


Or read this on VentureBeat

Read more on:

AI inference

Groq

Next-gen LPUs

Related news:

Groq Raises $640M to Meet Soaring Demand for Fast AI Inference

AI chip startup Groq lands $640M to challenge Nvidia

Groq’s open-source Llama AI model tops leaderboard, outperforming GPT-4o and Claude in function calling