
So, the AI chip wars just got way more interesting. Groq, the scrappy chip startup that’s been quietly building momentum, confirmed it has raised a jaw-dropping $750 million at a post-money valuation of $6.9 billion. Yep, billion with a B.
Now here’s why that matters: back in July, the whispers were that Groq was aiming for around $600M at a $6B valuation. The company ended up blowing past both numbers. Not bad for a startup that raised $640M at a $2.8B valuation just last year (August 2024). In plain English? Groq roughly 2.5x’d its valuation in about 12 months. That’s startup glow-up energy.
But why are VCs throwing billions at Groq? One word: Nvidia. Nvidia’s GPUs basically run the AI world right now, and everyone else is scrambling to catch up. Groq’s secret sauce is that it’s not even playing the GPU game. Instead, it’s pushing LPUs (Language Processing Units), chips purpose-built for inference: the part where a trained model actually answers your prompt, as opposed to the training that builds the model in the first place. Groq likes to call its stack an “inference engine,” and it promises faster, cheaper AI without needing Nvidia’s golden tickets.
For devs and enterprises, Groq offers options: you can tap into their cloud service or go full hardware with server racks loaded with Groq’s integrated nodes. Both setups can run open models from heavy hitters like Meta, Mistral, Google, and OpenAI. Translation: flexibility without crushing cloud bills.
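To make the cloud path concrete, here’s a rough sketch of what calling Groq’s hosted inference looks like. Groq exposes an OpenAI-compatible API, so the standard `openai` client can point at it. The base URL, model name, and environment variable below are illustrative assumptions drawn from Groq’s public docs, not from this announcement; check their docs for the current model list.

```python
# Sketch: hitting Groq's cloud inference via its OpenAI-compatible endpoint.
# The endpoint, model name, and env var are assumptions, not from the article.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    api_key=os.environ["GROQ_API_KEY"],         # your Groq Cloud API key
)

response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # example: an open Meta model hosted on Groq
    messages=[{"role": "user", "content": "In one sentence, what is an LPU?"}],
)

print(response.choices[0].message.content)
```

The point of the sketch: because the API shape matches what devs already use, swapping an existing app onto Groq’s inference is mostly a config change (new base URL and key) rather than a rewrite.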
And it’s not just hype. Groq says it’s now powering AI apps for 2 million+ developers (up from just 356K a year ago). That’s serious adoption.
The money behind this round? Heavyweights like Disruptive, BlackRock, Neuberger Berman, Deutsche Telekom Capital Partners, plus returning big names Samsung, Cisco, and Altimeter.
Oh, and fun fact: Groq’s founder, Jonathan Ross, helped invent Google’s TPU before leaving to start Groq in 2016. So yeah… the guy knows a thing or two about custom AI chips.