
OpenAI has unveiled two new open-weight reasoning models — boasting capabilities on par with its elite o-series — and Amazon wasted no time swooping in. On Tuesday, AWS confirmed to TechCrunch that these models will, for the very first time, be available on its cloud. That’s right — OpenAI models on AWS.
These models will be accessible via Amazon Bedrock and SageMaker AI, meaning AWS customers can now build, host, and train apps powered by OpenAI’s tech — directly on Amazon’s turf. And while anyone can technically download the models from Hugging Face, this AWS rollout comes with OpenAI’s full blessing, as confirmed by Dmitry Pimenov, OpenAI’s product lead. Think of it like when Amazon offered DeepSeek-R1 earlier this year — but with a much spicier competitive twist.
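For developers wondering what that looks like in practice, here's a minimal sketch of calling one of the new models through Bedrock's Converse API using boto3. The model ID and region below are assumptions for illustration only; check the Bedrock model catalog in your account for the exact identifier.

```python
# Minimal sketch: invoking an OpenAI open-weight model via Amazon Bedrock's Converse API.
# The modelId and region are placeholders/assumptions -- confirm them in the Bedrock console.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    modelId="openai.gpt-oss-120b-1:0",  # hypothetical ID; verify the real one before use
    messages=[
        {"role": "user", "content": [{"text": "Summarize why open-weight models matter."}]}
    ],
)

# The Converse API returns the assistant's reply nested under output -> message -> content.
print(response["output"]["message"]["content"][0]["text"])
```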
Why spicy? Because this is a statement move. For AWS, it’s a long-overdue ticket to sit at the same AI table as Microsoft’s prized partner, OpenAI. Until now, AWS was mostly known for backing Anthropic’s Claude and hosting models from Meta, Cohere, Mistral, and DeepSeek. Meanwhile, Microsoft has been raking in cloud business thanks to its close partnership with OpenAI — a fact that has been a sore spot for Amazon CEO Andy Jassy.
In fact, during last week’s earnings call, Wall Street analysts grilled Jassy about AWS “falling behind” in generative AI, especially compared to Microsoft and Google. Jassy fired back, noting AWS’s size advantage — but the competitive itch was clear.
And now? AWS finally gets a slice of the OpenAI pie, while OpenAI benefits by diversifying beyond Microsoft, a relationship reportedly under renegotiation. The move also makes it easy for thousands of AWS enterprise customers to fold OpenAI's models into their existing workflows.
As a bonus, Sam Altman gets to subtly jab Meta's Mark Zuckerberg: OpenAI just released these models as open weights under the permissive Apache 2.0 license, while Meta has hinted it won't keep all of its future "superintelligence" models open.