OpenAI announced on April 28 that its models, including GPT-5.5, Codex, and managed agents, are now available on Amazon Web Services (AWS) following the end of its exclusive cloud arrangement with Microsoft.
The rollout, currently in limited preview, gives AWS customers access to OpenAI models through Amazon Bedrock, alongside tools for software development and agent-based workflows.
The company said the integration allows enterprises to deploy AI within existing AWS infrastructure, security, and compliance systems.
“These capabilities give organisations more ways to use OpenAI across application development, software engineering, and agentic workflows—while building within the infrastructure, security, governance, and procurement workflows they already use on AWS,” OpenAI said in a statement.
The announcement comes a day after OpenAI restructured its long-standing partnership with Microsoft, ending a deal that had made Azure its exclusive cloud provider. Under the revised agreement, OpenAI can now distribute its models across multiple cloud platforms, including AWS and Google Cloud.
Microsoft will continue as a key partner with non-exclusive licensing rights through 2032, but the changes remove restrictions that previously limited OpenAI’s commercial expansion.
OpenAI models will be accessible via Amazon Bedrock, allowing customers to build applications using existing AWS services and governance systems. The company said this provides a “single path from experimentation to production” for enterprises already operating on AWS.
The partnership also introduces Codex on AWS. More than four million users access Codex weekly, according to OpenAI, using it for tasks such as code generation, testing, and documentation.
With Bedrock integration, enterprises can deploy Codex within AWS environments using tools such as the CLI, desktop app, and Visual Studio Code extension.
In addition, Amazon Bedrock Managed Agents powered by OpenAI will allow businesses to deploy AI agents capable of multi-step workflows, tool usage, and task execution across enterprise systems.
The AWS expansion reflects a broader shift in OpenAI’s strategy: diversifying cloud partnerships and scaling enterprise adoption. Amazon has committed up to $50 billion to OpenAI as the startup plans to use about two gigawatts of AWS’s custom Trainium chips to train its AI models.
Microsoft’s earlier arrangement had tied OpenAI’s commercial deployments closely to Azure, but the new terms allow OpenAI to pursue wider distribution and revenue opportunities.
The move positions AWS more directly against Azure in the AI infrastructure market, where cloud providers are competing to host and distribute leading AI models.
