OpenAI expands reach with models now accessible on AWS platforms

Developers can deploy agentic AI securely, use chain-of-thought insights, and adapt model behavior across cloud environments.


Amazon Web Services (AWS) now offers access to OpenAI’s gpt‑oss‑120b and gpt‑oss‑20b models through both Amazon Bedrock and Amazon SageMaker JumpStart. Bedrock’s unified API lets developers experiment and switch between models without rewriting code, while SageMaker JumpStart adds fine‑tuning, deployment pipelines, and enterprise‑grade controls.
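For developers who want to try the models from Bedrock’s unified API, here is a minimal sketch of an invocation using boto3’s Converse API. The model identifier shown is a placeholder and availability varies by region, so the exact ID should be confirmed in the Bedrock console.

```python
# Minimal sketch: calling an OpenAI open-weight model through Amazon Bedrock's
# unified Converse API with boto3. The modelId below is a placeholder; look up
# the exact identifier for gpt-oss-120b or gpt-oss-20b in your Bedrock console.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # placeholder ID, confirm in your account
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarise the trade-offs of open-weight models."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.7},
)

print(response["output"]["message"]["content"][0]["text"])
```

Because the Converse API is shared across Bedrock models, switching to gpt‑oss‑20b (or another hosted model) should only require changing the `modelId`; the rest of the call stays the same.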

AWS CEO Matt Garman hailed the partnership as a ‘powerhouse combination’, claiming the models outperform comparable options on price: when deployed via Bedrock, they are said to be three times more price‑efficient than Gemini and five times more price‑efficient than DeepSeek‑R1.

The models come with rich functionality: a large context window, chain-of-thought transparency, adjustable reasoning levels, and support for agentic workflows. Bedrock adds secure deployment with Guardrails support, while SageMaker JumpStart enables experimentation across AWS regions.
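OpenAI’s guidance for the gpt‑oss models is that the reasoning level can be adjusted through the system prompt (for example, low, medium, or high). The sketch below shows how that might look through Bedrock’s Converse API; the exact prompt convention and the model ID are assumptions to verify against OpenAI’s and AWS’s documentation.

```python
# Sketch: requesting a higher reasoning level from a gpt-oss model.
# The "Reasoning: high" system line follows OpenAI's published guidance for
# these models; treat the exact wording and the modelId as assumptions.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-20b-1:0",              # placeholder ID
    system=[{"text": "Reasoning: high"}],          # adjustable reasoning level
    messages=[
        {"role": "user", "content": [{"text": "Plan a three-step data migration."}]}
    ],
    inferenceConfig={"maxTokens": 1024},
)

print(response["output"]["message"]["content"][0]["text"])
```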

Financial markets took notice. Amazon’s stock rose after the announcement, as analysts viewed the addition of OpenAI’s open-weight models as a meaningful step toward strengthening AWS’s AI offerings amid fierce cloud competition.
