Amazon Web Services (AWS) and OpenAI have announced a seven-year partnership worth $38 billion, one of the largest cloud computing deals in history. Under the agreement, OpenAI will use AWS’s advanced cloud infrastructure to run and scale its artificial intelligence (AI) workloads, including training and deploying the large language models behind products such as ChatGPT.

The partnership will give OpenAI access to hundreds of thousands of NVIDIA GPUs through AWS, with capacity to expand to tens of millions of CPUs to handle large-scale AI tasks. AWS will begin deploying this compute infrastructure immediately, with full deployment expected by the end of 2026 and the potential to grow further into 2027 and beyond.
According to AWS, the custom-built architecture — featuring NVIDIA GB200 and GB300 GPUs connected via Amazon EC2 UltraServers — will deliver high-performance, low-latency computing power optimized for AI workloads. AWS’s cloud infrastructure will enable OpenAI to efficiently train new models, serve ChatGPT inference workloads, and adapt to future AI demands.
“Scaling frontier AI requires massive, reliable compute,” said OpenAI CEO Sam Altman. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
AWS CEO Matt Garman said the partnership highlights AWS’s leadership in delivering secure, scalable AI infrastructure. “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” he said.
The deal underscores the growing demand for cloud computing power as AI development accelerates. Analysts see it as a major boost for AWS, which has faced increasing competition from Microsoft Azure and Google Cloud. Paolo Pescatore, analyst at PP Foresight, called it “a hugely significant deal and a strong endorsement of AWS’s compute capabilities to deliver the scale needed to support OpenAI.”
The agreement follows OpenAI’s recent restructuring, which gave the company greater operational and financial independence from Microsoft, its long-time cloud partner. The new structure has allowed OpenAI to pursue multiple cloud relationships, including partnerships with Google Cloud and Oracle for additional compute resources, according to a Reuters report.
Amazon’s stock surged following the announcement, rising nearly five percent and adding about $140 billion to its market value. The partnership comes as OpenAI targets an annualized revenue run rate of $20 billion by the end of 2025, though the company continues to invest heavily in expanding its computing capacity.
Earlier this year, OpenAI’s open-weight foundation models became available on Amazon Bedrock, AWS’s managed service offering access to leading AI models. Thousands of companies, including Peloton, Thomson Reuters, Comscore, and Verana Health, already use OpenAI’s models through Bedrock for applications ranging from coding to scientific analysis.
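For readers curious what that access looks like in practice, below is a minimal sketch of calling an open-weight model through Bedrock’s Converse API using the AWS SDK for Python (boto3). The model identifier, region, and prompt are illustrative assumptions rather than details from the announcement; the identifiers actually available vary by account and region.

import boto3

# Bedrock Runtime client; the region here is an assumption.
client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    # Assumed identifier for an OpenAI open-weight model on Bedrock;
    # check the Bedrock console for the IDs enabled in your account.
    modelId="openai.gpt-oss-120b-1:0",
    messages=[
        {"role": "user", "content": [{"text": "Summarize this quarter's sales notes in two sentences."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant reply under output -> message -> content.
print(response["output"]["message"]["content"][0]["text"])

The sketch uses Bedrock’s unified Converse API, which lets applications swap between hosted models without rewriting request formats, which is one reason enterprises such as those named above can adopt new models as they appear on the service.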
Altman has previously outlined an ambitious goal for OpenAI to add one gigawatt of compute capacity every week — a scale that could cost more than $40 billion per gigawatt. The AWS partnership is expected to play a central role in that vision, providing the infrastructure foundation for OpenAI’s next generation of frontier models and agentic AI systems.
Rajani Baburajan

