At its 2025 Advancing AI event, AMD unveiled its most ambitious AI platform vision yet: a full-stack, open, and scalable infrastructure designed to redefine the future of AI computing.

The event spotlighted how AMD’s latest innovations, including the Instinct MI350 Series accelerators, ROCm software stack, and next-generation rack-scale platforms, are being rapidly adopted by global AI and cloud leaders to fuel transformative AI development.
AMD CEO Lisa Su said: “We are entering the next phase of AI, driven by open standards, shared innovation, and AMD’s expanding leadership.”
AMD Powers AI Ambitions of Global Tech Giants
Seven of the top 10 AI model builders now rely on AMD’s Instinct accelerators in production. Industry leaders such as Meta, OpenAI, Microsoft, Oracle Cloud Infrastructure, Cohere, Red Hat, and xAI are partnering with AMD to deliver state-of-the-art AI capabilities across training, inference, and large-scale AI infrastructure.
Meta: Fueling Llama Inference with MI300X and Preparing for MI350
Meta has deployed AMD’s Instinct MI300X GPUs for inference of its Llama 3 and Llama 4 models. The company is now preparing for the next leap with MI350, citing its compelling performance-to-cost ratio and advanced memory capabilities. Meta’s partnership with AMD includes joint roadmap planning, particularly around the upcoming MI400 Series.
OpenAI: Building on MI300X and Deep Co-Design with AMD
Sam Altman, CEO of OpenAI, emphasized the value of deeply integrated hardware-software-algorithm co-optimization. OpenAI is actively using AMD Instinct MI300X on Microsoft Azure for its production workloads and GPT models, while working closely with AMD on designing the MI400 Series platform for future needs.
Microsoft: AI Production Workloads on Azure with AMD
Microsoft confirmed that the Instinct MI300X is now powering both proprietary and open-source models on Azure, a critical validation of AMD's ability to serve enterprise-grade inference at hyperscale.
Oracle Cloud Infrastructure (OCI): Pioneering Rack-Scale AI with MI355X
OCI is one of the first to deploy AMD’s open, rack-scale AI infrastructure, using the Instinct MI355X GPUs. Oracle announced plans to offer zettascale AI clusters powered by up to 131,072 MI355X GPUs, supporting training and inference for some of the most demanding AI workloads.
Cohere: Enterprise LLM Inference at Scale
Cohere’s Command LLMs are now running on AMD Instinct MI300X, enabling enterprise-grade inference with a focus on high throughput and data privacy. The company’s use of AMD hardware highlights the growing trust in AMD for sensitive, high-efficiency applications.
Red Hat: Hybrid AI with OpenShift and AMD
Red Hat announced expanded collaboration with AMD to deliver robust AI environments via OpenShift AI. AMD Instinct GPUs provide the compute muscle across hybrid cloud deployments, enabling efficient and scalable AI processing.
A Platform Built for the Next Decade of AI
AMD's newest Instinct MI350 Series GPUs deliver a 4x generational increase in AI compute and up to 35x higher inference performance. The upcoming MI400 Series promises up to 10x better inference performance on Mixture of Experts (MoE) models. These hardware advances are backed by a growing software ecosystem, including the ROCm 7 stack, which improves developer experience, framework support, and hardware compatibility.
The launch of AMD's Helios rack, powered by MI400 GPUs, "Zen 6" EPYC CPUs, and Pensando NICs, underscores the company's roadmap to lead in energy-efficient, rack-scale AI performance through 2030.
An Open, Scalable Future with Strategic Collaborations
AMD is also co-developing AI infrastructure with companies such as Hugging Face, Grok, HUMAIN, and Astera Labs. These partnerships center on open standards, scalable architectures, and developer-first tooling, reshaping how AI systems are built and deployed.
Conclusion: AMD Emerges as a Strategic AI Partner
From inference to model training, from hyperscale data centers to developer platforms, AMD is powering the world’s most advanced AI use cases. With a growing number of tech leaders embracing its GPUs and software ecosystem, AMD is not just a challenger in the AI race — it’s becoming a cornerstone for building the next generation of AI.
Rajani Baburajan