Micron CEO Sanjay Mehrotra has revealed how the semiconductor company’s AI-focused innovations have driven customer wins and revenue growth.

Revenue
Micron has reported fiscal Q2 revenue of $8.1 billion, down 8 percent sequentially but up 38 percent year over year.
DRAM revenue reached $6.1 billion, up 47 percent year over year and comprising 76 percent of total revenue. It declined 4 percent sequentially, as bit shipments fell by a high single-digit percentage while prices rose by a mid-single-digit percentage.
NAND revenue totaled $1.9 billion, up 18 percent year over year and representing 23 percent of total revenue, but down 17 percent sequentially, as bit shipments increased modestly while prices declined in the high-teens percentage range.
Compute and networking business unit revenue grew 4 percent sequentially to $4.6 billion, accounting for 57 percent of total revenue, achieving a third consecutive record quarter with a more than 50 percent sequential increase in HBM revenue.
The storage business unit generated $1.4 billion, down 20 percent sequentially, impacted by lower data center storage investments and NAND pricing trends.
Mobile business unit revenue declined 30 percent sequentially to $1.1 billion as smartphone OEMs worked through inventory adjustments.
Embedded business unit revenue stood at $1 billion, down 3 percent sequentially, primarily due to inventory optimization efforts by automotive customers.
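As a quick sanity check on the segment breakdown above, the four business units sum to the quarterly total, and their shares match the percentages quoted. A minimal sketch using the rounded figures from this article:

```python
# Sanity-check the business-unit breakdown against the $8.1 billion total.
# All figures are the rounded values quoted above, in billions of dollars.
total = 8.1

business_units = {
    "Compute and networking": 4.6,
    "Storage": 1.4,
    "Mobile": 1.1,
    "Embedded": 1.0,
}

# The four business units should sum to total revenue (within rounding).
assert abs(sum(business_units.values()) - total) < 0.05

for name, revenue in business_units.items():
    print(f"{name}: {revenue / total:.0%} of revenue")
```

Compute and networking works out to 57 percent of total revenue, matching the share reported above; storage, mobile, and embedded come to roughly 17, 14, and 12 percent respectively.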
Capex
Micron’s Q2 capital expenditures were $3.1 billion. For fiscal Q3, Micron forecasts capex of over $3 billion, and the projection for fiscal 2025 remains approximately $14 billion. The majority of the fiscal 2025 capex is to support HBM, as well as facility construction, back-end manufacturing, and R&D investments.
Micron continues disciplined AI-driven investments, focusing on HBM capacity expansion through 2026. A new HBM packaging facility in Singapore, set to expand capacity in 2027, and the Idaho DRAM fab, expected to contribute meaningful output by fiscal 2027, underscore its long-term commitment to AI memory solutions.
AI
Micron achieved record-breaking AI-driven advancements in fiscal Q2, with data center DRAM revenue reaching an all-time high. HBM revenue grew over 50 percent sequentially, surpassing $1 billion in quarterly revenue. The combined revenue from high-capacity DRAM modules and LPDRAM for data centers also exceeded $1 billion.
It remains the sole company shipping low-power DRAM in high volume for data centers. Fiscal Q3 is expected to set another revenue record, driven by AI data center demand and the HBM ramp-up, which is leading to supply constraints in non-HBM DRAM, Micron CEO Sanjay Mehrotra said.
Micron is uniquely positioned to capitalize on the transformative growth driven by AI, from the data center to edge devices, Mehrotra added.
Micron’s 1-beta DRAM leads the industry, and the launch of its 1-gamma node extends this leadership with the industry’s first 1-gamma-based D5 shipments. The 1-gamma DRAM, featuring EUV technology, delivers 20 percent lower power consumption, 15 percent higher performance, and over 30 percent improvement in bit density over 1-beta.
Advancements in computation hardware are reducing the cost of generative AI, expanding AI adoption across applications. High-performance AI processors require HBM memory for optimal efficiency, and Micron’s HBM technology is recognized as the industry leader.
Hyperscale customers project strong AI infrastructure investment in 2025, with Micron increasing its HBM TAM estimate for 2025 to over $35 billion. The company expects to match its overall DRAM market share in HBM by Q4 2025, with all 2025 HBM output already sold out and strong 2026 demand.
Micron’s HBM3E offers a 30 percent power advantage over competitors, with the 12-high variant delivering 20 percent better power efficiency and 50 percent greater memory capacity than 8-high alternatives. Volume production of HBM3E 12-high is underway, and it is expected to make up the majority of HBM3E shipments in the latter half of 2025.
Micron’s HBM3E 8-high is integrated into NVIDIA’s GB200 system, while the 12-high variant is featured in the GB300. The company began volume shipments to its third large HBM3E customer in fiscal Q2 and expects multibillion-dollar HBM revenue in fiscal 2025. Looking ahead, Micron’s HBM4, launching in 2026, will provide over 60 percent bandwidth improvement over HBM3E.
Micron continues to lead the adoption of LPDRAM in AI servers, reducing memory power consumption by two-thirds compared to D5. Its SOCAMM, developed in collaboration with NVIDIA for the GB300, enhances server manufacturability and serviceability, driving broader LP adoption.
The company is set to generate multibillion-dollar revenue in fiscal 2025 from high-capacity D5 modules and LPDRAM products. In data center NAND, short-term inventory impacts moderated demand in fiscal Q2, but shipment growth is anticipated in the coming months.
AI PCs now require a minimum of 16GB of DRAM, exceeding last year’s 12GB average, and Micron has sampled its 16Gb 1-gamma-based D5 products to PC clients.
In NAND, Micron launched the world’s fastest client SSDs, the Gen9-based 4600 performance SSDs, and completed multiple OEM qualifications for its 2650 mainstream SSDs. AI-driven advancements in mobile are accelerating demand for higher-capacity DRAM, with AI-capable flagship smartphones now featuring 12GB or more of DRAM compared to last year’s 8GB.
Micron’s 9.6Gbps LP5X DRAM enhances AI performance, delivering 20 percent more tokens per second than legacy speed grades. The Samsung Galaxy S25 Series features Micron’s LP5X DRAM and UFS 4.0 NAND, and the company has begun sampling the industry’s first mobile G9-managed NAND-based UFS 4.1 solution in 1TB densities.
AI adoption in automotive is driving significant increases in memory and storage content per vehicle. Advanced robotaxi platforms now require over 200GB of DRAM, 20x to 30x higher than the average car. Micron is well-positioned to capitalize on this trend with its industry-leading automotive memory portfolio.
The company announced the production readiness of the industry’s first automotive LP5X DRAM supporting 9.6Gbps speeds, meeting AI-driven in-vehicle performance demands. Additionally, Micron’s 4150 SSD became the first enterprise SSD to receive automotive qualification and is now sampling with customers.
Micron’s investments in AI are powering AI-enabled solutions across the PC, mobile, automotive, and data center markets while reinforcing its leadership in cutting-edge memory and storage technologies.
Baburajan Kizhakedath