Micron Technology has announced that it has begun volume production of its HBM3E (High Bandwidth Memory 3E) solution.
This memory technology will feature in NVIDIA H200 Tensor Core GPUs, slated to hit the market in the second quarter of 2024. Micron’s advancement places it at the forefront of the industry, poised to bolster artificial intelligence (AI) solutions with HBM3E’s unparalleled performance and energy efficiency.
Micron’s latest innovation promises:
Superior Performance: With pin speeds exceeding 9.2 gigabits per second (Gb/s), Micron’s HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth (a quick check of this figure follows the list below). That headroom ensures rapid data access for AI accelerators, supercomputers, and data centers.
Exceptional Efficiency: Micron’s HBM3E consumes roughly 30% less power than competing offerings. As AI demand and usage grow, this delivers maximum throughput at minimal power draw, improving key data center operating-expense metrics.
Seamless Scalability: With an initial capacity of 24 GB, Micron’s HBM3E lets data centers scale their AI applications seamlessly. Whether training expansive neural networks or accelerating inference tasks, Micron’s solution provides the necessary memory bandwidth.
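The bandwidth claim above follows directly from the per-pin data rate. The minimal sketch below works through that arithmetic, assuming the 1024-bit per-cube data interface defined for the HBM3 generation; the interface width is not stated in the announcement.

```python
# Back-of-the-envelope check of the quoted bandwidth figure.
# Assumption (not stated in the article): each HBM3E cube exposes a 1024-bit
# data interface, as in the JEDEC HBM3 generation.
PIN_SPEED_GBPS = 9.2          # per-pin data rate in Gb/s, from the announcement
INTERFACE_WIDTH_BITS = 1024   # assumed data interface width per cube

bandwidth_GBps = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS / 8   # gigabytes per second
print(f"~{bandwidth_GBps / 1000:.2f} TB/s per cube")         # ~1.18 TB/s at exactly 9.2 Gb/s
# Pin speeds only slightly above 9.2 Gb/s push this past 1.2 TB/s, consistent
# with the "exceeding 9.2 Gb/s" and "over 1.2 TB/s" figures quoted above.
```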
Sumit Sadana, Chief Business Officer at Micron Technology, said: “Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile.”
Sadana also emphasized the crucial role of memory bandwidth and capacity in AI workloads, underscoring Micron’s readiness to support the anticipated growth in AI through its industry-leading HBM3E and HBM4 roadmap, along with its full portfolio of DRAM and NAND solutions tailored for AI applications.
Micron’s development of this groundbreaking HBM3E design leverages its 1-beta process technology, advanced through-silicon vias (TSVs), and other innovative packaging solutions. A proven leader in memory for 2.5D/3D stacking and advanced packaging technologies, Micron collaborates with TSMC’s 3DFabric Alliance, contributing to the evolution of semiconductor and system innovations.
Micron is extending its leadership by sampling 36 GB 12-high HBM3E in March 2024. This next-generation part is expected to deliver more than 1.2 TB/s of bandwidth while maintaining superior energy efficiency compared with competing offerings.
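For context, the capacity step from the initial 24 GB cube to the 36 GB part follows from the stack height. The sketch below assumes 24 Gb (3 GB) DRAM dies and an 8-high configuration for the 24 GB cube; neither detail is stated in the article.

```python
# Cube capacity as a function of stack height, assuming 24 Gb (3 GB) dies.
DIE_CAPACITY_GB = 24 / 8   # 24 Gb per die = 3 GB (assumed, not stated in the article)

for stack_height, note in ((8, "initial 24 GB part"), (12, "36 GB part sampling in March 2024")):
    print(f"{stack_height}-high: {stack_height * DIE_CAPACITY_GB:.0f} GB  ({note})")
```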
Micron will showcase its AI memory portfolio and roadmaps as a sponsor at NVIDIA GTC, a global AI conference commencing on March 18.