Samsung Intros 12-Stack HBM3E DRAM, Setting Standard in Memory Technology

Samsung Electronics has announced HBM3E 12H, the industry’s first 12-stack HBM3E DRAM and the highest-capacity HBM product to date.
Samsung HBM3E 12H delivers bandwidth of up to 1,280 gigabytes per second (GB/s) alongside an industry-leading capacity of 36 gigabytes (GB). Both figures are improvements of more than 50 percent over the preceding 8-stack HBM3 8H.
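The headline figures can be sanity-checked with simple arithmetic. This is a sketch only: the announcement states the 1,280 GB/s and 36 GB totals, while the 1,024-bit interface width is the standard JEDEC HBM configuration and the per-pin data rate and per-die capacity below are inferred from those totals, not quoted from Samsung.

```python
# Decomposing the HBM3E 12H headline figures.
# Assumption: a 1,024-bit interface per stack (standard for JEDEC HBM).
BUS_WIDTH_BITS = 1024        # data pins per HBM stack
PER_PIN_GBPS = 10.0          # inferred per-pin rate: 1,280 GB/s * 8 / 1,024

bandwidth_gb_s = PER_PIN_GBPS * BUS_WIDTH_BITS / 8
print(bandwidth_gb_s)        # 1280.0 GB/s, matching the announced figure

# Capacity: 12 stacked DRAM dies totalling 36 GB implies 3 GB per die,
# i.e. 24-gigabit dies (inferred, not stated in the announcement).
dies = 12
gb_per_die = 36 / dies
print(gb_per_die)            # 3.0 GB per die
```

The same arithmetic applied to the 8-stack HBM3 8H (24 GB, roughly 819 GB/s) shows why both metrics rise by over 50 percent: four more dies and a faster per-pin rate compound.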

“HBM3E 12H solidifies Samsung’s commitment to pioneering core technologies for high-stack HBM and to maintaining technological leadership in the high-capacity HBM market in the era of AI,” said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics.

The HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF) technology, which allows the 12-layer stack to meet the same height specification as 8-layer parts and so comply with existing HBM package requirements. The approach also mitigates chip die warping, a risk that grows as dies become thinner in higher stacks.

Samsung has also reduced the thickness of its NCF material, achieving the industry’s smallest gap between chips at seven micrometers (µm) while eliminating voids between layers. These improvements raise vertical density by more than 20 percent compared with the HBM3 8H.

In addition, Samsung’s TC NCF technology improves the thermal properties of the HBM stack by allowing bumps of different sizes between chips. During chip bonding, smaller bumps are used in signaling areas, while larger ones are placed where heat dissipation is needed. This approach also helps improve product yield.

As AI applications continue to grow rapidly, the HBM3E 12H is positioned as an optimal solution for future systems that require more memory. Its higher performance and capacity will let customers manage resources more flexibly and reduce the total cost of ownership (TCO) of data centers.

Adoption of the HBM3E 12H in AI applications is projected to boost AI training speeds by 34 percent on average compared to the utilization of HBM3 8H and expand the number of simultaneous users of inference services by over 11.5 times.

Samsung, a leader in memory technology, has initiated the sampling phase for its HBM3E 12H, with mass production slated for the first half of the current year.
