SK Hynix CEO Kwak Noh-Jung announced a new vision for the company — becoming a “Full Stack AI Memory Creator” — during the SK AI Summit 2025 held in Seoul on November 3.

Kwak Noh-Jung said SK Hynix has been serving as a “Full Stack AI Memory Provider,” delivering solutions aligned with customer needs and timelines. Moving forward, the company aims to exceed customer expectations through collaboration within the AI ecosystem. “We will become a creator who builds Full Stack AI Memory as a co-architect, partner, and eco-contributor,” he said.
SK Hynix has established itself as both the global leader in the memory sector and one of the most sought-after companies to work for, a dual standing symbolized by the number “1.”
Semiconductor Memory in the AI Era
Kwak Noh-Jung highlighted that as AI adoption accelerates, data traffic continues to surge, demanding rapid hardware innovation. However, memory performance has lagged behind processor advancements, creating the so-called “Memory Wall.”
In the AI-driven world, semiconductor memory is no longer just a supporting component but a core driver of system value. Its performance requirements are rising sharply, calling for new architectural and collaborative approaches.
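To make the “Memory Wall” concrete, the sketch below walks through a simple roofline-style calculation in Python. The compute and bandwidth figures are hypothetical assumptions chosen only for illustration, not SK Hynix or vendor specifications; the point is that when processor throughput grows faster than memory bandwidth, low-intensity AI workloads end up limited by how quickly data can be moved rather than how fast it can be computed.

```python
# Back-of-envelope illustration of the "Memory Wall" (all figures are
# illustrative assumptions, not SK Hynix or vendor specifications).

peak_compute_tflops = 1000.0   # assumed accelerator peak compute, TFLOPS
memory_bandwidth_tbs = 5.0     # assumed HBM bandwidth, TB/s

# Arithmetic intensity (FLOPs per byte moved from memory) required to keep
# the compute units fully fed; workloads below this ratio are memory-bound.
required_intensity = (peak_compute_tflops * 1e12) / (memory_bandwidth_tbs * 1e12)
print(f"FLOPs required per byte moved: {required_intensity:.0f}")

# A low-intensity workload (e.g. streaming large model weights at roughly
# 1 FLOP per byte) is capped by bandwidth, not by the processor's peak.
workload_intensity = 1.0  # FLOPs per byte, assumed
achieved_tflops = min(peak_compute_tflops,
                      workload_intensity * memory_bandwidth_tbs)
print(f"Achieved throughput: {achieved_tflops:.0f} TFLOPS "
      f"({achieved_tflops / peak_compute_tflops:.1%} of peak)")
```

Under these assumed numbers, the accelerator needs roughly 200 floating-point operations per byte of memory traffic to stay busy, while the example workload reaches only a small fraction of peak compute.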
Transition to ‘Full Stack AI Memory Creator’
Until now, SK Hynix’s strategy has focused on supplying products with time-to-market precision, establishing its position as a “Full Stack AI Memory Provider.” The company now plans to go beyond supplying products and actively create new memory architectures that address customers’ performance and efficiency challenges.
As a “Full Stack AI Memory Creator,” SK Hynix will collaborate with partners across the ecosystem to design and deliver advanced memory solutions tailored for next-generation AI computing.
Full Stack AI Memory Lineup
SK Hynix introduced its upcoming AI-focused memory lineup that includes Custom HBM, AI DRAM (AI-D), and AI NAND (AI-N):
Custom HBM – Developed to maximize GPU and ASIC performance, Custom HBM integrates select processing functions into the base die of the memory stack, reducing the power consumed by data transfer and improving overall system efficiency.
AI-D (AI DRAM) – The company is developing three AI-D solutions:
AI-D O (Optimization) – Low-power, high-performance DRAM that reduces total cost of ownership and improves operational efficiency.
AI-D B (Breakthrough) – Ultra-high-capacity DRAM designed to overcome the Memory Wall through flexible capacity allocation and technologies such as CXL Memory Module (CMM) and Processing-in-Memory (PIM).
AI-D E (Expansion) – Expands DRAM use into robotics, mobility, and industrial automation.
AI-N (AI NAND) – The AI NAND series includes:
AI-N P (Performance) – Ultra-high-performance storage optimized for large-scale AI workloads.
AI-N B (Bandwidth) – Increases bandwidth through vertically stacked NAND structures.
AI-N D (Density) – High-density, power-efficient storage aimed at scaling from terabyte to petabyte levels.
Strengthening Global Partnerships
SK Hynix emphasized collaboration as key to thriving in the AI era. The company is partnering with NVIDIA on HBM and digital twin technology using NVIDIA Omniverse, and working with OpenAI to supply high-performance memory. It is also collaborating with TSMC on next-generation HBM base dies, Sandisk on High Bandwidth Flash standards, and NAVER Cloud on optimizing AI memory for data centers.
Shaping the Future of AI Memory
Kwak Noh-Jung concluded that companies fostering strong partnerships and innovation will lead the AI memory industry. “SK Hynix will continue to prioritize customer satisfaction and technological collaboration to overcome limitations and pioneer the future,” he said.

