
The race to power the future of Artificial Intelligence is heating up, and at the heart of this revolution lies High Bandwidth Memory (HBM). With AI models demanding ever-increasing data throughput, the next generation of HBM — Samsung's HBM4 — is poised to redefine performance benchmarks. At the recent SEDEX 2025 exhibition, Samsung officially unveiled its HBM4 technology, signaling an aggressive push to reclaim market leadership and challenge rivals SK Hynix and Micron.
SEDEX 2025: The Battleground for AI Memory
The 27th Semiconductor Exhibition (SEDEX 2025) in Seoul served as the epicenter for the latest advancements in memory technology. Both Samsung Electronics and SK Hynix showcased their 6th-generation HBM4 prototypes, setting the stage for an intense competition. The event underscored HBM4's role as a critical component for the upcoming wave of AI accelerators and high-performance computing.
Samsung HBM4: An Ambitious Bid for Dominance
After facing stiff competition in the HBM3E segment, Samsung is pulling out all the stops with its HBM4 offering. Their strategy is clear: leverage cutting-edge manufacturing, superior performance, and competitive pricing to win over key clients.
Key Specifications of Samsung HBM4:
- 12-Layer Stack: Samsung showcased a robust 12-layer HBM4 stack, demonstrating its capability in advanced vertical integration.
- Blazing Speeds: The showcased Samsung HBM4 modules boast an impressive 11 Gigabits per second (Gbps) pin speed. This high speed is crucial for meeting the stringent demands of next-gen AI platforms, including NVIDIA’s “Rubin” accelerators.
- Unprecedented Bandwidth: With a maximum bandwidth of 2.8 Terabytes per second (TB/s) per stack, Samsung HBM4 promises unparalleled data throughput, essential for feeding data-hungry AI processors.
- Advanced Manufacturing:
  - 4nm Logic Die: A significant differentiator for Samsung HBM4 is its internally manufactured 4nm foundry process for the logic die. Achieving over 90% yield at this advanced node signifies stability and readiness for mass production.
  - 1c DRAM Process: The memory chips themselves are built using Samsung’s 10-nanometer class, 6th-generation (1c) process, showcasing continuous innovation in DRAM technology.
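The bandwidth figures above follow directly from pin speed multiplied by interface width. A minimal sketch of that arithmetic, assuming the 2048-bit per-stack interface defined in the JEDEC HBM4 standard (the function name and constants are illustrative, not from any vendor tooling):

```python
# Assumed per JEDEC HBM4: each stack exposes a 2048-bit interface.
HBM4_BUS_WIDTH_BITS = 2048

def stack_bandwidth_tbps(pin_speed_gbps: float,
                         bus_width_bits: int = HBM4_BUS_WIDTH_BITS) -> float:
    """Per-stack bandwidth in terabytes per second (TB/s)."""
    # Each pin moves pin_speed_gbps gigabits/s; divide by 8 for
    # bits -> bytes, then by 1000 for GB/s -> TB/s (decimal units).
    return pin_speed_gbps * bus_width_bits / 8 / 1000

print(f"11 Gbps pins: {stack_bandwidth_tbps(11):.2f} TB/s")  # ~2.82 TB/s
print(f" 8 Gbps pins: {stack_bandwidth_tbps(8):.2f} TB/s")   # ~2.05 TB/s
```

This reproduces the headline numbers in the article: 11 Gbps pins yield roughly 2.8 TB/s per stack (Samsung's claim), while the 8 Gbps JEDEC baseline yields about 2 TB/s (SK Hynix's figure).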
Samsung’s aggressive stance includes courting early adopters with competitive pricing and rapidly scaling up production capacity. The company is actively pursuing NVIDIA’s approval for its HBM4 supply, a critical endorsement in the high-stakes AI market.
The Fierce Competition: SK Hynix and Micron
While Samsung HBM4 makes its grand entrance, the landscape is far from empty. SK Hynix and Micron are formidable contenders, each bringing their own strengths to the HBM4 arena.
SK Hynix: The Current Market Leader Pushes Boundaries
SK Hynix, currently holding the largest share of the HBM market, is not resting on its laurels.
- 16-Layer Stack: At SEDEX 2025, SK Hynix unveiled a remarkable 16-layer HBM4 stack, pushing the limits of vertical integration and maximizing memory density.
- Robust Performance: Their HBM4 modules deliver 8 Gbps pin speeds, aligning with the JEDEC industry standard, and offer 2 TB/s of bandwidth per stack.
- Strategic Partnerships: SK Hynix is collaborating with TSMC for its HBM4 logic die (using a 12nm-class process), leveraging external foundry expertise. They continue to rely on their proven 10-nanometer class, 5th-generation (1b) DRAM process and advanced MR-MUF (Mass Reflow Molded Underfill) stacking technology.
Micron: The Rising Challenger
Micron has quietly ascended to become the second-largest HBM supplier, demonstrating significant momentum.
- High-Performance Matching: Micron has already shipped samples of its 12-high HBM4, with modules reportedly achieving pin speeds over 11 Gbps and delivering over 2.8 TB/s of bandwidth. This directly matches the peak performance figures touted by Samsung.
- Market Demand: Micron’s entire HBM output for 2025 is reportedly sold out, with much of its 2026 capacity already pre-booked, highlighting strong customer confidence and demand.
The Road Ahead for HBM4
The HBM4 market is shaping up to be a dynamic and highly competitive space. With the insatiable demand from AI applications, each manufacturer is striving to offer superior performance, efficiency, and scalability.
Samsung, with its ambitious performance targets, advanced manufacturing processes, and aggressive market strategy, is undoubtedly a key player to watch. The coming years will reveal which company can best meet the evolving demands of the AI era and solidify its position at the forefront of high-bandwidth memory innovation.
Tags: Samsung HBM4, HBM4, AI Memory, High Bandwidth Memory, SK Hynix, Micron, SEDEX 2025, AI Accelerators, Memory Technology, Semiconductor, DRAM, NVIDIA

