Samsung and AMD Partner on Next-Gen HBM4 AI Memory


Samsung Electronics and AMD have signed a new agreement to co-develop next-generation High Bandwidth Memory (HBM4), targeting the rapidly growing AI and data center market.

What the Deal Covers

The partnership focuses on advancing HBM4 memory, which is expected to deliver significantly higher bandwidth, improved power efficiency, and better thermal performance compared to current HBM3 and HBM3E standards.

HBM is a critical component in AI accelerators and GPUs, enabling faster data transfer between memory and compute units. With AI workloads increasing in complexity, memory performance has become a key bottleneck.

Why HBM4 Matters

HBM4 is expected to push memory bandwidth beyond current limits, supporting next-generation AI models, large-scale training systems, and high-performance computing environments.

Key expected improvements:

  • Higher data transfer speeds
  • Lower latency
  • Improved energy efficiency
  • Increased stacking capacity

This makes HBM4 essential for future AI chips, especially as demand for generative AI and large language models continues to scale.
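To see why the improvements above matter, peak per-stack bandwidth scales roughly with interface width times per-pin data rate. The sketch below illustrates that relationship; the specific width and pin-rate figures are illustrative assumptions, not confirmed vendor or JEDEC specifications:

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# Illustrative figures only (approximate, for comparison):
hbm3e = peak_bandwidth_gbs(1024, 9.6)  # ~1229 GB/s per stack
hbm4 = peak_bandwidth_gbs(2048, 8.0)   # ~2048 GB/s per stack

print(f"HBM3E: ~{hbm3e:.0f} GB/s per stack")
print(f"HBM4:  ~{hbm4:.0f} GB/s per stack")
```

The takeaway: even at a similar or lower per-pin speed, a wider interface can substantially raise aggregate bandwidth, which is the kind of gain next-generation AI accelerators depend on.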

Strategic Impact

Samsung is aiming to strengthen its position in the HBM market, where it faces strong competition from SK Hynix, currently a leading supplier of HBM used in AI GPUs.

For AMD, the collaboration ensures early access to next-gen memory technology, which can be integrated into its future AI accelerators and data center GPUs.

The deal also signals a deeper collaboration between memory manufacturers and chip designers, as AI hardware development becomes more tightly integrated.

Industry Context

The semiconductor industry is seeing a shift where memory is no longer just a supporting component but a critical performance driver for AI systems.

As companies like NVIDIA, AMD, and others push for more powerful AI chips, the demand for advanced memory solutions like HBM4 is accelerating.

At the same time, supply constraints and competition in the HBM segment are intensifying, making partnerships like this strategically important.

What Comes Next

HBM4 is still under development, with commercialization expected in the coming years. Early adoption will likely be seen in high-end AI accelerators and data center hardware.

The Samsung–AMD collaboration positions both companies to compete more aggressively in the AI infrastructure space as demand continues to surge.


Source: Reuters – Samsung Electronics and AMD sign MoU on AI memory and explore foundry partnership
