Micron Technology began volume shipments of its first HBM4 memory module, a 36‑GB 12‑high configuration, on March 16, 2026. The launch marks the company’s entry into the HBM4 generation with a product that delivers more than 2.8 TB/s of bandwidth and 20 % better power efficiency than the preceding HBM3E.
The 36‑GB 12‑high module offers a 2.3‑fold increase in bandwidth over HBM3E, while the sampled 48‑GB 16‑high variant adds a further 33 % capacity per placement. These advantages position Micron to supply memory for next‑generation NVIDIA Vera Rubin GPUs and complement its PCIe Gen6 SSDs, reinforcing the company’s strategy of shifting from commodity DRAM to high‑margin, AI‑focused products.
Micron’s HBM4 launch comes at a time when the high‑bandwidth memory market is expanding rapidly. The company’s 2026 HBM capacity is already sold out, and demand from data‑center and AI workloads continues to grow. The move to HBM4 is expected to drive margin expansion, as the product’s higher price point and improved efficiency translate into stronger profitability.
"The next era of AI will be defined by tightly integrated platforms developed through joint engineering innovations across the ecosystem. Our close collaboration with NVIDIA ensures that compute and memory are designed to scale together from day one," said Micron executive vice president and chief business officer Sumit Sadana.
"At the heart of this is Micron's HBM4, the engine of AI, delivering unprecedented bandwidth, capacity and power efficiency. With HBM4 36GB 12H, alongside the industry's first SOCAMM2 and Gen6 SSD now in high‑volume production, Micron's memory and storage form a core foundation that unlocks the full potential of next‑generation AI," added Sadana.