
Micron Introduces HBM3E 12-high 36GB

As AI workloads continue to evolve and scale, memory bandwidth and capacity become ever more critical to system performance. The latest GPUs demand high-bandwidth memory (HBM) with greater capacity and better energy efficiency. Micron Technology, a leader in memory technology innovation, has begun volume production of its HBM3E 12-high solution and is shipping the product to key industry partners for qualification across the AI ecosystem.

Micron's HBM3E 12-high 36GB consumes significantly less power than 8-high 24GB products on the market, despite packing 50% more DRAM capacity into the package. That 36GB of capacity, a 50% increase over the current HBM3E 8-high product, is enough to run large AI models with up to 70 billion parameters (such as Llama 2) on a single processor. The added capacity avoids CPU offload and reduces GPU-to-GPU communication latency, significantly shortening time to insight.
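As a rough illustration of why the 50% capacity jump matters, the arithmetic below assumes FP16 weights (2 bytes per parameter) and an accelerator carrying eight HBM stacks; both are illustrative assumptions, not figures from the article:

```python
# Sketch: does a 70B-parameter model's weight footprint fit in
# on-package HBM? Assumptions (not from the article): FP16 weights,
# 8 HBM stacks per accelerator.
params = 70e9                  # Llama 2 70B parameter count
bytes_per_param = 2            # FP16
model_bytes = params * bytes_per_param    # 140e9 bytes of weights

stacks = 8
cap_8high = stacks * 24e9      # 192 GB with 8-high 24GB stacks
cap_12high = stacks * 36e9     # 288 GB with 12-high 36GB stacks

headroom_8high = cap_8high - model_bytes      # 52 GB spare
headroom_12high = cap_12high - model_bytes    # 148 GB spare
print(headroom_8high / 1e9, headroom_12high / 1e9)
```

Under these assumptions the weights fit either way, but the 12-high stacks nearly triple the headroom left for KV caches, activations, and larger batch sizes, which is what lets the model stay on a single processor without CPU offload.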

The HBM3E 12-high 36GB also draws less power than comparable competitor HBM3E 8-high 24GB solutions while delivering more than 1.2 TB/s of memory bandwidth at pin speeds above 9.2 Gb/s. Micron's HBM3E thus pairs maximum throughput with the lowest power consumption, a combination well suited to energy-intensive data centers.
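The bandwidth figure follows from the pin speed and the stack's interface width. The 1024-bit bus width below is the standard HBM stack interface, an assumption here since the article quotes only the end figures:

```python
# Per-stack bandwidth = pin speed x interface width / 8 bits per byte.
# The 1024-bit bus is the standard HBM stack interface width
# (assumed; the article states only the pin speed and total bandwidth).
pin_speed_gbps = 9.2           # Gb/s per pin
bus_width_bits = 1024          # standard HBM interface width

bandwidth_GBps = pin_speed_gbps * bus_width_bits / 8
print(bandwidth_GBps)          # 1177.6 GB/s at exactly 9.2 Gb/s
```

At exactly 9.2 Gb/s this works out to about 1.18 TB/s per stack; pin speeds just above 9.2 Gb/s push the figure past the quoted 1.2 TB/s.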

In addition, Micron's HBM3E 12-high includes fully programmable MBIST (memory built-in self-test) that can run at full specification speed to emulate system traffic, improving test coverage, shortening product validation, improving system reliability, and accelerating time to market.

Ecosystem support

Micron is delivering production-ready HBM3E 12-high units to key industry partners for qualification across the AI ecosystem, a significant step toward meeting the growing demand for AI infrastructure. Micron also participates in TSMC's 3DFabric Alliance, which aims to advance semiconductor and system innovation. Manufacturing AI systems is extremely complex, and integrating HBM3E requires close collaboration between memory vendors, customers, and outsourced semiconductor assembly and test (OSAT) companies.

Dan Kochpatcharin, head of TSMC's ecosystem and alliance management division, said: "TSMC and Micron have maintained a long-term strategic partnership. As part of the OIP ecosystem, we work closely together to ensure that Micron's HBM3E-based systems and CoWoS package designs support our customers' AI innovations."

Figure: Micron HBM3E

Highlights of the Micron HBM3E 12-high 36GB

Customer Qualification: Micron has delivered production-ready HBM3E 12-high units to key industry partners for qualification across the entire AI ecosystem.

Seamless Scaling: With 36GB of capacity, a 50% increase over existing HBM3E products, Micron's HBM3E 12-high easily scales the processing power of AI workloads in the data center.

Superior power efficiency: Micron's HBM3E 12-high 36GB consumes significantly less power than competing HBM3E 8-high 24GB solutions.

Superior performance: With pin speeds above 9.2 Gb/s and memory bandwidth of more than 1.2 TB/s, it delivers the fast data access required by AI accelerators, supercomputers, and data centers.

Accelerated validation: Fully programmable MBIST can run at full specification speed to emulate system traffic, improving test coverage, speeding product validation and time to market, and improving overall system reliability.

Future outlook

Micron is committed to meeting the changing needs of generative AI workloads through its leading portfolio of data center memory and storage products. Whether near memory (HBM), main memory (high-capacity server RDIMMs), or Gen5 PCIe NVMe SSDs and data lake SSDs, Micron offers market-leading solutions that help data centers respond efficiently and quickly to the growth of AI workloads.
