Micron begins volume production of new chip for AI workloads

Micron Technology, a semiconductor leader, has announced volume production of its HBM3E solution, which the company says can reduce data centre operating costs by 30%. The 24GB 8-high HBM3E will be integrated into Nvidia's H200 Tensor Core GPUs, with shipments beginning in the second quarter of 2024.

Sumit Sadana, EVP and chief business officer at Micron Technology, highlighted the importance of memory bandwidth for AI workloads. The new HBM3E solution offers over 1.2 TB/s memory bandwidth with a pin speed greater than 9.2 Gb/s, catering to AI accelerators, supercomputers, and data centres.

Micron claims its HBM3E solution sets the industry standard, consuming 30% less power than competing products. The company is also introducing a 36GB 12-high HBM3E, which it says will deliver superior performance and energy efficiency over comparable offerings, with sampling set for March this year.
