Samsung Electronics has successfully passed Nvidia's rigorous testing for its 8-layer HBM3E memory chips, positioning itself as a key supplier for the rapidly growing AI chip industry. HBM, or High Bandwidth Memory, is a specialized type of DRAM designed to handle massive amounts of data at lightning speeds. It is a crucial component for powering the complex computations required for artificial intelligence applications. HBM3E, the latest iteration, offers even higher performance and energy efficiency than HBM3.
Securing Nvidia's approval was a significant hurdle for Samsung, which had previously struggled with heat and power-consumption issues in its HBM chips. The company has since addressed these problems to meet Nvidia's standards. Demand for HBM is surging as AI applications grow more demanding, and Samsung, along with rivals SK Hynix and Micron, is racing to meet it.
While Samsung's 12-layer HBM3E chips are still under evaluation, the approval of the 8-layer version is nonetheless a step forward for the company. Nvidia also recently cleared Samsung's fourth-generation high bandwidth memory chips, HBM3, for the first time. However, Reuters previously reported that those chips will likely be used only in Nvidia graphics cards designed for the Chinese market.