Samsung Electronics’ fifth-generation high-bandwidth memory (HBM) chips, or HBM3E, have passed Nvidia’s tests for use in its AI processors. This development marks an important milestone for Samsung as it aims to compete with local rival SK Hynix in the advanced memory chip market. While a supply deal for the approved eight-layer HBM3E chips is expected soon, Samsung’s 12-layer version has yet to pass Nvidia’s tests.
HBM chips, first introduced in 2013, are a type of dynamic random access memory (DRAM) designed to save space and reduce power consumption. They are crucial for AI graphics processing units (GPUs), which must move large volumes of data at high speed. Since last year, Samsung has been working to address heat and power consumption issues in its HBM3E design to meet Nvidia’s standards.
Despite Samsung’s recent progress, it still trails behind SK Hynix, which is already shipping 12-layer HBM3E chips. Samsung’s shares rose 4.3% on Wednesday, surpassing a 2.4% increase in the broader market. Nvidia’s approval of Samsung’s HBM chips comes amid growing demand for GPUs driven by the generative AI boom, with HBM3E chips expected to become mainstream this year.
Research firm TrendForce predicts HBM3E chips will dominate the market, with SK Hynix estimating an annual growth rate of 82% in HBM chip demand through 2027. Samsung aims for HBM3E chips to account for 60% of its HBM sales by the fourth quarter, a target analysts believe is achievable if Nvidia’s final approval is secured. The HBM market is dominated by three players: SK Hynix, Micron, and Samsung.
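To put SK Hynix’s projection in perspective, 82% annual growth compounds very quickly. The snippet below is a hypothetical illustration only; the base year is an assumption, as the source does not specify one.

```python
# Hypothetical illustration of how HBM demand compounds at the 82% annual
# growth rate SK Hynix projects through 2027 (base year assumed, not sourced).
rate = 0.82

for years in (1, 2, 3):
    multiplier = (1 + rate) ** years
    print(f"After {years} year(s): demand x{multiplier:.2f}")
```

At that rate, demand roughly sextuples within three years.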
SEMI Europe, a leading semiconductor industry group, urged the EU to minimise restrictions on outbound investments in foreign chip technology. The EU is considering proposals to screen such investments, which could impact European funding in the global semiconductor, AI, and biotechnology sectors. However, no decisions are expected until 2025.
The US has already proposed rules to limit investments in China to protect national security and prevent the transfer of advanced technology. SEMI Europe argues that excessive restrictions could hinder European companies’ ability to invest and innovate, potentially compromising their competitive edge.
The organisation criticised the EU’s potential policies as too broad, suggesting they could force companies to reveal sensitive information and disrupt international research collaborations. SEMI Europe represents over 300 European semiconductor firms and institutions, including major players like ASML, Infineon, and STMicroelectronics.
In addition to outbound investment screening, the EU is advancing legislation to monitor foreign investments in critical European infrastructure and technology to address potential security risks.
Chinese tech giants, including Huawei and Baidu, and startups are stockpiling high-bandwidth memory (HBM) semiconductors from Samsung Electronics in anticipation of potential US export restrictions. The ramped-up purchasing began earlier this year, with China accounting for about 30% of Samsung’s HBM chip revenue in the first half of 2024. The stockpiling reflects China’s efforts to sustain its technological ambitions amid increasing trade tensions with the US and other Western nations, tensions that are reshaping the global semiconductor supply chain.
US authorities will soon announce an export control package, including new shipment restrictions to China’s semiconductor industry. The new package of measures will likely detail limits on access to HBM chips, although specific details and potential impacts remain unclear.
HBM chips are essential for building advanced processors, such as the Nvidia graphics processing units used for generative AI, and only three major chipmakers, SK Hynix, Samsung, and US-based Micron Technology, produce them.
Chinese demand has focused on the HBM2E model, two generations behind the latest HBM3E, which the global AI boom has left in short supply. Chinese companies, from satellite manufacturers to tech firms like Tencent, have purchased these chips. Huawei has used Samsung’s HBM2E semiconductors for its advanced Ascend AI chip, and other firms like Hawking have also placed orders.
While Chinese firms like Huawei and CXMT are making progress in developing HBM2 chips, their efforts could be hindered by the new US restrictions. Samsung may face a bigger impact from the restrictions than its rivals, as it relies more heavily on the Chinese market. SK Hynix, which is focused on advanced HBM production, has nearly sold out its HBM chips for the next two years, while Micron stopped selling its HBM products to China last year.
Intel will lay off thousands of workers in an effort to finance its recovery amid plummeting revenues and market share. While the US chipmaker remains one of the dominant players in the personal computer market, it has not kept pace with the growing demand for AI chips.
Intel’s CEO, Pat Gelsinger, has initiated huge investments to expand manufacturing capacity and improve the company’s technology. Traditionally focused on designing and producing its own chips, Intel will now strive to enter the foundry business and manufacture chips for other companies as well.
Why does this matter?
Intel’s push for innovation is vital at this juncture: even as the AI revolution has made semiconductors more important than ever, Intel’s dominance of the semiconductor industry has waned. With competitors like NVIDIA, TSMC, Qualcomm, and MediaTek emerging as industry frontrunners, Intel’s cost-cutting is a bid to reclaim its market position.
The Biden administration is set to introduce a new rule expanding US powers to block exports of semiconductor manufacturing equipment to Chinese chipmakers. However, essential allies like Japan, the Netherlands, and South Korea will be exempt, minimising the rule’s overall impact. The additional restriction follows previous export controls aimed at hindering China’s advancements in supercomputing and AI for military purposes.
The new rule will extend the Foreign Direct Product rule, preventing several Chinese semiconductor factories from receiving equipment exports from countries such as Israel, Taiwan, Singapore, and Malaysia. The rule, which has previously targeted Huawei, allows the US to block sales of products made with American technology, even if produced abroad. The exemptions highlight a diplomatic effort to maintain international cooperation while enforcing export controls.
Additionally, the US plans to tighten regulations by reducing the threshold of US content in foreign products subject to export controls.
The rule, still in draft form, is expected to be finalised next month.
While ASML and Tokyo Electron shares surged in response to the exemptions, this development underscores the need for a balanced approach to managing export controls while maintaining solid international alliances.
Samsung Electronics is making strides in developing memory chips essential for the AI market, narrowing the gap with rival SK Hynix. The company has recently received approval from Nvidia for its HBM3 memory chips and anticipates approval for its next generation, HBM3E, within months. The advancement follows months of setbacks, including development challenges and the replacement of the head of its semiconductor division.
Samsung’s efforts come as the demand for high-bandwidth memory (HBM) is expected to soar, driven by AI advancements. The HBM market is projected to grow from $4 billion in 2022 to $71 billion by 2027. Nvidia’s approval is crucial for Samsung to capitalise on this booming market and improve its revenue and market share despite still trailing SK Hynix.
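Those market figures imply a steep compound annual growth rate (CAGR). A quick back-of-the-envelope check, using only the $4 billion (2022) and $71 billion (2027) figures cited above:

```python
# Implied compound annual growth rate (CAGR) of the HBM market,
# from the figures cited in the text: $4B in 2022 to $71B in 2027.
start_value = 4.0    # market size in 2022, billions of USD
end_value = 71.0     # projected market size in 2027, billions of USD
years = 2027 - 2022  # five-year span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 78% per year
```

A rate of that magnitude is broadly consistent with the industry demand estimates cited elsewhere in this report.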
Why does this matter?
The company has faced significant engineering challenges, particularly with the thermal management of the stacked DRAM chips used in HBM. Under the leadership of Jun Young-hyun, Samsung has focused on resolving these issues and enhancing its technology. The company has also reorganised its HBM team to boost innovation and collaboration.
As Samsung progresses, it aims to ramp up production and meet the growing demand for AI memory chips. With its financial resources and production capacity, the company is well-positioned to address market shortages and secure a significant share of the lucrative AI memory market.
According to a recent research paper, Apple has opted to use Google-designed chips instead of Nvidia’s for two crucial components of its AI software infrastructure. This choice is noteworthy as Nvidia is widely regarded as the leading provider of AI processors. The paper detailed that Apple employed Google’s tensor processing units (TPUs) in large clusters, specifically 2,048 TPUv5p chips for AI models on devices like iPhones and 8,192 TPUv4 processors for server models.
The research paper did not mention any use of Nvidia chips, despite Nvidia dominating about 80% of the AI processor market through its graphics processing units (GPUs). Unlike Nvidia, which sells its GPUs directly, Google offers TPUs through its Google Cloud Platform, requiring customers to use Google’s platform for access.
Why does this matter?
Apple has begun rolling out parts of its new AI suite, Apple Intelligence, to beta users. The full extent of Apple’s reliance on Google hardware was disclosed only in this recent publication, although earlier reports had hinted at the partnership.
Apple’s engineers noted the potential for even larger, more sophisticated AI models using Google’s chips. However, Apple’s stock saw a minor decline of 0.1% to $218.24 following the research paper’s release.
SK Hynix, the world’s second-largest memory chip maker and a key Nvidia supplier, will invest 9.4 trillion won ($6.8 billion) in its inaugural chip plant in South Korea. Kim Young-sik, the company’s head of manufacturing technology, described the move as a strategic investment in response to the surge in demand for AI semiconductors. The ambitious project will involve building four state-of-the-art semiconductor plants near Seoul. Construction is expected to start in March next year, with completion slated for May 2027.
The site will span 4.2 million square meters and will house four cutting-edge chip plants and over 50 local firms in the semiconductor sector. The facility will also boast a ‘mini-fab’ research centre for processing 300-mm silicon wafers, offering local chip materials and equipment manufacturers a realistic environment to test their innovations.
Why does it matter?
It is worth noting that this new fab will sit in the Yongin Semiconductor Cluster near Seoul, where the government aims to build a large-scale chip operations complex. As such, SK Hynix’s investment will supplement the South Korean government’s efforts to sustain the country’s leadership in memory technology, which is crucial for AI applications.
Britain has initiated a new technology security partnership with India, aiming to boost economic growth and collaboration in telecom security while fostering investment in emerging technologies. The agreement will enhance cooperation on critical technologies, including semiconductors, quantum computing, and AI.
British Foreign Secretary David Lammy emphasised that this partnership would address future AI and critical minerals challenges, promoting mutual growth, innovation, job creation, and investment. Lammy made these remarks during his visit to India, where he met with Prime Minister Narendra Modi and India’s Minister for External Affairs.
Additionally, both nations have committed to closer collaboration on tackling climate change. That includes mobilising finance and advancing partnerships in offshore wind energy and green hydrogen.
Samsung’s HBM3 high-bandwidth memory chips have been approved by Nvidia for use in its AI processors, specifically the H20 chip developed for the Chinese market in compliance with US export controls. Samsung may begin supplying these chips to Nvidia starting in August.
Despite being one of the world’s largest memory chip manufacturers, Samsung has struggled to get Nvidia to certify its HBM chips. It remains unclear whether Nvidia will use Samsung’s HBM3 chips in its other AI processors or whether further testing is required. Meanwhile, Samsung’s fifth-generation HBM3E chips are still being tested against Nvidia’s standards.
Why does it matter?
AI chips require large amounts of high-speed memory, and HBM is a type of dynamic random access memory (DRAM) whose vertically stacked chip design provides the necessary speed and capacity. Although HBM was introduced in 2013, demand has risen drastically with the AI boom of recent years. Currently, only Micron, Samsung, and SK Hynix manufacture HBM chips; Nvidia has already certified HBM3 chips from Micron and SK Hynix, but supply remains short. Approving Samsung’s HBM3 chips therefore lets Nvidia expand its supply chain and help close the gap between supply and demand.