Google’s AI chip push spurs hope for Samsung, SK hynix

Public TV English

SEOUL: As Google emerges as a formidable new force in the artificial intelligence hardware space with its tensor processing units (TPUs), expectations are rising that surging demand for high-bandwidth memory (HBM) could benefit South Korean semiconductor giants such as Samsung Electronics and SK hynix, as per a report by Pulse, the English service of Maeil Business News Korea.

The news report, quoting industry sources, noted that Google is actively seeking to supply its TPU chips — already deployed in its in-house AI model Gemini 3 — to other Big Tech firms, including Meta, the parent company of Facebook and Instagram.

Meta is reportedly considering adopting Google’s TPUs for data centers slated to begin operation in 2027. Jointly developed by Google and US chipmaker Broadcom, TPUs are designed to power AI workloads with efficiency and speed, positioning them as a competitive alternative to Nvidia’s dominant graphics processing units.

Google’s TPUs reportedly deliver comparable or superior AI performance without relying on Nvidia’s hardware. What sets them apart, however, is their cost-efficiency.

As per the news report, industry estimates suggest that TPUs are up to 80 per cent cheaper than Nvidia’s flagship H100 GPU. While Google’s seventh-generation TPU, dubbed ‘Ironwood’, may fall short of Nvidia’s next-generation ‘Blackwell’ chips in sheer computational power, it is still considered to outperform the H200.

Nvidia has long maintained a virtual monopoly over the AI chip market, with a market share exceeding 90 per cent. However, a shift toward diversified suppliers — especially if TPUs gain broader adoption — is expected to improve profitability across the semiconductor supply chain, benefiting major memory players like Samsung and SK hynix, the news report asserted.

One key reason is that each TPU contains six to eight HBM modules, tying TPU expansion directly to HBM demand. Notably, SK hynix is already supplying fifth-generation HBM3E chips for Google’s Ironwood, and industry observers suggest that the company will likely provide 12-layer HBM3E modules for the next-generation TPU, codenamed ‘7e’.

“The rise in Google’s HBM adoption will act as a catalyst, exacerbating the current supply shortage,” said Chae Min-sook, an analyst at Korea Investment & Securities. “With both average selling prices and shipment volumes on the rise, SK hynix and Samsung Electronics are well-positioned to reap dual benefits.”

The anticipated boom in AI data centers is also expected to drive up demand for conventional DRAM products — such as DDR5 and LPDDR5 — used alongside GPUs, TPUs and central processing units, further lifting memory sales, the Pulse report noted.

At the same time, as Taiwan’s TSMC, the global foundry leader, continues to raise prices for advanced processes, Samsung’s foundry business is gaining attention as a viable alternative due to recent yield improvements in its 3-nanometer and 2-nanometer nodes, the news report added. Samsung’s capacity to offer turnkey solutions — integrating memory, foundry and advanced packaging — is also being viewed as a strategic edge.

“The expansion of Google’s AI ecosystem through TPUs could lead to a broader range of benefits for Samsung, including increased memory shipments, higher utilization rates for its cutting-edge foundry lines and even stronger sales of Galaxy smartphones powered by Gemini AI,” said Kim Dong-won, an analyst at KB Securities.

The Pulse report, quoting an anonymous industry source, added that Samsung’s upcoming fabrication plant in Taylor, Texas, which will be capable of producing chips below the 2-nanometer threshold, could present a major opportunity if the TPU market continues to expand. (ANI)
