TMTPOST -- Nvidia Corp. pushed back against market concerns over rising competition from Google's artificial intelligence (AI) chips, asserting in a rare public statement that its technology remains "a generation ahead of the industry." The chipmaker's response came as investor anxiety mounted over whether its dominance in AI infrastructure could face meaningful erosion from Google's tensor processing units (TPUs).

The statement on X followed a 2.6% decline in Nvidia shares on Tuesday after The Information reported that Meta Platforms is considering using Google's TPUs in its data centers by 2027. The potential deal would mark a significant validation for Google's in-house chips and intensify competition in a market where Nvidia controls more than 90% of AI chip sales.
"We're delighted by Google's success — they've made great advances in AI and we continue to supply to Google," Nvidia said. "Nvidia offers greater performance, versatility, and fungibility than ASICs [ application specific integrated circuits ] , which are designed for specific AI frameworks or functions." The company emphasized that its platform "runs every AI model and does it everywhere computing is done."
Google responded that it is "experiencing accelerating demand for both our custom TPUs and Nvidia GPUs [graphics processing units]," adding that it remains "committed to supporting both, as we have for years." The search giant's latest Gemini 3 AI model, which received strong reviews earlier this month, was trained entirely on TPUs rather than Nvidia GPUs.
Rare Public Defense After Stock Decline
Nvidia's decision to publicly address competition concerns represents an unusual move for the company. The statement came after shares fell as much as 7% before recovering to close 2.6% lower on Tuesday. Google parent Alphabet rose 1.6%, following a more than 6% rally on Monday.
The Information's report indicated Meta may spend billions of dollars on TPUs for its data centers in 2027 and could rent TPU capacity from Google Cloud next year. Meta currently ranks among Nvidia's largest customers, with projected AI infrastructure capital expenditures between $70 billion and $72 billion this year.
Nvidia CEO Jensen Huang addressed TPU competition on an earnings call earlier this month, noting that Google remains a customer for his company's GPU chips and that Gemini can operate on Nvidia's technology. Huang disclosed in October that Nvidia has visibility into $500 billion in revenue from its Blackwell and Rubin AI platforms through 2026.
How Google's TPUs Challenge Nvidia's GPUs
Google introduced its first-generation TPU in 2016 for internal workloads and began offering the chips to outside customers through its cloud computing business in 2018. The chips were built specifically for the matrix multiplication operations central to training neural networks, making them more specialized but also more power-efficient than Nvidia's GPUs for certain AI workloads.
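To make that specialization concrete, the short JAX sketch below (an illustration added here, not from the article) shows the kind of dense matrix multiply that TPUs are built to accelerate; the same code compiles for TPU, GPU, or CPU backends, and the matrix shapes are arbitrary assumptions.

```python
# Minimal sketch, assuming a JAX install with some accelerator backend available:
# a dense matrix multiplication of the kind TPUs were designed to speed up.
import jax
import jax.numpy as jnp

# Report which backend JAX found (TPU, GPU, or CPU); the code below runs on any of them.
print(jax.devices())

# Two random matrices standing in for a neural-network layer's activations and weights.
key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
activations = jax.random.normal(k1, (1024, 4096))
weights = jax.random.normal(k2, (4096, 1024))

# jit-compile the multiply; on a TPU backend, XLA lowers this operation onto the
# chip's dedicated matrix-multiply hardware.
matmul = jax.jit(jnp.matmul)
output = matmul(activations, weights)
print(output.shape)  # (1024, 1024)
```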
Unlike Nvidia, Google doesn't sell TPU chips directly to other companies. Instead, it uses them for internal tasks and allows companies to rent computing power through Google Cloud. The company has recently expanded this approach, pitching some customers, including Meta and major financial institutions, on using TPUs in their own data centers through a program called TPU@Premises.
TPUs represent a form of ASIC, customized for particular tasks. Google has told customers that TPUs can help meet higher security and compliance standards for sensitive data and could benefit high-frequency trading firms running AI models in their facilities. The company's latest TPU version, Ironwood, unveiled in April, features liquid cooling and comes in configurations of up to 9,216 chips.
Current TPU customers include Anthropic, Salesforce, and Midjourney. Under a deal unveiled in October, Anthropic gained access to as many as 1 million TPUs, though the AI startup also announced a major deal with Nvidia weeks later, underscoring that customers seek diversified chip supply rather than complete replacement of Nvidia technology.

