Google is secretly working on Torch TPU, a project that could break Nvidia’s grip on AI chips
Google is reportedly planning to reduce Nvidia’s dominance in AI hardware. The company is said to be working with Meta to make its TPUs more compatible with PyTorch, a widely used AI framework. The project is called “Torch TPU” internally.


Google is reportedly ready to collaborate with Meta to reduce Nvidia’s dominance in AI computing. According to Reuters, the tech giant is planning to make its Tensor Processing Units (TPUs) more compatible with Meta’s PyTorch, the most widely used AI development framework. This could, in turn, allow more companies to use TPUs instead of relying on Nvidia’s graphics processing units (GPUs), which have become the backbone of large-scale AI and machine learning models around the world.
Why does Google need Meta’s help?
Google’s TPUs are a major driving force behind the company’s cloud business. However, these chips are optimized for Google’s own JAX framework rather than PyTorch. PyTorch is an open-source AI development framework released by Meta in 2016 and is reportedly the most widely used AI framework among developers. But PyTorch, in turn, is better optimized for Nvidia’s GPUs. This mismatch has created significant friction for companies interested in using alternative chips, such as Google’s TPUs.
How will Google and Meta reduce the industry’s dependence on Nvidia?
Google’s new initiative, known internally as “Torch TPU,” is the answer to this problem. By improving PyTorch support, Google hopes to let organizations move their AI workloads from Nvidia GPUs to TPUs without major code rewrites. According to the report, Torch TPU is receiving increased organizational focus and resources in a bid to undercut Nvidia. To make the switch easier for companies, Google is also considering open-sourcing parts of the software.
Meta is reportedly playing a central role in this effort. The two companies are collaborating closely on “Torch TPU” as Meta explores adopting TPUs in deals potentially worth billions of dollars. The partnership could help Google expand TPU adoption while Meta reduces its dependence on Nvidia.
A Google spokesperson said the move will give Google customers more choice in AI computing. “Our focus is on providing developers with the flexibility and scale they need, no matter what hardware they want to build,” the spokesperson told Reuters.
Previously, Google kept most of its TPUs for internal use. This changed in 2022, when oversight of TPU sales was handed to Google Cloud, leading to increased production and a push to win more AI workloads from external customers.
Nvidia remains the biggest player in AI hardware. Last month, OpenAI signed a $38 billion deal with Amazon Web Services (AWS) to harness the latter’s computing power, which runs on Nvidia GPUs. The AI boom has made Nvidia one of the world’s most valuable companies, with its market capitalization surpassing $4 trillion earlier this year.

