Anthropic has agreed a multi-billion-dollar deal with Google that will give it access to up to one million tensor processing units (TPUs) on Google Cloud. The chips will be used to train and run Anthropic’s large language models.
The company said the agreement is worth “tens of billions of dollars” and is expected to bring more than a gigawatt of computing capacity online in 2026.
The announcement follows a strong year for Anthropic. The number of customers with annual run-rate revenue above $100,000 grew nearly sevenfold over the past year, reflecting rising demand for its AI services.
Krishna Rao, Anthropic’s chief financial officer, said the deal builds on the company’s existing partnership with Google Cloud. He cited the TPUs’ cost, performance, and energy efficiency, along with the strong relationship between the two companies, as factors in the decision.
Google’s TPUs are custom chips built for machine-learning workloads. They are typically more energy-efficient than the Nvidia GPUs used by most AI companies, including OpenAI. The deal may also bolster Google’s challenge to Nvidia’s dominance in AI hardware.
Thomas Kurian, chief executive of Google Cloud, said the company is “continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh-generation TPU, Ironwood.”
Anthropic does not rely solely on Google, however. Its multi-vendor approach also includes Nvidia GPUs and Amazon’s Trainium processors, and Amazon remains its primary training partner and cloud provider.