Google Launches Ironwood TPU to Compete in AI Chip Market

Google is making its seventh-generation TPU, called Ironwood, publicly available as it aims to win more AI infrastructure business and challenge Nvidia’s dominance in model training and deployment.

By Maria Konash

Google expands access to its most advanced TPU. Photo: AS Photography / pexels.com

Google is opening public access to its most powerful Tensor Processing Unit, a move designed to strengthen its position in the rapidly expanding artificial intelligence infrastructure market. The Ironwood TPU, the company’s seventh-generation custom chip, will be available to cloud customers in the coming weeks after an initial testing phase earlier this year.

Built in-house, Ironwood is engineered to support both training and inference across large language models, real-time conversational applications, and autonomous AI systems. The chip scales to very large configurations: Google lets customers interconnect up to 9,216 TPUs within a single pod, an arrangement the company says reduces data bottlenecks and can run today’s largest computational workloads.

Major AI firms are already adopting the technology. Anthropic plans to use up to one million Ironwood TPUs to power its Claude models, signaling confidence in Google’s custom silicon and its potential cost advantages over traditional GPU solutions.

Competing in the AI Infrastructure Race

Tech giants including Microsoft, Amazon, and Meta are racing to provide the compute foundation for AI development, where Nvidia’s GPUs have long dominated. Custom silicon such as Google’s TPU line is emerging as a key differentiator, promising improvements in performance, pricing, and energy efficiency.

Google has invested in TPU technology for a decade. According to the company, Ironwood delivers more than four times the performance of the previous generation, reinforcing Google Cloud’s push to become a preferred platform for AI workloads. Alongside the chip launch, Google is also upgrading cloud capabilities to deliver lower costs and faster deployment for enterprise customers.

Cloud services remain central to Google’s AI ambitions. The company reported third-quarter cloud revenue of $15.15 billion, up 34 percent from a year earlier — outpacing growth at Amazon Web Services, though still behind Microsoft Azure’s reported 40 percent rise. Google noted it has secured more billion-dollar cloud agreements in the first nine months of 2025 than during the prior two years combined.

To support rising infrastructure demand, Google has increased its capital spending forecast for the year to $93 billion, up from its earlier projection of $85 billion.

CEO Sundar Pichai told investors that AI infrastructure remains a critical growth driver, noting strong demand for both TPU-based and GPU-based solutions. He said the company is continuing to expand its footprint to accommodate expected future requirements from enterprises deploying large-scale AI systems.

Google’s broader strategy now hinges on offering high-performance chips like Ironwood at scale, positioning its cloud as a compelling alternative for companies building and operating advanced AI models.