Google is making its most powerful chip yet widely available, the search giant’s latest effort to win business from artificial intelligence companies by offering custom silicon.
Axar.az reports, citing NBC News, that the company said on Thursday that the seventh generation of its Tensor Processing Unit (TPU), called Ironwood, will hit the market for public use in the coming weeks, after being introduced in April for testing and deployment.
The chip, built in-house, is designed to handle everything from training large models to powering real-time chatbots and AI agents. By connecting up to 9,216 chips in a single pod, Google says the new Ironwood TPUs eliminate “data bottlenecks for the most demanding models” and give customers “the ability to run and scale the largest, most data-intensive models in existence.”
Google is in the midst of an ultra-high-stakes race, alongside rivals Microsoft, Amazon and Meta, to build out the AI infrastructure of the future. While the majority of large language models and AI workloads have relied on Nvidia’s graphics processing units (GPUs), Google’s TPUs fall into the category of custom silicon, which can offer advantages in price, performance and efficiency.
TPUs have been in the works for a decade. Ironwood, according to Google, is more than four times faster than its predecessor, and major customers are already lining up. AI startup Anthropic plans to use up to 1 million of the new TPUs to run its Claude model, Google said.
Alongside the new chip, Google is rolling out a suite of upgrades meant to make its cloud cheaper, faster, and more flexible, as it vies with larger cloud players Amazon Web Services and Microsoft Azure.