
TLDR
- Google announced its 8th-generation TPUs: the TPU 8t for training and TPU 8i for inference
- The TPU 8i delivers 80% better performance-per-dollar than the previous Ironwood chip
- Both chips were co-developed with Broadcom and designed alongside Google DeepMind
- The training chip, TPU 8t, scales to 9,600 chips and doubles Ironwood’s interchip bandwidth
- Both chips will be available to Google Cloud customers later this year
Google unveiled two new custom AI chips on Wednesday, splitting its tensor processing unit lineup into separate processors for training and inference for the first time.
Google Cloud unveiled the latest generation of its tensor processing unit, or TPU, a homegrown chip that’s designed to make AI computing services faster and more efficient https://t.co/MkGU7h2SkT
— Bloomberg (@business) April 22, 2026
The eighth-generation TPU 8t handles AI model training, while the TPU 8i is built for inference — running those models in production. Both were co-developed with Broadcom, continuing a partnership that stretches back over a decade.
The move marks a shift in strategy. Previous TPU generations handled both tasks on a single chip. Google says the rise of agentic AI — where models operate in continuous loops with little human input — demands more specialized hardware.
“With the rise of AI agents, we determined the community would benefit from chips individually specialized to the needs of training and serving,” said Amin Vahdat, Google’s SVP and chief technologist for AI and infrastructure.
The TPU 8i inference chip contains 384 megabytes of SRAM, triple the amount found in Ironwood. Google says this eliminates what it calls the “waiting room” effect, the latency that builds up when many users are hitting a model at the same time.
Inference Gets a Major Upgrade
The TPU 8i delivers 80% better performance-per-dollar compared to Ironwood. In practical terms, that means users can handle nearly twice the demand at the same cost.
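The jump from “80% better performance-per-dollar” to “nearly twice the demand at the same cost” is simple arithmetic, sketched below. The numbers are normalized illustrations of Google's claim, not measured benchmarks.

```python
# Illustrative arithmetic only: what an "80% better performance-per-dollar"
# claim implies for capacity at a fixed cloud budget.
ironwood_perf_per_dollar = 1.0   # normalized baseline
tpu8i_perf_per_dollar = 1.8      # 80% better, per Google's claim

budget = 1_000_000               # arbitrary fixed spend, in dollars

ironwood_capacity = budget * ironwood_perf_per_dollar
tpu8i_capacity = budget * tpu8i_perf_per_dollar

# Same spend buys 1.8x the serving capacity -- "nearly twice the demand".
print(tpu8i_capacity / ironwood_capacity)  # 1.8
```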

It also delivers up to two times better performance-per-watt, thanks to integrated power management that adjusts draw based on demand.
Both chips now run on Google’s Axion CPU host for the first time, allowing system-level efficiency rather than just chip-level gains.
On the training side, the TPU 8t superpod scales to 9,600 chips and 2 petabytes of high-bandwidth memory, with double Ironwood’s interchip bandwidth. Google says it can cut frontier model development time from months to weeks.
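The superpod figures imply a rough per-chip memory budget. A back-of-envelope check, assuming decimal units (1 PB = 1,000,000 GB); the per-chip number is a derived estimate, not a figure Google stated:

```python
# Back-of-envelope check of the quoted superpod specs:
# 9,600 chips sharing 2 PB of high-bandwidth memory.
chips = 9_600
total_hbm_gb = 2 * 1_000_000   # 2 PB expressed in GB (decimal units)

hbm_per_chip_gb = total_hbm_gb / chips
print(round(hbm_per_chip_gb, 1))  # 208.3 GB of HBM per chip
```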
The training chip also delivers 2.8 times the performance of the seventh-generation Ironwood at the same price point.
Who’s Already Using the Chips
Adoption is growing. Citadel Securities built quantitative research software on Google’s TPUs. All 17 U.S. Department of Energy national laboratories run AI co-scientist software on them. Anthropic has committed to using multiple gigawatts worth of Google TPU capacity.
DA Davidson analysts estimated in September that the TPU business, combined with Google DeepMind, could be worth around $900 billion.
Google does not sell TPUs externally — they are only available through Google Cloud. Nvidia remains Google’s GPU supplier, and Google confirmed it will be among the first cloud providers to offer Nvidia’s upcoming Vera Rubin platform later this year.
The new chips were also designed alongside Google DeepMind, which has used them to train Gemini models and power Search and YouTube algorithms.
Google said both the TPU 8t and TPU 8i will be generally available to cloud customers later this year.
