During its Cloud Next conference this week, Google unveiled the latest generation of its TPU AI accelerator chip.

The new chip, called Ironwood, is Google’s seventh-generation TPU and is the first optimized for inference — that is, running AI models. Scheduled to launch sometime later this year for Google Cloud customers, Ironwood will come in two configurations: a 256-chip cluster and a 9,216-chip cluster.

“Ironwood is our most powerful, capable, and energy-efficient TPU yet,” Google Cloud VP Amin Vahdat wrote in a blog post provided to TechCrunch. “And it’s purpose-built to power thinking, inferential AI models at scale.”

Ironwood arrives as competition in the AI accelerator space heats up. Nvidia may have the lead, but tech giants, including Amazon and Microsoft, are pushing their own in-house solutions. Amazon has its Trainium, Inferentia, and Graviton processors, available through AWS, and Microsoft hosts Azure instances for its Maia 100 AI chip.

Ironwood can deliver 4,614 TFLOPs of computing power at peak, according to Google’s internal benchmarking. Each chip has 192 GB of dedicated RAM with bandwidth approaching 7.4 Tbps.
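
For a sense of scale, the per-chip figure can be multiplied out across the two announced cluster sizes. This is back-of-the-envelope arithmetic based on Google's published per-chip peak number, not an official aggregate benchmark:

```python
# Sketch: aggregate peak compute for each Ironwood cluster configuration,
# using Google's stated per-chip peak of 4,614 TFLOPs.
PEAK_TFLOPS_PER_CHIP = 4_614

for chips in (256, 9_216):
    total_tflops = chips * PEAK_TFLOPS_PER_CHIP
    exaflops = total_tflops / 1_000_000  # 1 exaflop = 1,000,000 TFLOPs
    print(f"{chips:>5} chips -> {total_tflops:,} TFLOPs (~{exaflops:.2f} exaflops peak)")
```

By this rough math, the 256-chip configuration tops out around 1.2 exaflops and the 9,216-chip cluster around 42.5 exaflops of peak compute.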

Ironwood has an enhanced specialized core, SparseCore, for processing the types of data common in “advanced ranking” and “recommendation” workloads (for instance, an algorithm that suggests apparel you might like). The TPU’s architecture was designed to minimize data movement and latency on-chip, resulting in power savings, Google says.
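
To illustrate what that workload looks like, recommendation models spend much of their time on sparse embedding lookups: gathering a handful of rows from a very large table per example. The following is a hypothetical toy sketch of that pattern in NumPy, not Google's API or SparseCore itself:

```python
import numpy as np

# Toy embedding table: 100,000 items, each a 64-dimensional vector.
rng = np.random.default_rng(0)
embedding_table = rng.standard_normal((100_000, 64))

# A user interacted with only a sparse handful of item IDs (hypothetical IDs).
item_ids = np.array([12, 4_567, 89_001])

# The "sparse" work: gather those few rows and pool them into a user vector.
user_vector = embedding_table[item_ids].mean(axis=0)
print(user_vector.shape)  # (64,)
```

The gather step touches scattered memory locations rather than dense blocks, which is why this class of workload benefits from dedicated hardware support.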

Google plans to integrate Ironwood with its AI Hypercomputer, a modular computing cluster in Google Cloud, in the near future, Vahdat added.

“Ironwood represents a unique breakthrough in the age of inference,” Vahdat said, “with increased computation power, memory capacity, … networking advancements, and reliability.”

Updated 10:45 a.m. Pacific: An earlier version of this story incorrectly referred to Microsoft’s Cobalt 100 as an AI chip. In fact, Cobalt 100 is a general-purpose chip; Microsoft’s Maia 100 is an AI chip. We’ve corrected the reference.