At its Google I/O developer conference, Google on Tuesday announced the next generation of its Tensor Processing Units (TPUs), its data center AI chips. This sixth generation of chips, dubbed Trillium, will launch later this year.

“Google was founded for this moment. We’ve been pioneering GPUs for more than a decade,” Google CEO Sundar Pichai said in a press briefing ahead of the conference.

Announcing the next generation of TPUs is something of a tradition at I/O, even as the chips only roll out later in the year. When they do arrive, though, they will sport a 4.7x boost in compute performance per chip compared to the fifth generation, according to Pichai.

In part, Google achieved this by expanding the chip’s matrix multiply units (MXUs) and by pushing the overall clock speed. Google also doubled the memory bandwidth for the Trillium chips.

What’s maybe even more important, though, is that Trillium features the third generation of SparseCore, which Google describes as “a specialized accelerator for processing ultra-large embeddings common in advanced ranking and recommendation workloads.” This, the company argues, will allow Trillium TPUs to train models faster and serve them with lower latency.
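
To make that workload concrete, here is a minimal, hypothetical sketch in JAX of the kind of embedding lookup that ranking and recommendation models revolve around. The table shape, names, and code are illustrative assumptions for this article, not Google’s SparseCore interface, which the company hasn’t detailed.

```python
# Illustrative sketch (not Google's SparseCore API): recommendation models
# spend much of their time gathering rows from very large embedding tables
# for sparse batches of item/user IDs. This is the operation class that a
# specialized embedding accelerator is meant to speed up.
import jax
import jax.numpy as jnp

# Hypothetical table shape; production tables are far larger.
vocab_size, embed_dim = 100_000, 128
table = jax.random.normal(jax.random.PRNGKey(0), (vocab_size, embed_dim))

@jax.jit
def lookup(ids):
    # Gather one embedding row per ID from the large table.
    return jnp.take(table, ids, axis=0)

batch_ids = jnp.array([3, 42, 99_999])  # a sparse batch of IDs
print(lookup(batch_ids).shape)          # (3, 128)
```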

Pichai also described the new chips as Google’s “most energy-efficient” TPUs yet, something that’s especially important as the demand for AI chips continues to increase exponentially. “Industry demand for ML compute has grown by a factor of 1 million in the last six years, roughly increasing tenfold every year,” he said. That’s not sustainable without investing in reducing the power requirements of these chips. Google promises that the new TPUs are 67% more energy-efficient than the fifth-generation chips.
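
Pichai’s two figures line up under a simple compound-growth assumption, as this quick back-of-the-envelope check shows.

```python
# Back-of-the-envelope check of the growth figures quoted above,
# assuming steady compound growth: tenfold per year for six years
# compounds to the "factor of 1 million" Pichai cites.
annual_factor = 10
years = 6
print(annual_factor ** years)  # 1000000
```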

Google’s TPUs have lately tended to come in a number of variants. So far, Google hasn’t provided any additional detail about the new chips, or how much using them will cost in Google Cloud.

Earlier this year, Google also announced that it would be among the first cloud providers to offer access to Nvidia’s next-gen Blackwell processors. That still means developers will have to wait until early 2025 to get access to those chips, though.

“We’ll continue to invest in the infrastructure to power our AI advances and we’ll continue to break new ground,” Pichai said.

We’re launching an AI newsletter! Sign up here to start receiving it in your inboxes on June 5.