Image Credits: TechCrunch/Frederic Lardinois


It’s AWS re:Invent 2024 this week, Amazon’s annual cloud computing extravaganza in Las Vegas, and as is tradition, the company has so much to announce, it can’t fit everything into its five (!) keynotes. Ahead of the show’s official opening, AWS on Monday detailed a number of updates to its overall data center strategy that are worth paying attention to.

The most important of these is that AWS will soon begin using liquid cooling for its AI servers and other machines, irrespective of whether those are based on its homegrown Trainium chips or Nvidia’s accelerators. Specifically, AWS notes that its Trainium2 chips (which are still in preview) and “rack-scale AI supercomputing solutions like NVIDIA GB200 NVL72” will be cooled this way.

It’s worth highlighting that AWS stresses that these updated cooling systems can integrate both air and liquid cooling. After all, there are still plenty of other servers in the data centers — those that handle networking and storage, for example — that don’t require liquid cooling. “This flexible, multimodal cooling design allows AWS to provide maximum performance and efficiency at the lowest cost, whether running traditional workloads or AI models,” AWS explains.

The company also announced that it is moving to simpler electrical and mechanical designs for its servers and server racks.

“AWS’s latest data center design improvements include simplified electrical distribution and mechanical systems, which enable infrastructure availability of 99.9999%. The simplified systems also reduce the potential number of racks that can be impacted by electrical issues by 89%,” the company notes in its announcement. In part, AWS is doing this by reducing the number of times the electricity gets converted on its way from the electrical grid to the server.

AWS didn’t provide many more details than that, but this likely means using DC power to run the servers and/or HVAC systems and avoiding many of the AC-DC-AC conversion steps (with their inherent losses) otherwise necessary.
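To see why cutting conversion steps matters, note that each stage loses a few percent and the losses multiply. The quick sketch below is purely illustrative — the stage counts and per-stage efficiency figures are assumptions, not AWS numbers:

```python
# Illustrative arithmetic: each AC-DC or DC-AC conversion stage loses a
# few percent of the power passing through it, and the losses compound.

def end_to_end_efficiency(stage_efficiencies):
    """Multiply per-stage efficiencies to get overall delivery efficiency."""
    eff = 1.0
    for e in stage_efficiencies:
        eff *= e
    return eff

# Hypothetical traditional path: grid AC -> UPS rectifier (AC-DC)
# -> UPS inverter (DC-AC) -> rack power supply (AC-DC), 96% each
traditional = end_to_end_efficiency([0.96, 0.96, 0.96])

# Hypothetical simplified path: a single AC-DC conversion, DC to the rack
simplified = end_to_end_efficiency([0.96])

print(f"traditional: {traditional:.1%}")  # -> traditional: 88.5%
print(f"simplified:  {simplified:.1%}")   # -> simplified:  96.0%
```

With these assumed figures, dropping two conversion stages recovers roughly 7.5 percentage points of delivered power — which at data-center scale is a substantial amount of energy.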

“AWS continues to relentlessly innovate its infrastructure to build the most performant, resilient, secure, and sustainable cloud for customers worldwide,” said Prasad Kalyanaraman, vice president of Infrastructure Services at AWS, in Monday’s announcement. “These data center capabilities represent an important step forward with increased energy efficiency and flexible support for emerging workloads. But what is even more exciting is that they are designed to be modular, so that we are able to retrofit our existing infrastructure for liquid cooling and energy efficiency to power generative AI applications and lower our carbon footprint.”


In aggregate, AWS says, the new multimodal cooling systems and upgraded power delivery systems will let it “support a 6x increase in rack power density over the next two years, and another 3x increase in the future.”
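Those multipliers compound: a 6x increase followed by another 3x works out to 18x today’s density. A quick back-of-the-envelope check (the baseline kW figure is hypothetical, not an AWS number):

```python
# AWS's stated roadmap: 6x rack power density in two years, then 3x more.
current_kw_per_rack = 40                 # assumed baseline, not an AWS figure
near_term = current_kw_per_rack * 6      # after the 6x step: 240 kW
longer_term = near_term * 3              # after the further 3x step: 720 kW
multiplier = longer_term / current_kw_per_rack
print(near_term, longer_term, multiplier)  # -> 240 720 18.0
```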

In this context, AWS also notes that it is now using AI to predict the most efficient way to position racks in the data center in order to reduce the amount of stranded or underutilized power. AWS will also roll out its own control system across the electrical and mechanical devices in its data centers, which will come with built-in telemetry services for real-time diagnostics and troubleshooting.
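AWS hasn’t described how its placement model actually works, but the underlying optimization problem — fitting racks into per-row power budgets so little provisioned capacity is stranded — can be illustrated with a toy greedy heuristic. Everything here (the worst-fit strategy, the numbers) is a hypothetical sketch, not AWS’s method:

```python
# Toy sketch of rack placement against per-row power budgets: assign each
# rack to the row with the most remaining headroom (a worst-fit heuristic),
# then measure how much provisioned power ends up stranded.

def place_racks(rack_draws_kw, row_budgets_kw):
    """Assign racks to rows; return (placements, stranded power in kW)."""
    remaining = list(row_budgets_kw)
    placements = []                     # (rack index, row index) pairs
    for i, draw in enumerate(rack_draws_kw):
        row = max(range(len(remaining)), key=lambda r: remaining[r])
        if remaining[row] < draw:
            raise ValueError(f"rack {i} ({draw} kW) does not fit anywhere")
        remaining[row] -= draw
        placements.append((i, row))
    stranded = sum(remaining)           # provisioned but unused power
    return placements, stranded

placements, stranded = place_racks([30, 25, 25, 20], [60, 60])
print(placements, stranded)  # -> [(0, 0), (1, 1), (2, 1), (3, 0)] 20
```

A production system would of course also weigh cooling capacity, network topology, and failure domains; the point is only that placement directly determines how much of the provisioned power budget goes unused.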

“Data centers must evolve to meet AI’s transformative demands,” said Ian Buck, vice president of hyperscale and HPC at Nvidia. “By enabling advanced liquid cooling solutions, AI infrastructure can be efficiently cooled while minimizing energy use. Our work with AWS on their liquid-cooled rack design will allow customers to run demanding AI workloads with exceptional performance and efficiency.”
