Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates


The letters AI, which stand for "artificial intelligence," are displayed at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.

Julian Stratenschulte | Picture Alliance | Getty Images

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.

Nvidia's GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn't have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

"They would take up too much data center floor space or increase water usage substantially," Brown said. "And while some of those solutions could work for lower volumes at other providers, there simply wouldn't be enough liquid-cooling capacity to support our scale."

Instead, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service through computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia's design for dense computing power. Nvidia's GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.

Computing clusters based on Nvidia's GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world's largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and has designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company's bottom line. In the first quarter, AWS delivered its widest operating margin since at least 2014, and the unit is responsible for most of Amazon's net income.

Microsoft, the second-largest cloud provider, has followed Amazon's lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.

WATCH: AWS announces latest CPU chip, will deliver record networking speed