
NVIDIA adds liquid-cooled GPUs to lower power draw and cost in data center servers

By Vijay Anand - on 24 May 2022, 1:24pm


The all-new liquid-cooled data-center GPU accelerator. (Source: NVIDIA)

NVIDIA's supercomputers, with their advanced accelerated compute engines, already save considerable space and power compared with traditional data center servers, but rack space and power remain key operational costs. To take the next step in optimizing mainstream data center servers, NVIDIA is now announcing liquid-cooled GPU options for them.

The first of these is the NVIDIA A100 PCIe accelerator, based on the existing, well-proven Ampere GPU architecture, which slips easily into an industry-standard HGX A100 AI cloud data center server platform.

Firstly, the liquid-cooled option reduces the space the A100 PCIe accelerator occupies compared with being air-cooled. The single-slot liquid-cooled A100 GPU takes up half the space of its air-cooled counterpart, since the large heatsink array is all but gone and the external blowers mounted behind the accelerators in the rack are no longer needed. This allows NVIDIA to offer denser configurations, increasing performance for the rack space occupied, or reducing the cost of rack space for the same level of performance. NVIDIA claims this helps reduce rack space by up to 66%, a staggering figure.
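To put that density claim in perspective, here is a minimal back-of-envelope sketch in Python. The GPUs-per-rack figures are hypothetical placeholders, not numbers from NVIDIA; they are chosen only to show how halving card width and removing blowers can translate into roughly two-thirds less rack space for the same number of accelerators.

```python
# Back-of-envelope sketch of the rack-density claim, using hypothetical
# numbers (the exact GPUs-per-rack figures are not given in the article).

def racks_needed(total_gpus: int, gpus_per_rack: int) -> int:
    """Racks required to host a fixed number of accelerators."""
    return -(-total_gpus // gpus_per_rack)  # ceiling division

TOTAL_GPUS = 120              # hypothetical deployment size
AIR_COOLED_PER_RACK = 4       # hypothetical: dual-slot cards plus blower clearance
LIQUID_COOLED_PER_RACK = 12   # hypothetical: single-slot cards, no blowers

air_racks = racks_needed(TOTAL_GPUS, AIR_COOLED_PER_RACK)        # 30 racks
liquid_racks = racks_needed(TOTAL_GPUS, LIQUID_COOLED_PER_RACK)  # 10 racks

saving = 1 - liquid_racks / air_racks
print(f"Rack space saved: {saving:.0%}")  # ~67%, in line with the 'up to 66%' claim
```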

Liquid-cooled GPU roadmap for data centers; click to view larger image. (Source: NVIDIA)

Secondly, with energy efficiency being key to both carbon footprint and operating costs, Equinix, a global service provider managing numerous data centers around the world, and NVIDIA have found in separate testing that a data center using liquid cooling can run the same workload as an air-cooled facility while consuming about 30 per cent less energy.
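For readers curious where a figure like 30 per cent can come from, here is a minimal sketch comparing facility-level power draw for the same IT workload under two Power Usage Effectiveness (PUE) values. The PUE values and the IT load below are illustrative assumptions, not figures from the article or from Equinix's testing.

```python
# Minimal sketch of how a ~30% facility-level energy saving can arise,
# using illustrative PUE (Power Usage Effectiveness) values; the article
# only states the ~30% result, not the underlying figures.

IT_LOAD_KW = 1000          # hypothetical IT load running the workload
PUE_AIR_COOLED = 1.6       # illustrative PUE for an air-cooled facility
PUE_LIQUID_COOLED = 1.15   # illustrative PUE with liquid cooling

air_total = IT_LOAD_KW * PUE_AIR_COOLED        # 1600 kW total facility draw
liquid_total = IT_LOAD_KW * PUE_LIQUID_COOLED  # 1150 kW total facility draw

saving = 1 - liquid_total / air_total
print(f"Energy saved for the same workload: {saving:.0%}")  # ~28%, close to 30%
```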

Equinix is currently qualifying the A100 80GB PCIe liquid-cooled GPU for use in its data centers as part of a comprehensive approach to sustainable cooling and heat capture. The GPUs are sampling now and will be generally available this summer (Q3 2022).

NVIDIA sees power savings and density gains with liquid cooling. (Image Source: NVIDIA)

At least a dozen system makers plan to incorporate these GPUs into their offerings later this year. They include ASUS, ASRock Rack, Foxconn Industrial Internet, GIGABYTE, H3C, Inspur, Inventec, Nettrix, QCT, Supermicro, Wiwynn and xFusion.

Following the A100 PCIe liquid-cooled GPU, NVIDIA is also readying a liquid-cooled H100 PCIe, planned for early 2023 to accompany the HGX H100 server platforms debuting later this year. To recap, the NVIDIA H100 Hopper accelerated compute GPU is up to six times more powerful than the Ampere-based A100, so these will bring a massive gain in compute performance to space-constrained environments.
