Inspur Information, an IT infrastructure solutions provider, announced its new liquid-cooled AI server, the NF5488LA5, at the ISC High Performance 2021 Digital event.
Inspur’s NF5488LA5 is designed with liquid cold plates and supports up to eight NVIDIA A100 Tensor Core GPUs interconnected at high speed via NVSwitch. The new liquid-cooled AI server is aimed at customers who need a high-performance, energy-efficient AI server.
The NF5488LA5 is designed to meet the energy-efficiency requirements of High-Performance Computing (HPC) and Artificial Intelligence (AI) workloads. It is an updated version of Inspur’s flagship AI server, the NF5488A5.
The NF5488LA5 pairs two AMD EPYC 7003 series processors with eight NVIDIA A100 Tensor Core GPUs, fully interconnected by NVSwitch, in a 4U chassis. GPU-to-GPU communication bandwidth reaches 600 GB/s, enabling lower latency.
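That 600 GB/s figure is consistent with the A100's third-generation NVLink, where each GPU exposes 12 links at 50 GB/s each; a back-of-the-envelope check (not a breakdown taken from Inspur's announcement):

    12 \times 50\,\mathrm{GB/s} = 600\,\mathrm{GB/s}\ \text{per GPU, carried across the NVSwitch fabric}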
The liquid cold plates on the NF5488LA5 cover the CPUs, GPUs and NVSwitches. These liquid-cooled components account for 80 percent of the server's total power consumption, effectively reducing Power Usage Effectiveness (PUE) to 1.1.
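For context, PUE is total facility power divided by IT equipment power, so a value of 1.1 means cooling and other overhead add only about 10 percent on top of the IT load; a rough worked example using an illustrative 10 kW IT load (not a figure from Inspur):

    \mathrm{PUE} = \frac{P_{\text{facility}}}{P_{\text{IT}}} = \frac{10\,\mathrm{kW} + 1\,\mathrm{kW}}{10\,\mathrm{kW}} = 1.1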
The GPU cold plate is built around four water loops connected in parallel; within each loop, the coolant flows across the surfaces of the GPUs and NVSwitches in sequence, providing high-efficiency cooling for the server components that generate the most heat.
High-efficiency liquid cooling is among the major reasons the NF5488LA5 ranked No. 1 in 11 of the 16 tests in the closed data center division of the 2021 MLPerf Inference v1.0 benchmark. It was also the only submitted GPU server to run the NVIDIA A100 GPU at a 500W TDP using liquid cooling.
The Inspur NF5488LA5 can be connected to a mobile Coolant Distribution Unit (CDU). After attaching it to the RACKCDU-F008 mobile liquid-cooling CDU with quick-release connectors, customers can place the units directly in a standard air-cooled cabinet, with no need to set up primary-side cooling units or rearrange the entire cooling system in the server room.