
HPE Introduces 100% Fanless Liquid Cooling System for AI Efficiency

This is the world's first 100% fanless direct liquid cooling system, designed to boost energy efficiency in large-scale AI deployments. The system reduces cooling power consumption by 37% per server blade and cuts cooling power usage by 90% compared with traditional air-cooled systems.

DQINDIA Online

Hewlett Packard Enterprise (HPE) has unveiled the industry's first 100% fanless direct liquid cooling system, designed to improve the energy efficiency of large-scale AI deployments. Announced at HPE’s AI Day, held at its cutting-edge AI systems manufacturing facility, this innovation highlights HPE’s leadership in AI infrastructure and cooling solutions.


With AI workloads demanding more power, traditional cooling methods are becoming less effective. HPE’s fanless direct liquid cooling system is a breakthrough technology, reducing cooling power consumption by 37% per server blade, resulting in lower utility costs, reduced carbon emissions, and a quieter data center environment.

HPE’s solution also delivers a 90% reduction in cooling power usage compared to traditional air-cooled systems. This architecture supports high-density AI systems, offering scalability and flexibility while occupying half the floor space of conventional setups.
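To put those percentages in perspective, the back-of-envelope sketch below (in Python, using purely hypothetical baseline figures rather than anything HPE has published) shows how a 37% per-blade and a 90% system-level reduction in cooling power would translate into annual energy use.

```python
# Illustrative only: the baseline power figures below are hypothetical
# assumptions, not measurements published by HPE.

HOURS_PER_YEAR = 24 * 365

def annual_cooling_kwh(cooling_power_watts: float) -> float:
    """Convert a steady cooling power draw (W) into annual energy use (kWh)."""
    return cooling_power_watts * HOURS_PER_YEAR / 1000

# Assume a blade whose cooling overhead is 300 W under conventional cooling.
baseline_blade_cooling_w = 300.0
fanless_blade_cooling_w = baseline_blade_cooling_w * (1 - 0.37)  # 37% lower per blade

# Assume an air-cooled rack that spends 10 kW on cooling (fans plus CRAC share).
baseline_rack_cooling_w = 10_000.0
liquid_rack_cooling_w = baseline_rack_cooling_w * (1 - 0.90)     # 90% lower overall

print(f"Per blade: {annual_cooling_kwh(baseline_blade_cooling_w):,.0f} kWh/yr -> "
      f"{annual_cooling_kwh(fanless_blade_cooling_w):,.0f} kWh/yr")
print(f"Per rack:  {annual_cooling_kwh(baseline_rack_cooling_w):,.0f} kWh/yr -> "
      f"{annual_cooling_kwh(liquid_rack_cooling_w):,.0f} kWh/yr")
```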

“Organizations embracing AI must also focus on sustainability, power efficiency, and cost reduction,” said Antonio Neri, CEO of HPE. “Our fanless direct liquid cooling architecture addresses these challenges, positioning us as a leader in energy-efficient AI deployments.”


The architecture features an 8-element cooling design, integration with high-density systems, and an open approach that accommodates various accelerators. This positions HPE to meet the rising demands of AI across industries, reinforcing its role as a global leader in energy-efficient supercomputing.

The Role of Direct Liquid Cooling in Sustainable AI Infrastructure

As AI workloads become more power-intensive, traditional cooling systems struggle to keep up with the growing energy demands. Direct liquid cooling, a more efficient method, is emerging as a key technology to sustain the rapid expansion of AI. By using liquid to directly cool critical components like GPUs, CPUs, and server blades, this approach reduces energy consumption, lowers operational costs, and supports high-density server deployments.
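The efficiency gap comes down to basic thermodynamics: a given volume of liquid can absorb far more heat than the same volume of air for the same temperature rise. The short sketch below uses textbook coolant properties; the 700 W device load and 10 K temperature rise are assumed figures for illustration, not HPE specifications.

```python
# Illustrative physics sketch: coolant properties are standard textbook values;
# the 700 W heat load and 10 K temperature rise are assumed, not HPE data.

def flow_needed_m3_per_s(heat_w: float, density: float, cp: float, delta_t_k: float) -> float:
    """Volumetric flow required to carry away heat_w watts: Q = rho * V_dot * cp * dT."""
    return heat_w / (density * cp * delta_t_k)

heat_w = 700.0   # assumed accelerator heat load (W)
delta_t = 10.0   # assumed allowed coolant temperature rise (K)

water = {"density": 997.0, "cp": 4186.0}   # kg/m^3, J/(kg*K)
air   = {"density": 1.2,   "cp": 1005.0}

water_flow = flow_needed_m3_per_s(heat_w, water["density"], water["cp"], delta_t)
air_flow   = flow_needed_m3_per_s(heat_w, air["density"], air["cp"], delta_t)

# 1 m^3/s = 60,000 L/min
print(f"Water: {water_flow * 60_000:.2f} L/min")
print(f"Air:   {air_flow * 60_000:.0f} L/min (~{air_flow / water_flow:.0f}x more volume)")
```

Because water moves thousands of times more heat per unit volume than air, liquid loops can remove the same heat without the large fan arrays and air handlers that dominate cooling power in conventional data centers.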


For organizations running large AI models, this innovative cooling method allows them to meet their computing needs while aligning with sustainability goals. As liquid cooling becomes more mainstream, it holds the potential to transform data centers by improving efficiency and reducing their environmental footprint.
