New fanless cooling technology enhances energy efficiency for AI workloads by achieving a 90% reduction in cooling power consumption

A person standing in front of a rack of servers inside a data center
(Image credit: Shutterstock.com / Gorodenkoff)

  • New HPE fanless cooler cuts cooling power per server blade by 37%
  • The system uses direct liquid cooling, perfect for AI technologies
  • The architecture is designed to scale with business needs

Hewlett Packard Enterprise (HPE) recently hosted its AI Day 2024 event, introducing the industry’s first 100% fanless direct liquid cooling architecture.

As artificial intelligence (AI) technologies continue to evolve, power consumption in next-generation accelerators has increased, surpassing the capabilities of traditional air-cooling methods.

Organizations running large-scale AI workloads are now searching for more efficient ways to manage their infrastructure’s energy demands. HPE has pioneered direct liquid cooling, which has become one of the most effective methods for cooling high-performance AI systems; the approach has allowed the company to deliver seven of the top 10 most energy-efficient supercomputers on the Green500 list.

100% fanless direct liquid cooling addresses cooling challenges in AI systems

The new cooling system is designed to improve efficiency in several key areas, with HPE saying the fanless architecture reduces cooling power consumption by 90% compared to traditional air-cooling systems and offers significant environmental and financial advantages.

The system is built on four core elements. First, the system uses a comprehensive cooling design featuring an 8-element system that cools the GPU, CPU, server blade, local storage, network fabric, rack, cluster, and coolant distribution unit (CDU).

Second, the fanless cooler offers high-density performance, supporting compact configurations backed by rigorous testing, monitoring software, and on-site services to ensure smooth deployment.

Third, the system uses an integrated network fabric that enables large-scale connectivity while reducing costs and power consumption, making for a more sustainable architecture. Lastly, the architecture runs on an open system design that offers flexibility by supporting various accelerators, allowing organizations to select the solutions that best suit their needs.

The fanless architecture reduces cooling power consumption by 37% per server blade compared to hybrid liquid-cooled systems, not only lowering utility costs but also reducing carbon emissions and eliminating data center fan noise. Furthermore, the design allows for higher server cabinet density, helping organizations cut floor space requirements in half.
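To put the claimed percentages in context, here is a minimal back-of-the-envelope sketch. The IT load and the air-cooling overhead fraction below are assumed example figures, not HPE data; only the 90% reduction factor comes from the article.

```python
# Illustrative estimate of cooling-power savings under the article's
# claimed 90% reduction vs. traditional air cooling.
# The baseline numbers are assumptions for this example only.

def cooling_power(it_load_kw: float, cooling_fraction: float) -> float:
    """Cooling power (kW) modeled as a fraction of the IT load."""
    return it_load_kw * cooling_fraction

it_load_kw = 1000.0          # assumed: 1 MW of IT load
air_cooling_fraction = 0.40  # assumed: air cooling draws 40% of IT load

air_kw = cooling_power(it_load_kw, air_cooling_fraction)
fanless_kw = air_kw * (1 - 0.90)  # claimed 90% reduction

print(f"Air-cooled:     {air_kw:.0f} kW")
print(f"Fanless liquid: {fanless_kw:.0f} kW")
```

Under these assumed figures the cooling overhead drops from 400 kW to 40 kW, which is where utility-cost and carbon savings of that scale would come from.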

“As organizations embrace the possibilities created by generative AI, they also must advance sustainability goals, combat escalating power requirements, and lower operational costs,” noted Antonio Neri, President and CEO of HPE.

“The architecture we unveiled today uses only liquid cooling, delivering greater energy- and cost-efficiency advantages than the alternative solutions on the market. In fact, this direct liquid cooling architecture yields a 90% reduction in cooling power consumption compared to traditional air-cooled systems.”


Efosa Udinmwen
Freelance Journalist

Efosa has been writing about technology for over 7 years, initially driven by curiosity but now fueled by a strong passion for the field. He holds both a Master's and a PhD in sciences, which provided him with a solid foundation in analytical thinking. Efosa developed a keen interest in technology policy, specifically exploring the intersection of privacy, security, and politics. His research delves into how technological advancements influence regulatory frameworks and societal norms, particularly concerning data protection and cybersecurity. Upon joining TechRadar Pro, in addition to privacy and technology policy, he is also focused on B2B security products.