Unlocking the Full Potential of AI with Liquid Cooling
As OpenAI continues to advance its models, the demand for data centre resources is increasing. Vivek Swaminathan, Director of Products and Solutions at Unisys, believes that liquid cooling could be the key to fully harnessing AI’s capabilities.
The latest AI models from OpenAI are revolutionizing technology, making AI more accessible and cost-effective. However, to truly democratize this technology and take full advantage of the AI revolution, data centre infrastructure needs to be upgraded.
According to the International Energy Agency, data centres accounted for roughly 1-1.3% of global electricity use in 2023. As AI is integrated across industries, data centre power demand is projected to grow by 50% by 2027 and by as much as 165% by 2030. This rapid growth will put significant pressure on existing data centre infrastructure, which must deliver ever-higher power densities while preventing overheating.
While most data centres still rely on traditional air-cooling systems, rising cooling demands have driven the emergence of liquid cooling as a more efficient and sustainable solution, one poised to become the preferred choice for data centres globally.
Challenges Faced by Data Centres Due to AI
Training and inferencing AI workloads require massive amounts of energy. As data centres take on more AI processing, the strain on graphics processing units (GPUs) intensifies as they handle increasingly complex computations.
AI training involves repeatedly analyzing large datasets and adjusting model parameters, demanding intense, sustained GPU utilization for weeks at a time, which puts systems at risk of overheating. Inferencing, meanwhile, applies trained models to real-world data and relies on GPUs for low-latency tasks such as autonomous driving or medical imaging.
The increased energy consumption can push hardware beyond recommended operating temperatures, leading to excessive wear and tear, costly maintenance, and shortened GPU lifespans, ultimately undermining the effectiveness of the data centre.
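To make that overheating risk concrete, the sketch below shows one way an operator might watch for it. It is a minimal Python example, assuming the NVIDIA Management Library bindings (pynvml) are installed; the 85°C alert threshold and once-a-minute sampling interval are hypothetical choices for illustration, not figures from this article.

```python
# Minimal sketch: poll GPU temperature and utilization with pynvml.
# The 85 C threshold is a hypothetical alert level, not a vendor figure.
import time
import pynvml

ALERT_TEMP_C = 85  # hypothetical threshold for sustained-load overheating

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    handles = [pynvml.nvmlDeviceGetHandleByIndex(i) for i in range(count)]
    for _ in range(60):  # sample once a minute for an hour of a training run
        for i, h in enumerate(handles):
            temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
            util = pynvml.nvmlDeviceGetUtilizationRates(h).gpu
            status = "ALERT" if temp >= ALERT_TEMP_C else "ok"
            print(f"GPU {i}: {util:3d}% utilization, {temp} C [{status}]")
        time.sleep(60)
finally:
    pynvml.nvmlShutdown()
```

In practice, fleet-wide telemetry does this at scale, but the principle is the same: sustained high utilization paired with rising temperatures is the early warning sign of thermal stress.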
The Advancement of Cooling Technology
As AI technology continues to evolve, its computational demands are driving innovation in data centre cooling systems. Data centres that rely on air cooling spend approximately 40% of their energy on cooling alone. Liquid cooling presents a more efficient alternative, delivering roughly a 15% improvement in Total Usage Effectiveness (TUE) and a 10% reduction in total power consumption.
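A back-of-the-envelope calculation shows what those figures imply for Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. The sketch below uses only the 40% and 10% figures quoted above; the 1 MW facility size and the simplification that all non-IT energy goes to cooling are assumptions made purely for illustration.

```python
# Back-of-the-envelope sketch using the figures quoted above.
# The 1 MW facility and the "all non-IT energy is cooling" simplification
# are hypothetical, not figures from the article.

def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_kw / it_kw

total_kw = 1_000.0               # hypothetical 1 MW air-cooled facility
cooling_kw = 0.40 * total_kw     # ~40% of energy spent on cooling (article figure)
it_kw = total_kw - cooling_kw    # simplification: everything else is IT load

air_pue = pue(total_kw, it_kw)   # ~1.67

# Liquid cooling: ~10% lower total power for the same IT load (article figure).
liquid_total_kw = 0.90 * total_kw
liquid_pue = pue(liquid_total_kw, it_kw)  # ~1.50

print(f"Air-cooled PUE:    {air_pue:.2f}")
print(f"Liquid-cooled PUE: {liquid_pue:.2f}")
```

Under these simplified assumptions, cutting total power by 10% at a constant IT load brings PUE down from roughly 1.67 to about 1.5.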
Liquid cooling can dissipate more heat and maintain the efficiency of high-density computing systems better than air cooling. Data centre operators can focus cooling efforts on specific CPUs and GPUs, optimizing thermal management and reducing the space required for cooling equipment.
While liquid cooling offers numerous benefits, it also comes with its challenges, particularly in maintenance. Unlike air-cooling systems, liquid cooling systems require intricate plumbing and careful coolant management to ensure optimal performance.
Investing in liquid cooling, and in its upkeep, can still pay off, but the maintenance commitment is real: each dollar spent on liquid cooling hardware typically incurs an annual upkeep cost of $0.30 to $0.50.
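As a quick illustration of that ratio, the sketch below estimates a multi-year cost of ownership. The $1 million hardware budget and five-year horizon are hypothetical figures chosen for the example; only the $0.30 to $0.50 annual upkeep range comes from the text above.

```python
# Illustration of the upkeep ratio quoted above; the hardware budget and
# five-year horizon are hypothetical, not figures from the article.

hardware_cost = 1_000_000               # hypothetical liquid-cooling hardware spend ($)
upkeep_low, upkeep_high = 0.30, 0.50    # annual upkeep per hardware dollar (article figure)
years = 5                               # hypothetical planning horizon

low_total = hardware_cost * (1 + upkeep_low * years)
high_total = hardware_cost * (1 + upkeep_high * years)

print(f"5-year cost of ownership: ${low_total:,.0f} to ${high_total:,.0f}")
```

Even at the high end of that range, the upkeep spend is typically weighed against the energy savings and extended GPU lifespan described above.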
The Future of AI and Liquid Cooling
AI’s demand for computational power continues to grow. As OpenAI and other organizations push GPUs to their limits, liquid cooling is no longer just an option but a crucial component of sustained AI growth.
The technology not only reduces energy consumption but also extends GPU lifespan and enhances processing speeds, paving the way for further innovation. In an AI-driven future, maintaining optimal cooling isn’t just advantageous; it’s essential for survival.