Efforts to improve energy efficiency in high-performance computing for a sustainable future

The demand for high-performance computing (HPC) is increasing, driving up energy consumption and raising environmental concerns. LUMI, the most powerful supercomputer in Europe, exemplifies the computing industry's efforts to improve efficiency. Data centres currently account for 1.5% to 2% of global electricity consumption, a share expected to rise to 4% by 2030. Efficiency efforts are under way at the microchip, computer, and data-centre levels: chip manufacturers employ techniques such as on-chip sensor monitoring and specialized processors, while system-level optimizations and innovative cooling methods are also being explored. These measures aim to reduce energy consumption and environmental impact without compromising performance.

Developing sustainable, low-CO2 computing

The demand for high-performance computing (HPC) has surged due to technological advancements like machine learning, genome sequencing, and simulations. However, the increasing energy consumption of HPC and the computing industry has raised concerns about sustainability and greenhouse gas emissions.

The Finnish supercomputer LUMI, powered by hydroelectricity, is an example of the future of high-performance computing. With tens of thousands of individual processors, LUMI can perform up to 429 quadrillion calculations per second, making it the third-most powerful supercomputer in the world. Notably, LUMI also has net-negative carbon dioxide emissions, as its waste heat is used to warm homes in Kajaani, the Finnish town where it is housed. This showcases the potential for HPC systems to deliver enormous computational power while also contributing to environmental sustainability.
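
For scale (assuming, as is standard in supercomputer rankings, that "calculations" refers to floating-point operations), that figure converts to:

    \[
      429 \times 10^{15}\ \text{FLOP/s} = 429\ \text{PFLOP/s} \approx 0.43\ \text{EFLOP/s}
    \]

In other words, a little under half an exaflop of peak performance.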

The article highlights the substantial energy consumption of the computing industry, particularly data centres, which currently account for around 1.5% to 2% of global electricity consumption. This share is projected to rise to 4% by 2030, which will require the industry to focus on energy efficiency and on reducing greenhouse gas emissions.

Efforts to enhance energy efficiency occur at three levels: individual microchips, computers, and data centres. Chip manufacturers, such as AMD, use various techniques to minimize power consumption. These include on-chip sensors that monitor power draw so that tasks can be assigned where they consume the least energy, and scheduling that keeps as much of the chip as possible actively working, so that little energy is wasted on idle silicon. AMD aims to improve the energy efficiency of its most powerful chips 30-fold by 2025.
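
As a rough illustration of how such chip-level management works (a minimal sketch only; the sensor interface, power cap, and frequency ladder below are hypothetical, not AMD's actual design), a management loop might gate off idle blocks and throttle clock speed to stay under a power budget:

    # Hypothetical sketch of on-chip power management: monitor per-block sensors,
    # power-gate idle blocks, and throttle clocks to stay under a power cap.
    # Not based on any vendor's real firmware; all names and numbers are illustrative.

    POWER_CAP_WATTS = 280.0                      # assumed package power limit
    FREQ_STEPS_MHZ = [3700, 3400, 3000, 2600]    # hypothetical throttle ladder

    def read_block_power(block):
        """Stand-in for a hardware sensor read; returns watts drawn by a block."""
        return block["sensor_watts"]

    def manage_power(blocks):
        # Gate off blocks that currently have no work assigned.
        for block in blocks:
            block["gated"] = not block["has_work"]

        # Sum power of active (ungated) blocks only.
        total = sum(read_block_power(b) for b in blocks if not b["gated"])

        # Walk down the frequency ladder until estimated power fits the cap.
        scaled = total
        for freq in FREQ_STEPS_MHZ:
            scaled = total * (freq / FREQ_STEPS_MHZ[0])  # crude linear scaling
            if scaled <= POWER_CAP_WATTS:
                return freq, scaled
        return FREQ_STEPS_MHZ[-1], scaled

    if __name__ == "__main__":
        blocks = [
            {"has_work": True,  "sensor_watts": 110.0},
            {"has_work": True,  "sensor_watts": 120.0},
            {"has_work": False, "sensor_watts": 60.0},   # idle block gets gated off
        ]
        freq, watts = manage_power(blocks)
        print(f"chosen frequency: {freq} MHz, estimated power: {watts:.0f} W")

Real implementations work at much finer granularity and in hardware or firmware, but the principle is the same: measure, switch off what is idle, and slow down before the power budget is exceeded.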

Another approach to improving energy efficiency is shifting work from general-purpose CPUs to specialized chips designed for specific mathematical tasks. Graphics processing units (GPUs) have emerged as highly efficient for workloads that can be broken into many small pieces and processed simultaneously. Additionally, system-level optimizations in the wiring and layout of supercomputers can reduce the energy consumed in transmitting signals between chips. The “dragonfly topology,” which wires chips into local clusters and then links those clusters directly to one another, has been identified as a particularly efficient design for supercomputers.
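
To give a sense of why the dragonfly layout is attractive, the sketch below sizes a network using the parameterization from the original dragonfly proposal (Kim et al., 2008); the specific port counts are illustrative assumptions, not the configuration of LUMI or any particular machine:

    # Illustrative dragonfly sizing, following the parameterization in
    # Kim et al., "Technology-Driven, Highly-Scalable Dragonfly Topology" (2008).
    # p: endpoints per router, a: routers per group, h: global links per router.

    def dragonfly_size(p, a, h):
        groups = a * h + 1            # max groups, with one global link per group pair
        endpoints = p * a * groups    # total endpoints the network can host
        return groups, endpoints

    # A "balanced" dragonfly sets a = 2p = 2h; the values here are hypothetical.
    p, a, h = 8, 16, 8
    groups, endpoints = dragonfly_size(p, a, h)
    print(f"groups: {groups}, endpoints: {endpoints}")
    # -> groups: 129, endpoints: 16512

Because any two endpoints are only a few router hops apart (local, global, local), relatively few long, power-hungry links are traversed per message, which is what keeps transmission energy low.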

Data centres themselves are the third level at which efficiency can be improved. Cooling them can be particularly energy-intensive, with a modern CPU or GPU producing 500 watts or more of heat at full capacity. Liquid cooling and other innovative cooling methods are being explored to improve heat transfer and reduce energy requirements. Frontier, a supercomputer at Oak Ridge National Laboratory, achieves a remarkable power usage effectiveness (PUE) ratio of 1.03 thanks to its liquid cooling system. LUMI, located near the Arctic Circle, takes advantage of cool sub-Arctic air to reach a PUE of 1.02.
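
PUE is simply the ratio of everything a facility draws from the grid to what the computing hardware itself consumes:

    \[
      \text{PUE} = \frac{\text{total facility energy}}{\text{IT equipment energy}}
    \]

A PUE of 1.03 therefore means that for every 100 kWh delivered to processors, memory, and networking, only about 3 kWh extra are spent on cooling, power conversion, and other overhead; a PUE of 1.02 trims that overhead to roughly 2 kWh. Typical data centres sit well above these figures.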

The article concludes that efforts to improve energy efficiency in high-performance computing systems are crucial to meet growing demand while reducing environmental impact. These efforts encompass advances at the microchip level, system-level optimizations, and innovative cooling methods. The examples of LUMI and Frontier demonstrate that efficiency gains can be achieved without compromising performance. By striving for greater efficiency, the computing industry aims to address sustainability concerns and contribute to a more environmentally friendly future.

Source: The Economist