How much water and energy does training an AI model require?

The AI industry must focus on improving its energy efficiency, according to Tom Hallam, the CEO of CarbonTrail, a carbon accounting tool powered by AI. Hallam highlights the significant amounts of energy and water consumed in training AI models such as ChatGPT.


Tom Hallam, founder and CEO of CarbonTrail, an AI-powered carbon accounting tool, says the industry needs to address its energy efficiency. Models like ChatGPT, he explained, are trained on vast bodies of text such as Wikipedia, which is transformed into probabilities that estimate how likely certain words are to follow others (a toy version of this idea is sketched below). Doing this at scale consumes massive amounts of energy and water: a precursor to ChatGPT reportedly used about 1,300 megawatt-hours of energy and 3.5 million liters of water during training.
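To make "turning text into probabilities" concrete, here is a minimal sketch of a bigram model in Python. It is an illustration only, not how ChatGPT is actually built (modern models use large neural networks trained on far bigger corpora), and the tiny corpus and function name are invented for the example.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for a real training set such as Wikipedia.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
follow_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow_counts[prev][nxt] += 1

def next_word_probabilities(word):
    """Estimate P(next word | word) from the counts."""
    counts = follow_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probabilities("the"))
# {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Real models learn these relationships across billions of documents rather than a single sentence, which is why the counting-and-updating work consumes so much computing power.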

The training takes place in data centers equipped with servers containing GPUs, which generate heat and require water for cooling. For example, a US Google data center reportedly used 12 billion liters of water for cooling in 2021. Energy is also consumed every time a trained model is used to perform a task.

Hallam highlighted the importance of clean energy sources in minimizing the carbon emissions associated with AI. He cited the gap in carbon dioxide emissions per kilowatt-hour between New South Wales (648 grams) and New Zealand (120 grams) as an example of how much the choice of grid matters, as the rough calculation below illustrates. Hallam noted, however, that the industry's awareness of the issue is growing.
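A back-of-the-envelope calculation shows why grid carbon intensity matters. It uses the figures quoted above (the roughly 1,300 MWh training run and the two emission factors); the simple energy-times-intensity formula is an assumption for illustration, not CarbonTrail's accounting methodology.

```python
# Rough illustration: emissions = energy used x grid carbon intensity.
# Figures are taken from the article; the calculation is a simplification.

TRAINING_ENERGY_KWH = 1_300 * 1_000  # ~1,300 MWh training run, in kWh

GRID_INTENSITY_G_PER_KWH = {
    "New South Wales": 648,  # grams of CO2 per kWh
    "New Zealand": 120,
}

for region, intensity in GRID_INTENSITY_G_PER_KWH.items():
    tonnes_co2 = TRAINING_ENERGY_KWH * intensity / 1_000_000  # grams -> tonnes
    print(f"{region}: ~{tonnes_co2:,.0f} tonnes of CO2")

# New South Wales: ~842 tonnes of CO2
# New Zealand: ~156 tonnes of CO2
```

On these assumptions, the same training run emits roughly five times more carbon dioxide on the New South Wales grid than on New Zealand's largely renewable one.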