MIT explores AI solutions to reduce emissions
Techniques such as optimising training cycles, using less energy-intensive processors, and algorithmic improvements could significantly lower AI’s carbon footprint.
Rapid growth in AI data centres is driving up global energy use and emissions, prompting MIT scientists to seek ways to cut AI's carbon footprint through more intelligent computing, greater efficiency, and improved data centre design.
Innovations include trimming energy-heavy training runs, using optimised or lower-power processors, and improving algorithms to achieve the same results with fewer computations. The computations avoided in this way, dubbed ‘negaflops,’ can dramatically lower energy consumption without compromising AI performance.
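The ‘negaflops’ idea can be illustrated with a minimal sketch: if an algorithmic improvement such as weight pruning removes a fraction of a model layer's operations, the difference between the original and reduced operation counts is the energy-relevant saving. The layer sizes and sparsity level below are hypothetical, chosen only for illustration.

```python
def dense_flops(n_in: int, n_out: int) -> int:
    # Multiply-accumulate operations for one forward pass
    # through a fully connected layer (2 ops per weight).
    return 2 * n_in * n_out

def pruned_flops(n_in: int, n_out: int, sparsity: float) -> int:
    # Assume structured pruning removes a fraction of weights entirely,
    # so their computations are skipped at inference time.
    return int(dense_flops(n_in, n_out) * (1 - sparsity))

# Hypothetical 1024x1024 layer pruned to 90% sparsity.
full = dense_flops(1024, 1024)
reduced = pruned_flops(1024, 1024, sparsity=0.9)
negaflops = full - reduced  # computations avoided = energy not spent
print(full, reduced, negaflops)
```

Multiplied across billions of inference calls, computations avoided at this scale translate directly into lower data centre energy draw.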
Adjusting workloads to coincide with periods of higher renewable energy availability also helps cut emissions.
Location and infrastructure play a significant role in reducing carbon impact. Data centres in cooler climates, flexible multi-user facilities, and long-duration energy storage systems can all decrease reliance on fossil fuels.
Meanwhile, AI is being applied to accelerate renewable energy deployment, optimise solar and wind generation, and support predictive maintenance for green infrastructure.
Experts stress that effective solutions require collaboration among academia, companies, and regulators. Combining AI efficiency, smarter energy use, and clean power could cut emissions while supporting generative AI’s rapid growth.