IT Sustainability Think Tank: The energy challenge of AI datacentres in 2024 and beyond


Soaring enterprise demand for AI services is creating power supply and sustainability problems for datacentre operators that will need addressing in 2025

By Bob Johnson, VP analyst at Gartner

Published: 19 Dec 2024

The rise of generative AI (GenAI) is reshaping industries, but it’s also creating an unprecedented demand for energy.


Datacentres supporting AI workloads are expanding at a breakneck pace to meet the needs of increasingly complex large language models (LLMs).

However, this growth comes at a cost: by 2027, 40% of AI datacentres will face operational constraints due to power shortages, according to Gartner.

The issue isn’t just about availability. AI datacentres are forecast to increase electricity consumption by 160% within the next three years. Such a surge threatens to overwhelm utility providers, disrupt energy availability, and undermine sustainability goals as fossil fuel plants remain in operation longer to keep up with demand.
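To put that forecast in context, a 160% rise over three years compounds to roughly 37% growth per year. The short Python sketch below works through the arithmetic; the 100 TWh baseline is a placeholder assumption for illustration only, not a figure from Gartner or this article.

# Back-of-envelope: what a 160% rise over three years implies per year.
# The 160% figure comes from the article; the 100 TWh baseline is a
# placeholder assumption purely for illustration, not a Gartner number.

baseline_twh = 100.0          # assumed current annual consumption (placeholder)
total_growth = 1.60           # 160% increase over the period
years = 3

final_twh = baseline_twh * (1 + total_growth)
annual_rate = (1 + total_growth) ** (1 / years) - 1

print(f"Consumption after {years} years: {final_twh:.0f} TWh")
print(f"Implied annual growth rate: {annual_rate:.1%}")  # roughly 37.5% per year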

Powering innovation or fuelling a crisis?

The insatiable energy appetite of hyperscale datacentres is outstripping the ability of power grids to cope. AI models require immense computational power for training and operations, making 24/7 energy availability essential.

Yet renewable energy sources like wind and solar are not yet capable of providing the reliable baseline power needed without significant advancements in energy storage.

This mismatch between demand and supply has far-reaching consequences. Rising energy costs will drive up operational expenses for AI products and services, impacting organisations across industries.

Meanwhile, the concentration of datacentres in regions like Ireland and Singapore is already forcing local governments to limit their expansion due to power constraints.

Sustainability at stake

The strain on energy grids is having a knock-on effect on sustainability goals. In the short term, many datacentres will need to rely on fossil fuels, increasing their carbon footprints and delaying progress toward net-zero targets. While longer-term solutions like advanced battery storage or modular nuclear reactors hold promise, they are not yet viable at scale.

For organisations committed to sustainability, this presents a dilemma. Balancing the deployment of energy-intensive GenAI applications with environmental responsibility requires innovative approaches, such as adopting smaller language models, leveraging edge computing, and collaborating with datacentre providers to optimise energy use.

Rethinking the future of AI

The current trajectory of AI-driven innovation poses hard questions for business and IT leaders. How can growth be sustained in the face of energy constraints? What strategies will mitigate the impact of rising costs and environmental pressures? Organisations must prioritise efficiency in AI workloads, re-evaluate sustainability goals, and actively support the development of greener energy alternatives like clean hydrogen and small nuclear reactors.

As the demands of GenAI reshape the global energy landscape, success will require more than just technological prowess. It will demand foresight, collaboration, and a willingness to innovate sustainably.
