
Powering the AI Race: The Energy Crises Demanding the World's Attention in 2026

“Global electricity demand from data centers is set to more than double over the next five years, consuming as much electricity by 2030 as the whole of Japan does today” – Dr Fatih Birol, Executive Director, International Energy Agency (IEA), April 2025


AI this, AI that. For a generation inheriting both the promise of AI and its consequences, understanding this collision isn't optional; it's essential.


Every time you ask ChatGPT to help with your math homework or generate cat pictures, a data center somewhere in the world lights up. Behind every interaction between you and the AI is a vast network of servers, cooling systems, and power infrastructure consuming electricity at a scale most people never stop to consider. I mean, it's understandable to a certain extent; all I'd ever want from my AI is to help me with my assignments so I can spend the rest of my day relaxing.


While researching this topic, it occurred to me, and even made me feel guilty for a second, how costly a single search is to the environment. If you're human like me, I'd assume the environmental consequences of AI weren't the first thing on your mind either. But it's important that our generation understands the long-term consequences, which this article aims to clear up. With the tools we have, we are on the front lines of solving this problem as AI moves from a niche technology to a foundational part of modern life, and as the energy it demands becomes one of the most urgent challenges of our time.


What is Artificial Intelligence (AI) and why does it matter?


AI refers to computer systems designed to perform tasks that traditionally require human intelligence: understanding language, recognizing patterns, making decisions, and generating creative content. Models like ChatGPT, Perplexity (GOAT), Claude, and Gemini are now used to automate customer service, medical diagnostics, financial analysis, and scientific research at a scale never seen before.


Global data center investment has nearly doubled since 2022, reaching half a trillion dollars in 2024. Microsoft, Google, Amazon, and Meta collectively committed over $320 billion to AI infrastructure spending in 2025 alone. AI is already proving effective in industries like energy, where it optimizes grid management and predicts infrastructure failures before they occur. Its potential is enormous, but so is the price of admission.


Why does it need so much energy?


There are two phases for this:


  • Training: Building an AI model from scratch requires processing massive datasets, which demands continuous computation. Research cited by Deloitte found that training LLMs (Large Language Models) with more than 175 billion parameters takes between 324 MWh and 1,287 MWh of electricity per training run. This is just one run, but models are frequently retrained as they are updated and improved, so the megawatt-hours add up quickly.

  • Inference: Every query, generated image, and recommendation draws a fresh burst of computing power. Deloitte estimates that a single generative AI prompt request consumes 10 to 100 times more electricity than a standard internet search. At billions of queries per day, this adds up at a staggering rate.
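To see how these two phases compare, here is a quick back-of-envelope sketch using the Deloitte figures above. The per-search energy (0.3 Wh) and the daily query volume (one billion) are my own illustrative assumptions, not figures from the sources cited here:

```python
# Back-of-envelope: daily inference energy vs. one training run,
# using the Deloitte figures quoted above. SEARCH_WH and
# QUERIES_PER_DAY are illustrative assumptions, not measured values.

TRAINING_RUN_MWH = 1_287          # upper Deloitte estimate for a 175B+ parameter run
SEARCH_WH = 0.3                   # assumed energy of a standard internet search (Wh)
AI_MULTIPLIER = 100               # upper bound: an AI prompt uses ~100x a search
QUERIES_PER_DAY = 1_000_000_000   # assumed one billion prompts per day

prompt_wh = SEARCH_WH * AI_MULTIPLIER                     # energy per AI prompt (Wh)
daily_inference_mwh = prompt_wh * QUERIES_PER_DAY / 1e6   # Wh -> MWh

print(f"One AI prompt: {prompt_wh} Wh")
print(f"Daily inference: {daily_inference_mwh:,.0f} MWh")
print(f"That is ~{daily_inference_mwh / TRAINING_RUN_MWH:.0f} full training runs, every day")
```

Under these assumptions, a single day of global inference burns roughly as much electricity as twenty-odd full training runs of a frontier-scale model, which is why inference, not just training, dominates the long-run energy picture.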


The hardware powering these systems has become increasingly power hungry. Traditional CPUs drew 150–200 watts per chip. By 2023, state-of-the-art AI GPUs drew 700 watts, and 2024's next-generation chips reached 1,200 watts, with the average power density per server rack expected to rise from 36 kW in 2023 to 50 kW by 2027. Long story short, it requires a lot of energy, and the trend shows no sign of slowing down.


Source: IEA (2025), AI to drive surging electricity demand in data centers.
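To put those rack densities in perspective, here is a rough sketch of what they imply for a whole facility. The rack count (1,000) and constant full-load operation are my own simplifying assumptions for illustration:

```python
# Rough annual energy for a hypothetical AI data center, using the
# rack power densities quoted above. The 1,000-rack count and 24/7
# full-load operation are simplifying assumptions, not real-site data.

HOURS_PER_YEAR = 8_760
racks = 1_000

for year, kw_per_rack in [(2023, 36), (2027, 50)]:
    site_mw = racks * kw_per_rack / 1_000           # total site power in MW
    annual_gwh = site_mw * HOURS_PER_YEAR / 1_000   # MWh -> GWh
    print(f"{year}: {site_mw:.0f} MW site load, ~{annual_gwh:,.0f} GWh/year")
```

Even this modest hypothetical site jumps from roughly 315 GWh to 438 GWh per year just from denser racks, before adding any new buildings at all.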


Ok, so training AI requires vast amounts of energy so that inference can answer all your queries; you get it at this point. Inference (asking your queries) can happen pretty much anywhere you have a device, but you might be wondering where all the training that makes it possible actually takes place. That's a smooth link to the concept of data centers.


What are data centers?


Data centers are the physical spaces where the computers behind digital services are kept, along with the power, cooling, and networking they need to operate reliably. Because AI models require enormous computing power to train and run, data centers are the infrastructure that quietly supports the whole system.


So, basically, data centers are where these models are trained so that AI can use inference to answer my useless 3 a.m. thoughts. This must be such a wonderful infrastructure, right? But for the planet? Not so much. Globally, data centers currently consume approximately 415 TWh of electricity annually, around 1.5% of global electricity consumption. Deloitte projects that this could double to 1,065 TWh by 2030 if efficiency improvements keep pace, and rise above 1,300 TWh if they do not.


Source: Deloitte, Generative AI’s power demand could double data center consumption by 2030.
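Those TWh figures are easier to grasp as shares of global electricity use. The sketch below backs out the implied global total from the 1.5% figure above; holding that total fixed at today's level is a simplifying assumption (real demand will also grow), so treat the 2030 shares as rough upper bounds:

```python
# What the projected growth means as a share of *today's* global
# electricity use. Holding the global total fixed is a simplifying
# assumption; real global demand will also grow by 2030.

current_dc_twh = 415
global_twh = current_dc_twh / 0.015   # implied global total (~27,667 TWh)

scenarios = [
    ("today", 415),
    ("2030, efficiency keeps pace", 1_065),
    ("2030, efficiency lags", 1_300),
]
for label, twh in scenarios:
    print(f"{label}: {twh:,} TWh = {twh / global_twh:.1%} of today's global use")
```

In other words, data centers would climb from roughly 1.5% to somewhere between about 3.8% and 4.7% of today's global electricity consumption.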


So, we understand that data centers are using up plenty of energy.


What's so costly after all?


  • Straining electricity grids: Northern Virginia, the world's largest data center market, anticipates power demand growing 85% over 15 years, with data center demand quadrupling. Because so much energy is being diverted to data centers, basic scarcity pushes up the price of electricity, and household electricity costs are projected to rise 8% by 2030.

  • Carbon emissions: Despite net zero pledges, AI infrastructure expansion is increasing near-term emissions. Google, Meta, and Microsoft all reported significant emission spikes directly tied to data center growth. Coal and natural gas (non-renewable energy = main contributor to pollution) together still meet over 40% of new data center energy demand through 2030, according to the IEA.

  • Water consumption: AI data centers are extraordinarily water intensive. A single hyperscale facility using evaporative cooling can require over 50 million gallons of water annually. Much of this water evaporates and cannot be returned to its source, and total freshwater demand from AI data centers could reach 1.7 trillion gallons by 2027.


Well, some governments are responding to this. Ireland, Amsterdam, and Singapore have all moved to restrict or regulate new data center construction. These are early signals of a regulatory environment that will only tighten as demand grows.


So, what solutions have been tried, which have worked or failed, and what are the key takeaways?


Renewable energy PPAs (Power Purchase Agreements): the tech sector accounts for 68% of contracted corporate clean energy capacity globally, bankrolling wind and solar projects that would otherwise not be built. This is a plausible solution, but renewables alone cannot guarantee the 24/7 reliability that AI data centers require. Other approaches include:


  1. More efficient chips: Nvidia’s Blackwell GPU trains the same AI model using 73% less energy than its predecessor. Hardware efficiency gains are among the most impactful near-term levers, although the Jevons paradox warns that efficiency gains tend to be outpaced by rising demand.

  2. Liquid cooling: Traditional air-based cooling in data centers consumes up to 40% of a data center's electricity. Liquid cooling can reduce this by up to 90%. European data centers are already recycling waste heat to warm nearby buildings.

  3. Small modular reactors (SMRs): Microsoft, Google, and Amazon have signed SMR agreements, betting on compact nuclear power as a 24/7 carbon-free power source. The IEA projects SMRs will enter the data center energy mix post-2030. Promising, but not a near-term fix.

  4. Smarter AI models: Do we really need trillion-parameter models for every application? Think about it; smaller, targeted models and on-device AI processing can deliver substantial value at a fraction of the energy cost. The IEA estimates that scaling AI-led efficiency tools could deliver 300 TWh in global electricity savings, equivalent to Australia’s entire annual electricity output.


Global efforts are clearly underway; but how could I contribute as a student?


The renewable energy space provides an exciting opportunity: the World Economic Forum projects 170 million new jobs will emerge by 2030, with green technology and AI among the fastest-growing sectors. PwC’s 2025 AI Jobs Barometer finds that workers in AI-exposed roles earn 56% more on average than non-AI peers, with their skill sets evolving 68% faster.


Some of the key roles students can look into are:


  • Sustainable AI infrastructure — designing energy-efficient data centers and cooling systems for the hyperscale industry

  • Climate data science — using machine learning to model environmental risks and optimise renewable energy production

  • Energy systems engineering — applying AI to smart grid management and the integration of variable renewables

  • ESG and sustainability strategy — bridging technical sustainability challenges and corporate strategy

  • Greentech entrepreneurship — building the next generation of solutions in liquid cooling, AI-optimized grids, and sustainable chip design


We don’t have to wait until graduation to work towards sustainability. Personally, I’m taking foundational AI literacy and green tech courses on Coursera. By combining technical and environmental disciplines, we can follow where the capital is flowing in green tech.


The sustainable AI infrastructure space is an opportunity of a lifetime to solve challenges that are most prominent for our planet while also being paid handsomely.


The economic moment we are in resembles the emergence of social media: a chance for young entrepreneurs who want to protect future generations to build student-focused ventures and attract serious early-stage investment.


The energy crisis demanding the world’s attention is also an open invitation. So, what do you plan to do about it?


Bibliography


Baraniuk, C. (2024) 'Electricity grids creak as AI demands soar', BBC, May 2024.


Brousard, N. (2024) 'Examining the impact of chip power reduction on data center economics', Semiconductor Engineering, March 2024.


Carbon Brief (2025) 'AI: Five charts that put data-centre energy use and emissions into context'.


de Vries, A. (2023) 'The growing energy footprint of artificial intelligence', Joule, 7(10), pp. 2191–2194.


Deloitte (2024) 'As generative AI asks for more power, data centers seek more reliable, cleaner energy solutions'.


Hao, K. (2024) 'AI is taking water from the desert', The Atlantic, March 2024.


IEA (2025) 'AI is set to drive surging electricity demand from data centers'.


Jones Lang LaSalle (2024) 'Data Centers 2024 Global Outlook', January 2024.


PwC (2025) AI Jobs Barometer 2025.


Shah, A. (2024) 'Generative AI to account for 1.5% of world's power consumption by 2029', HPCwire, July 2024.


S&P Global (2025) 'Global data center power demand to double by 2030 on AI surge'.


Studer, M. (2023) 'The energy challenge of powering AI chips', Robeco, November 2023.


World Economic Forum (2025) Future of Jobs Report 2025.


Birol, F. (2025) cited in IEA, Energy and AI, April 2025.
