AI's Energy Demands: A Double-Edged Sword

The rise of generative AI is driving energy demand, raising concerns about environmental impact and sustainability in the tech industry.

Jesse Anglen
July 19, 2024


Right now, generative artificial intelligence is impossible to ignore online. An AI-generated summary may randomly appear at the top of the results whenever you do a Google search. Or you might be prompted to try Meta’s AI tool while browsing Facebook. This rush to add AI to as many online interactions as possible can be traced back to OpenAI’s boundary-pushing release of ChatGPT late in 2022. Silicon Valley soon became obsessed with generative AI, and nearly two years later, AI tools powered by large language models permeate the online user experience.


One unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are much more resource intensive. This has led to the arrival of the internet’s hyper-consumption era, a period defined by the spread of a new kind of computing that demands excessive amounts of electricity and water to build as well as operate. “In the back end, these algorithms that need to be running for any generative AI model are fundamentally very, very different from the traditional kind of Google Search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors.” In comparison, Moazeni estimates generative AI applications are around 100 to 1,000 times more computationally intensive.


The technology’s energy needs for training and deployment are no longer generative AI’s dirty little secret, as expert after expert last year predicted surges in energy demand at data centers where companies work on AI applications. Almost as if on cue, Google recently stopped considering itself carbon neutral, and Microsoft may trample its sustainability goals underfoot in the ongoing race to build the biggest, best AI tools. “The carbon footprint and the energy consumption will be linear to the amount of computation you do, because basically these data centers are being powered proportional to the amount of computation they do,” says Junchen Jiang, a networked systems researcher at the University of Chicago. The bigger the AI model, the more computation is often required, and these frontier models are getting absolutely gigantic.


Even though Google’s total energy consumption doubled from 2019 to 2023, Corina Standiford, a spokesperson for the company, said it would not be fair to state that Google’s energy consumption spiked during the AI race. “Reducing emissions from our suppliers is extremely challenging, which makes up 75 percent of our footprint,” she says in an email. The suppliers that Google blames include the manufacturers of servers, networking equipment, and other technical infrastructure for the data centers—an energy-intensive process that is required to create physical parts for frontier AI models.


Despite an upward trend in energy needs at data centers, they still account for a small percentage of the energy humans use overall. Fengqi You, an energy systems engineering researcher at Cornell, cites oil refineries, buildings, and transportation as more impactful at the present moment. “Those sectors use much more energy compared to AI data centers right now,” he says. That said, AI’s energy footprint could continue to grow in the near future, as generative AI tools are integrated into more corners of the internet and adopted by more users online.


Amongst the many energy-hungry technologies supporting modern society, artificial intelligence (AI) is emerging as a major driver of energy demand. Data centers—the physical infrastructure enabling AI—are becoming larger, multiplying, and consuming more energy. Environmental organizations such as Greenpeace are concerned that this will jeopardize decarbonization efforts and halt progress in the fight against climate change. AI can track melting icebergs or map deforestation, all the while consuming excessive amounts of carbon-intensive energy. But a closer look at the data shows that fears of AI’s insatiable appetite for energy may be unwarranted.


If we take reports at face value, we might conclude that AI-induced climate stress is all but inevitable. Niklas Sundberg, a board member of the nonprofit SustainableIT.org, claims that a single query on ChatGPT generates 100 times the carbon of a Google search. The International Energy Agency predicts that global energy demand from data centers, cryptocurrency, and AI will double by 2026. Even the U.S. government believes that AI will exert a major influence on society: the Department of Homeland Security announced the first 10 hires for a newly formed AI Corps, which will advise on how best to use AI within the federal government. The Department of Energy has created a working group on the energy needs of data center infrastructure and is talking to utilities about how to meet energy demand. AI’s energy demands, according to the Bloomberg Energy Daily, are “a source of trepidation.”


Climate activists have raised the alarm. Greenpeace is calling for an official emissions tracking system to quantify AI’s environmental impacts. Climate researcher Sasha Luccioni proposes that governments restrict AI’s energy use, including through AI “sobriety” measures or a carbon tax to deter electricity consumption. Vox warns that the benefits of the modern world, while substantial, come with trade-offs, and that none of these trade-offs is as important as energy: “As the world heats up toward increasingly dangerous temperatures, we need to conserve as much energy as possible to lower the amount of climate-heating gasses we put into the air.” AI’s energy consumption has become yet another way to push back against a high-energy planet.


But a closer look reveals a more complex relationship between AI use and energy demand, energy efficiency, and decarbonization that isn’t all bad news. First, there is the question of whether businesses are using AI at all. Using data from the U.S. Census Bureau, Guy Berger at the Burning Glass Institute shows that the two most common applications of AI are marketing automation (2.5% of U.S. businesses) and virtual agents/chatbots (1.9%). Only 1% of businesses have used large language models. Berger’s analysis also shows that labor-saving uses of AI are still quite small, with 1 out of every 4 businesses using AI to perform a few tasks that were previously carried out by humans. And the largest businesses are the most likely to say that they are using AI, but also the most likely to say that they don’t know whether they are using AI. Of course, the use of AI will increase in the future, but for now it seems mostly confined to a few sectors and activities.


But beyond the eye-catching statistics, estimates of energy consumption are difficult to find, in part because industry data are heavily guarded and researchers have to rely on overly simplistic extrapolations. The Goldman Sachs Group estimates that AI power demand in the UK will rise 500% over the next decade. U.S. data centers could account for 8% of total electricity needs by 2030, up from 3% in 2022. States that house data centers appear to be running out of power. The Boston Consulting Group comes up with similar numbers—electricity consumption by data centers is projected to reach 7.5% of total US electricity consumption by 2030, up from about 2%. Generative AI is expected to contribute at least 1% to this growth. Rystad Energy says that data centers and AI energy use will increase by 177 TWh, reaching 307 TWh by 2030.
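These projections can be sanity-checked with some back-of-envelope arithmetic. The figure of roughly 4,000 TWh for total annual US electricity consumption is an assumption of this sketch, not a number from the sources above:

```python
# Rough consistency check of the data center projections cited above.
# Assumption (not from the article): total US electricity consumption
# is roughly 4,000 TWh per year.
US_TOTAL_TWH = 4_000

# BCG: data centers go from ~2% to 7.5% of US consumption by 2030.
bcg_2022 = 0.02 * US_TOTAL_TWH
bcg_2030 = 0.075 * US_TOTAL_TWH

# Rystad: +177 TWh, reaching 307 TWh by 2030, implies this baseline.
rystad_baseline = 307 - 177

print(f"BCG implies ~{bcg_2022:.0f} TWh now and ~{bcg_2030:.0f} TWh by 2030")
print(f"Rystad's figures imply a ~{rystad_baseline} TWh baseline today")
```

The two implied baselines (roughly 80 TWh versus 130 TWh) disagree, which itself illustrates the point that these extrapolations rest on heavily guarded, inconsistent data.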


In a detailed thread on X, MIT Innovation Fellow and former National Economic Council director Brian Deese argues that forecasters consistently overestimate electricity demand, in part because they emphasize static load growth over efficiency gains. Deese points out that in the early 2000s, analysts predicted surging electricity demand; instead, U.S. electricity demand has stayed flat for two decades. And although data center energy use is increasing, energy intensity (energy use per computation) has decreased by 20 percent every year since 2010. Nvidia, one of the largest designers of graphics processing units (GPUs) for gaming, professional visualization, data centers, and automotive markets, is continuously improving the energy efficiency of its GPUs: its new AI-training chip, Blackwell, for example, is claimed to use 25 times less energy than its predecessor, Hopper. Deese also notes that analysts may be double-counting energy use by data centers, because technology companies file power requests with multiple utilities in different jurisdictions to get the best rates.
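Deese's efficiency point compounds quickly. In the sketch below, only the 20 percent annual decline in energy intensity comes from the figures above; the computation growth rates are illustrative assumptions:

```python
# How a steady decline in energy intensity (energy per computation)
# offsets growth in total computation. The 20%/year decline is cited
# above; the compute growth rates are made-up illustrations.
INTENSITY_DECLINE = 0.20

def net_energy_factor(compute_growth: float, years: int = 10) -> float:
    """Multiplier on total energy use after `years`, combining compute
    growth with the annual intensity decline."""
    return ((1 + compute_growth) * (1 - INTENSITY_DECLINE)) ** years

# 25%/year compute growth is exactly cancelled out: 1.25 * 0.8 = 1.0.
flat = net_energy_factor(0.25)
# Even 50%/year compute growth nets out to ~20%/year energy growth.
fast = net_energy_factor(0.50)
print(flat, fast)
```

Under these assumptions, a decade of 50 percent annual compute growth multiplies energy use by about 6x rather than the 58x a static forecast would suggest, which is why load-growth projections that ignore efficiency tend to overshoot.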


A (carbon-heavy) query to ChatGPT suggests AI and data service providers have considerable room to improve the energy efficiency of data center infrastructure through several measures.

Virtualization and consolidation: virtualization technology can consolidate workloads and reduce the number of physical machines running, yielding significant energy savings through higher server utilization rates.

Efficient cooling systems: cooling accounts for a substantial portion of a data center's energy consumption. Techniques such as hot/cold aisle containment, free cooling when ambient temperatures allow, and modern approaches like liquid cooling can reduce energy usage.

Energy-efficient hardware: energy-efficient servers, storage devices, and networking equipment can be prioritized, along with products carrying high energy efficiency ratings (such as ENERGY STAR certified devices) and configurations optimized for lower power consumption.
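A toy model shows how the first two of those measures interact. PUE (power usage effectiveness) is the standard ratio of total facility energy to IT equipment energy; all server counts and PUE values below are illustrative assumptions, not measurements:

```python
# Toy model: server consolidation cuts IT load; better cooling cuts
# PUE (total facility energy / IT equipment energy). All numbers are
# illustrative assumptions.

def facility_energy_kwh(servers: int, kwh_per_server: float, pue: float) -> float:
    """Total facility energy, including cooling and power overheads."""
    return servers * kwh_per_server * pue

HOURS_PER_YEAR = 8_760  # a 1 kW server running all year uses 8,760 kWh

before = facility_energy_kwh(1_000, HOURS_PER_YEAR, pue=1.6)
# Virtualization consolidates 1,000 under-utilized servers onto 400;
# aisle containment and free cooling bring PUE from 1.6 down to 1.2.
after = facility_energy_kwh(400, HOURS_PER_YEAR, pue=1.2)

print(f"Energy saved: {1 - after / before:.0%}")
```

With these numbers the two measures together cut facility energy by 70 percent, noticeably more than consolidation alone (60 percent) or cooling improvements alone (25 percent).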


The artificial intelligence boom has driven Big Tech share prices to fresh highs, but at the cost of the sector’s climate aspirations. Google admitted on Tuesday that the technology is threatening its environmental targets after revealing that data centers, a key piece of AI infrastructure, had helped increase its greenhouse gas emissions by 48 percent since 2019. It said “significant uncertainty” around reaching its target of net zero emissions by 2030—reducing the overall amount of CO2 emissions it is responsible for to zero—included “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict”.


It follows Microsoft, the biggest financial backer of ChatGPT developer OpenAI, admitting that its 2030 net zero “moonshot” might not succeed owing to its AI strategy. So will tech be able to bring down AI’s environmental cost, or will the industry plough on regardless because the prize of supremacy is so great? Data centers are a core component of training and operating AI models such as Google’s Gemini or OpenAI’s GPT-4. They contain the sophisticated computing equipment, or servers, that crunch through the vast reams of data underpinning AI systems. They require large amounts of electricity to run, which generates CO2 depending on the energy source, as well as creating “embedded” CO2 from the cost of manufacturing and transporting the necessary equipment. According to the International Energy Agency, total electricity consumption from data centers could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan, while research firm SemiAnalysis calculates that AI will result in data centers using 4.5 percent of global energy generation by 2030. Water usage is significant too, with one study estimating that AI could account for up to 6.6 billion cubic metres of water use by 2027—nearly two-thirds of England’s annual consumption.
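Two of the comparisons in that paragraph imply quantities worth making explicit; the arithmetic below uses only the figures already cited:

```python
# Unpacking the IEA and water-use comparisons cited above.

# IEA: ~1,000 TWh by 2026, "double from 2022 levels", implies:
iea_2026_twh = 1_000
iea_2022_twh = iea_2026_twh / 2  # ~500 TWh in 2022

# AI water use of 6.6 billion cubic metres by 2027 being "nearly
# two-thirds of England's annual consumption" implies England uses:
ai_water_2027_m3 = 6.6e9
england_annual_m3 = ai_water_2027_m3 / (2 / 3)  # ~9.9 billion m3

print(f"Implied 2022 data center demand: ~{iea_2022_twh:.0f} TWh")
print(f"Implied English annual water use: ~{england_annual_m3:.1e} m3")
```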


A recent UK government-backed report on AI safety said that the carbon intensity of the energy source used by tech firms is “a key variable” in working out the environmental cost of the technology. It adds, however, that a “significant portion” of AI model training still relies on fossil fuel-powered energy. Indeed, tech firms are hoovering up renewable energy contracts in an attempt to meet their environmental goals. Amazon, for instance, is the world’s largest corporate purchaser of renewable energy. Some experts argue, though, that this pushes other energy users into fossil fuels because there is not enough clean energy to go round. “Energy consumption is not just growing, but Google is also struggling to meet this increased demand from sustainable energy sources,” says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies.

