Generative AI is starting to make a real mark on our relationship with technology, from generating content to scheduling meetings to summarizing simple web searches. More and more people are experimenting with it or fully adopting it to support workflows or just for amusement, seeing it as a harmless shortcut or diversion. But is it harmless? In terms of resource use, the answer is arguably no.
Every act of computing requires electricity to power the end user device, the data transmission to and from servers and data centres, and the data centres that do the actual computation. That means every AI search, summary and text generation produces carbon emissions. While the International Energy Agency estimates that a standard Google search uses about 0.3 watt-hours (Wh) of electricity, a ChatGPT query uses about 2.9 Wh — almost 10 times as much. That same query also generates about 4.3 g of carbon dioxide equivalent, which may not sound like much, but it adds up fast. One million ChatGPT queries generate as much carbon as driving 17,700 km in your car or charging 349,258 smartphones. And when you consider that (a) ChatGPT alone handles more than a billion queries in a single day and (b) it’s only one of dozens of GenAI tools out there, the scale of the impact becomes staggering.
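For readers who like to see the numbers worked out, the figures above can be sanity-checked with a quick back-of-envelope calculation. The per-query constants come straight from the article; the car and smartphone intensities are simply derived from the article’s own comparisons, not independently sourced.

```python
# Back-of-envelope check of the per-query figures quoted above.
# Constants below are the article's numbers, not independent measurements.
GOOGLE_SEARCH_WH = 0.3       # IEA estimate: one standard Google search
CHATGPT_QUERY_WH = 2.9       # IEA estimate: one ChatGPT query
CHATGPT_QUERY_G_CO2E = 4.3   # grams of CO2-equivalent per ChatGPT query

QUERIES = 1_000_000

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH           # ~9.7x a Google search
total_tonnes = QUERIES * CHATGPT_QUERY_G_CO2E / 1e6   # grams -> metric tonnes

# Emission intensities implied by the article's two comparisons
car_g_per_km = total_tonnes * 1e6 / 17_700            # g CO2e per km driven
g_per_phone_charge = total_tonnes * 1e6 / 349_258     # g CO2e per full charge

print(f"{ratio:.1f}x a Google search; {total_tonnes:.1f} t CO2e per million "
      f"queries; implies ~{car_g_per_km:.0f} g/km driven and "
      f"~{g_per_phone_charge:.1f} g per phone charge")
```

The implied intensities (roughly 243 g CO2e per km driven and about 12 g per smartphone charge) are in line with commonly cited averages, which suggests the article’s comparisons are internally consistent.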
But that’s not even the whole story. The full environmental impact of GenAI is actually much larger, but it’s hard to measure for two main reasons. For one thing, the AI industry is largely unregulated, so there are no emissions standards companies are expected to meet and no requirement to report or disclose their energy-related data. At the same time, much of the industry’s impact comes from indirect effects beyond the queries themselves. Two of the largest contributors to those indirect impacts are data centres and model training.
Data centres have always been heavy energy users, but the rise of GenAI has seen their numbers increase rapidly in recent years, with no signs of slowing down. These data centres, like most digital infrastructure, are full of hardware whose components are usually made of critical minerals (like lithium, graphite, nickel, cobalt, copper and rare earth elements), which are often mined unsustainably and in ways that damage the communities where the mines are located. Data centres also use massive amounts of water to keep their equipment cool, which is especially problematic for those located in places that are already hot and dry, like Arizona. They consume huge amounts of electricity, which is often generated by fossil fuels. And they produce substantial amounts of electronic waste as equipment is upgraded and replaced.
Training AI models is also incredibly energy-intensive. The process involves feeding data into a model and then developing, testing and optimizing algorithms using that data. GenAI models already involve billions of parameters, and as they grow in size and complexity with each new iteration, more and more energy will be needed to train them. The development cycle for new and improved GenAI applications is also getting shorter all the time, so the energy demand for model training is effectively continuous and ever-growing. On top of that, the training process doesn’t use energy at a consistent level; it involves rapid fluctuations in demand, which often require power grid operators to use supplemental generators, often fuelled by diesel, to absorb the spikes and keep the grid stable.
Despite these drawbacks, GenAI likely isn’t going anywhere anytime soon. If anything, it will continue to expand and produce corresponding growth in its impacts. By the end of 2026, GenAI-related electricity consumption could exceed that of a small country like Belgium, while its water consumption is projected to surpass that of Denmark by the end of 2027. But it is possible to mitigate some of those impacts. Some data centres are already making strides toward carbon neutrality — for example, in 2022, Google reached 64% carbon-free energy globally on an hourly basis, with its data centre in Finland operating on 97% carbon-free energy. The company is also working with Kairos Power to build 500 megawatts of advanced nuclear energy in Tennessee to provide emissions-free power to some of its US data centres.
We can encourage more movement in that direction by pressuring companies to use more renewable energy sources, as well as by lobbying governments to incentivize or require environmentally protective measures in data centres.
An even easier way to reduce the environmental impacts of GenAI? Simply use it less and instead turn to real people like the team at Ascribe to take care of your content writing needs. Contact us today to see how we can deliver original, compelling content — without the environmental damage.