
The Hidden Environmental Costs of AI

Last weekend I sat down with my kids and watched Pixar’s "Wall-E".

For those of you who haven’t seen it… a lone little robot toils endlessly to clean up the remnants of a future Earth ravaged by pollution and waste.

I’d describe it as a “cute-yet-subtly-dystopian” cartoon, with a cautionary tale for all of us.  

Watching the movie made me think about some reading I’ve been doing lately about how big tech is training and using AI (specifically Large Language Models, or LLMs) and the effects these LLMs are having on our natural resources.

Specifically: training and using LLMs require lots (like, really lots) of electricity to run the data centres, and also water to keep them cool.

It’s not something that’s being talked about much in the media, but I think that as part of a responsible, well-rounded discussion about generative AI, we should be honest that our pursuit of AI advancement does come with a steep environmental cost.

Inside the belly of the beast

I recently watched a YouTube tour of the world’s largest data centre: cavernous halls filled with massive banks of servers.

The immense energy consumption required to build and deploy these increasingly powerful data centres, which run products like YouTube and modern LLMs, is mind-blowing.

On the one hand, watching it is super impressive; on the other, I couldn't shake the feeling that by building and running these data centres, we're only adding to the environmental crisis we’re already pretty deeply entrenched in.

[Image: "Dammit Wayne, you've been watering the AI too much again"]

AI’s carbon footprint

Creating and training a single large AI model like GPT-3.5 (the model powering the current free version of ChatGPT) can generate data centre emissions equivalent to driving 123 cars for a year, and that's before it even sees the light of day as a usable product.

As LLM chatbots and image generators become ubiquitous over the next generations, weaving themselves into the fabric of our lives, the energy costs associated with running these AIs could skyrocket. 

I wonder whether we are prepared to confront the possible environmental consequences of what is starting to look like an insatiable appetite for AI innovation.

We’re certainly not talking about it openly enough. I'm sure about that.

The complexity of AI's environmental impact

The carbon footprint of AI development isn’t an easy one to summarise. It’s a complex and multifaceted set of issues, and I’ll admit the deep nuances of it are beyond me. 

However, there is some great research being released, and the best research tends to use some good analogies that the layperson can understand…

For example, a 2019 study by researchers at the University of Massachusetts Amherst revealed that training a single large AI model can emit nearly 626,000 pounds of carbon dioxide, the equivalent of five average American cars' lifetime emissions.
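
If you want to sanity-check that comparison yourself, here's a rough back-of-the-envelope sketch in Python. The per-car figure is my own assumption (roughly 126,000 lbs of CO2e over a car's lifetime, including fuel and manufacture), not a number taken from the article above:

```python
# Back-of-the-envelope check of the "five cars" comparison.
# The per-car lifetime figure (~126,000 lbs CO2e, including fuel and
# manufacture) is an assumed value, in line with commonly cited estimates.

TRAINING_EMISSIONS_LBS = 626_000   # reported emissions for training one large model
CAR_LIFETIME_LBS = 126_000         # assumed lifetime CO2e of an average American car

car_equivalents = TRAINING_EMISSIONS_LBS / CAR_LIFETIME_LBS
print(f"Training one model is roughly {car_equivalents:.1f} car-lifetimes of CO2e")
# -> Training one model is roughly 5.0 car-lifetimes of CO2e
```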

Given that dozens, if not hundreds, of LLMs are being trained around the world right now, the effect we’re starting to have on our natural resources in our drive to build advanced AI models is not insignificant.

As AI becomes more sophisticated and widespread, the cumulative impact of numerous models deployed by thousands of companies could be very serious for our planet. 

The energy and water required to power data centres, run computations, and cool servers all contribute to the growing carbon footprint of AI. 

I think we have to ask ourselves: is this a legacy we want to leave as we develop AI?


Glimmers of hope for efficient AI

I’m happy to say that amidst the bleak predictions, there is some hope. 

Researchers and industry leaders are busy exploring ways to mitigate the environmental impact of generative AI. 

It appears (although it’s early days) that by optimising AI model architectures and processors, we can make AI systems more efficient, reducing the energy and water required for training and inference (i.e. actually using the AI).

Similarly, the development of “greener” data centres offers another promising path forward. Tech giants like Google and Microsoft are investing heavily in renewable energy sources to power their AI infrastructure.

By embracing solar, wind, and hydroelectric power, we could begin to chip away at the carbon emissions associated with AI development. 

Scheduling AI computation during times of abundant green energy is another piece of the puzzle. By aligning energy-intensive tasks with periods when renewable energy sources are at their peak, the tech giants could further reduce the environmental impact of AI. It's a small step, but every little bit counts.
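
To make the scheduling idea a little more concrete, here's a minimal Python sketch of what carbon-aware scheduling could look like: pick the greenest contiguous window from a forecast of grid carbon intensity. The forecast numbers are invented for illustration; real schedulers pull this data from grid operators or carbon-intensity APIs.

```python
# A toy "carbon-aware" scheduler: given a (hypothetical) 24-hour forecast of
# grid carbon intensity (gCO2/kWh), pick the greenest window for a long job.
# The forecast values below are made up for illustration.

forecast = [420, 410, 400, 390, 380, 350, 300, 250,   # overnight
            180, 150, 130, 120, 125, 140, 170, 220,   # midday solar peak
            300, 360, 410, 430, 440, 435, 430, 425]   # evening peak

JOB_HOURS = 4  # how long the energy-intensive job needs to run

# Choose the contiguous window with the lowest total carbon intensity.
best_start = min(
    range(len(forecast) - JOB_HOURS + 1),
    key=lambda start: sum(forecast[start:start + JOB_HOURS]),
)

print(f"Greenest {JOB_HOURS}-hour window starts at hour {best_start}:00")
# -> Greenest 4-hour window starts at hour 10:00 (midday, in this toy forecast)
```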

But is it enough? (And, perhaps slightly cynically, I would also ask whether these big tech firms are genuinely interested in finding environmentally friendly solutions, or whether they’re just greenwashing.)

The falling costs of AI: More good news

Investment firm ARK, known for its bold predictions, estimates that AI training costs are plummeting at a staggering rate of 75% per year.

If those projections hold true, a foundation LLM that costs $100M to train today might cost only $25M next year.

Additionally, inference costs for enterprise-scale use cases are falling even faster, at an annual rate of around 86%.
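
For what it's worth, here's what that compounding decline looks like if you project it forward a few years. This is a quick sketch based only on the figures quoted above, not on any real cost data:

```python
# Projecting ARK's claimed cost declines forward, purely to illustrate compounding.
# Starting cost and decline rate are the figures quoted above; the rest is arithmetic.

training_cost = 100_000_000   # ~$100M to train a foundation model today
TRAINING_DECLINE = 0.75       # training costs assumed to fall ~75% per year

for year in range(1, 4):
    training_cost *= (1 - TRAINING_DECLINE)
    print(f"Year {year}: ~${training_cost / 1e6:.1f}M to train the same class of model")

# -> Year 1: ~$25.0M, Year 2: ~$6.2M, Year 3: ~$1.6M
# Inference costs, falling at ~86% per year, would shrink even faster.
```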

These falling costs could have profound (positive) implications for the amount of natural resources consumed by these models, provided companies and their leaders are educated about the AI choices available to them and the environmental impacts of those choices.

Prioritising sustainability in AI development

As the demand for generative AI grows, I think collectively we have to prioritise efficiency and sustainability in its development and deployment. 

By encouraging big tech and research labs to publish the carbon footprints of their AI models, we could gain better transparency and accountability around the AI models at our disposal.

By giving consumers the information they need to make informed decisions, we could create market demand for "greener" chatbots and AI services.

In the end, as I continually say, it all starts with good education for the general public about AI.

With increased AI literacy, we all gain the awareness necessary to make informed AI choices.

The role of governments, regulators and individuals

Of course, governments and regulatory bodies have a critical role to play in promoting sustainable AI practices.

By establishing guidelines and incentives for energy-efficient AI development, they could encourage innovation while minimising the environmental impact. 

Collaborations between industry, academia, and policymakers could foster the development of best practices and standards for sustainable AI. 

But as individuals, we also have agency and responsibility to shape the future of AI. 

By being mindful of the AI services we use and supporting companies that prioritise sustainability, we can send a message that responsible innovation matters.

Collectively we can advocate for greater transparency and accountability in the AI industry, pushing for the disclosure of carbon footprints and the adoption of greener practices. 

The dual nature of generative AI

The rise of generative AI is both exhilarating and terrifying.

In my public talks and client work, I often say that I spend most days straddling a line between amazement and anxiety at the power of AI, and the pace it’s developing at.

And, just like the “cute-yet-subtly-dystopian” story of "Wall-E," I think we have to be honest with ourselves that as we build increasingly powerful (and useful) AI models, a disregard for sustainability could have serious consequences for us all.

What’s the path forward?

The path forward is (reasonably) clear, but I wouldn’t say it’s easy. 

By being honest and spreading the word about the energy consumption of generative AI, we could work towards a future where the benefits of this technology can be realised without compromising our planet's well-being. 

Through a combination of technical innovations, policy initiatives, and individual actions, we could ensure that the rise of generative AI is accompanied by an unwavering commitment to sustainability.

If we so choose.

The stakes are pretty high. 

At the risk of sounding grandiose; the AI choices we make today will have echoes for generations to come. 

Got something to add? Chime in below...