“AI is not magic. AI is machines. And machines run on energy”

AI’s sustainability challenge is anything but theoretical, says Professor Bjorn Cumps, expert in Financial Services Innovation & FinTech at Vlerick Business School. “We’ve all embraced the idea that artificial intelligence offers smarter solutions, but all too often we remain blind to its hidden cost: a skyrocketing energy footprint. It’s time we started using these tools more consciously,” he says. As AI advances, so does its appetite for energy. According to Cumps, the only way to keep that growth sustainable is through more conscious usage, greener infrastructure, and smarter regulation.
Should consumers be concerned about the energy behind AI?
Bjorn Cumps: “Certainly, because many end users consider artificial intelligence and cloud computing to be intangible. To be magic. But it’s not magic; it’s machines. And machines require energy. Every time you ask ChatGPT a question or generate an image, you’re triggering a huge amount of computing power behind the scenes. Especially with generative AI, you are not simply retrieving existing data, like with a Google search – you’re creating something new, which requires millions or even billions of calculations per prompt. Those calculations demand servers, cooling systems, and constant power.”
How steep is AI’s energy curve?
“The numbers are sobering. Every 100 days, the computing power required for AI doubles. Nvidia’s CEO recently predicted a 100-fold increase in demand over the next few years. And Google, which aims to be net zero by 2030, has seen its emissions rise by around 50% since 2019, mainly due to AI. If you zoom out, this is not sustainable – at least, not at the pace we’re moving now.”
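For a sense of scale, a doubling every 100 days compounds dramatically. Here is a minimal sketch of the implied growth; the 100-day doubling period is the figure quoted above, and the rest is simple arithmetic:

```python
import math

# Implied growth if AI compute demand doubles every 100 days.
DOUBLING_DAYS = 100

growth_per_year = 2 ** (365 / DOUBLING_DAYS)    # ~12.6x after one year
days_to_100x = DOUBLING_DAYS * math.log2(100)   # ~664 days to a 100x increase

print(f"Growth after one year: ~{growth_per_year:.1f}x")
print(f"Time to a 100-fold increase: ~{days_to_100x:.0f} days "
      f"(~{days_to_100x / 365:.1f} years)")
```

At that rate, the 100-fold increase Nvidia’s CEO predicts would arrive in well under two years, which is exactly why Cumps calls the current pace unsustainable.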
How does AI’s energy footprint compare to familiar digital activities like streaming or searching?
“It’s tricky, because we often compare apples to oranges. If you watch a 4K video, yes, that’s energy-intensive. But generating that 4K video with AI in the first place costs far more. Similarly, asking ChatGPT a question consumes 10 to 15 times more energy than a basic Google search. Multiply that by billions of prompts a day, and it adds up very quickly.”
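To make that multiplication concrete, here is a hedged back-of-envelope sketch. The ~0.3 Wh per Google search is a figure Google itself published in 2009, the 10–15× multiplier is from the interview, and the one billion prompts per day is an assumed round number, not a measured one:

```python
# Back-of-envelope: daily energy for AI prompts vs. classic web search.
# Assumed: ~0.3 Wh per Google search (Google's own 2009 figure); the 10-15x
# multiplier is from the interview; 1 billion prompts/day is a round guess.

GOOGLE_SEARCH_WH = 0.3            # watt-hours per classic search (assumed)
PROMPTS_PER_DAY = 1_000_000_000   # "billions of prompts a day" -- take 1 billion

for multiplier in (10, 15):
    wh_per_prompt = GOOGLE_SEARCH_WH * multiplier
    gwh_per_day = wh_per_prompt * PROMPTS_PER_DAY / 1e9   # Wh -> GWh
    print(f"{multiplier}x a search: {wh_per_prompt:.1f} Wh per prompt, "
          f"~{gwh_per_day:.1f} GWh per day")
```

Under those assumptions, a billion daily prompts land in the 3 to 4.5 GWh range, roughly the output of a 125–190 MW power plant running around the clock.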
Should we rethink how we use AI in everyday contexts, especially for non-essential tasks?
“Let’s say: we should think twice, without feeling guilty. If you’re generating images or videos ‘just for fun’, you’re using vast computing resources for low-value activities. That’s like leaving all your lights on when you go to bed. We need to apply the same logic to digital tools. Banning usage is not what we should be doing; being conscious is. So ask yourself questions like: ‘Do I really need to generate this?’, ‘Could I use a smaller model?’, ‘Is this the best use of energy?’ and so on.”
But don’t most people lack the technical knowledge to make that judgement?
“True, and that’s where education and design come in. Companies can do a lot more to inform users: pop-ups suggesting smaller models, basic eco-labels, even AI ‘budgets’ that show your consumption. These are forms of digital nudging, and they work. People won’t change unless they understand the impact of their behaviour and unless the tools guide them to better choices.”
Where do data centres fit into this picture?
“Data centres are on the front line. They host the models, provide the infrastructure, and determine how green, or not, AI truly is. Their energy sourcing, cooling systems, and transparency practices are crucial. In Belgium, we already see leaders like LCL going beyond compliance and investing heavily in sustainable practices. That sets an example for others. But we need systemic pressure – from regulators, from users, and from competition – to raise the bar across the board.”
What role should governments and regulators play in curbing the environmental impact of AI?
“They can accelerate transparency requirements. Right now, most users have no idea what kind of infrastructure their AI prompt activates, or whether it’s powered by green or grey energy. Europe is leading the way on regulation, but even here, we’re not moving fast enough.”
Can individual user behaviour actually influence broader industry practices?
“Absolutely. In the end, we vote with our clicks. If enough people start choosing services that are more energy-aware, companies will follow. Think of the rise of organic food or electric vehicles: it started small, but demand changed the system. The same could happen here. Avoid overconsumption. Choose the efficient model. Ask for transparency. Don’t use a bazooka to swat a fly.”
Is the growing reliance on AI deepening the digital divide between those who can keep up and those who can’t?
“That’s already happening. On one end, you have companies and workers using AI to become vastly more productive. On the other, people whose jobs are being automated but who lack the skills to switch roles. It’s a question of access, but also of education, upskilling and support. AI could widen social inequalities if we don’t take the appropriate action.”
Do you see signs that organisations or business schools are beginning to take digital sustainability seriously?
“Yes, we’re seeing a growing awareness in business schools, among students, and even in policy circles. At Vlerick Business School, we now offer an entire Executive MBA track focused on digital sustainability. We’re helping professionals navigate these complex issues – how to use AI to boost productivity without causing environmental damage, and how to make sustainability a core part of digital transformation.”
If there’s one principle you’d like people to adopt when it comes to AI, what would it be?
“Treat AI like electricity: incredibly useful when applied with purpose. Just as we don’t leave the lights or oven on unnecessarily, we shouldn’t overuse AI tools without reason. Use them where they truly add value and be mindful of when they don’t.”
Some revealing figures to help quantify the environmental impact of artificial intelligence:
Energy use
The following examples highlight the energy cost of inference, the actual use of AI models to generate responses. This does not include the much greater energy demands of training them.
- Asking a question to an AI model like ChatGPT can consume up to 10 times more energy than a typical Google search.
- Text-based tasks consume significantly less energy than generating images or videos.
- For 1,000 text inferences, the most efficient AI model uses the energy equivalent of just 9% of a full smartphone charge.
- In contrast, the least efficient image model consumes the equivalent of 50% of a full smartphone charge per generated image.
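To translate those smartphone comparisons into absolute numbers, here is a small sketch; the ~15 Wh battery capacity is an assumption (typical for recent phones), not a figure from the source:

```python
# Convert the smartphone-charge comparisons above into watt-hours.
# Assumed (not in the source): a full smartphone charge holds ~15 Wh.

PHONE_CHARGE_WH = 15.0

# Most efficient text model: 9% of a charge per 1,000 inferences.
text_wh_per_1000 = 0.09 * PHONE_CHARGE_WH   # ~1.35 Wh per 1,000 inferences
text_mwh_each = text_wh_per_1000            # Wh per 1,000 equals mWh per one

# Least efficient image model: 50% of a charge per generated image.
image_wh = 0.50 * PHONE_CHARGE_WH           # ~7.5 Wh per image

print(f"Text: ~{text_mwh_each:.2f} mWh per single inference")
print(f"Image: ~{image_wh:.1f} Wh per image, "
      f"~{image_wh * 1000 / text_wh_per_1000:,.0f}x a single text inference")
```

Under these assumptions, a single text inference costs on the order of a milliwatt-hour, while one image from the least efficient model costs thousands of times more – the gap the bullets above describe.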
Water use
Training large language models (LLMs) comes with a significant water footprint, mainly for cooling the powerful data centre infrastructure required.
- According to researchers at the University of California, Riverside, training GPT-3 consumed at least 700,000 litres of water, with some estimates reaching 3.5 million litres.
- In July 2022, the month before OpenAI says it completed training GPT-4, Microsoft’s data centres in Iowa reportedly used 43.5 million litres of water. That’s equivalent to roughly 17.5 Olympic swimming pools.
The scale of the increase is staggering, and it highlights how each new GPT generation comes with a rapidly growing environmental cost.
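As a quick sanity check on the pool comparison, using the standard minimum Olympic pool volume of 2,500 m³ (2.5 million litres):

```python
# Sanity check: express the reported water volumes in Olympic swimming pools.
# An Olympic pool holds at least 2,500 m^3, i.e. 2.5 million litres.

POOL_LITRES = 2_500_000

volumes = {
    "GPT-3 training (low estimate)": 700_000,
    "Iowa data centres, July 2022": 43_500_000,
}
for label, litres in volumes.items():
    print(f"{label}: {litres / POOL_LITRES:.1f} pools")
```

The 43.5 million litres works out to about 17.4 pools, matching the reported figure within rounding.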