5 Things AI Companies Won’t Tell You About Energy Consumption

First and foremost, some unit conversions:

3,600 joules = 1 Wh (watt-hour)

1,000 Wh = 1 kWh

1,000 kWh = 1 MWh

1,000 MWh = 1 GWh

If you believe the companies that own the models, a single question posed to an AI is the equivalent of turning a light on for 1 minute (0.34 Wh). So why do they each need a 1 GW data centre powered by their own power plants? If we take their 0.34 Wh value at face value, each company could serve roughly 3 billion users asking one question every hour, as the sketch below shows. What’s more, each company plans to scale its data centres up to 5 GW in the coming years. As a practitioner, I know that each of these companies struggles to serve all its users and is imposing ever-stricter limits on how many questions customers can ask. Further, their concurrent user numbers are in the hundreds of millions, not billions. Something doesn’t add up.
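
A quick back-of-the-envelope check (a sketch in Python; the only inputs are the companies’ own 0.34 Wh figure and a 1 GW facility, the rest is simple arithmetic):

```python
# Sanity check: if a single question really costs 0.34 Wh, how many
# questions can a 1 GW data centre answer per hour?
WH_PER_QUERY = 0.34            # the companies' own per-query figure
FACILITY_GW = 1.0              # a 1 GW facility

wh_available_per_hour = FACILITY_GW * 1e9   # 1 GW for 1 hour = 1e9 Wh
queries_per_hour = wh_available_per_hour / WH_PER_QUERY

print(f"{queries_per_hour:,.0f} queries per hour")
# -> 2,941,176,471 queries per hour, i.e. roughly 3 billion users
#    each asking one question per hour
```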

Here's what is not disclosed:

1. 0.34 Wh is a best-case figure for a very lightweight, non-reasoning text model producing a single answer. Many state-of-the-art models are large reasoning models that generate multiple candidate responses before settling on one, and image- and video-generating models are orders of magnitude more energy-intensive.

2. 0.34 Wh covers only the GPU (graphics processing unit) that performs the AI-specific calculations. Other data centre components also draw power, including storage, orchestration, networking, water cooling, climate control, lighting, and sensors.

3. Idle infrastructure has to be “kept warm” and draws power even when not in use. Demand arrives in surges, so AI companies must provision for peak load and keep under-utilised hardware online during off-peak hours.

4. Training costs far more than inference (answering questions). A single training run for GPT-4 is estimated to have consumed around 60 GWh, a little more than 0.34 Wh (see the sketch after this list).

5. The hardware lifecycle is not taken into account. GPUs wear out quickly under this kind of sustained load, and manufacturing, testing, and transporting them all add to the energy bill.
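
To put point 4 in numbers (a sketch assuming the widely cited ~60 GWh estimate for GPT-4’s training run, alongside the 0.34 Wh per-query figure):

```python
# One GPT-4-scale training run (~60 GWh, a widely cited estimate)
# expressed as a multiple of the 0.34 Wh per-query figure.
WH_PER_QUERY = 0.34
TRAINING_RUN_GWH = 60

training_run_wh = TRAINING_RUN_GWH * 1e9    # 1 GWh = 1e9 Wh
print(f"{training_run_wh / WH_PER_QUERY:,.0f} queries")
# -> 176,470,588,235: a single training run costs as much energy as
#    roughly 176 billion answered questions
```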

Having said this, the technicalities of what counts as AI power consumption are almost a moot point; all we need to do is look at the numbers at a macro scale. From 2005 to 2017, the electricity consumption of data centres remained flat. Between 2017 and 2023, it doubled. In 2024, data centre energy consumption was approximately 200 TWh (200,000 GWh), comparable to the annual consumption of a small country. This is only projected to grow, with plans to triple AI data centre capacity between 2024 and 2028, and that is in the US alone; China is reported to be scaling its AI infrastructure even faster.

I have no doubt that the responsibility for offsetting this consumption will be pushed onto consumers. The UK has already called on the population to delete their emails to save on data centre costs. Next you may be asked to stop saying “Hello” to your favourite AI, but more likely you’ll simply pay for it through your energy bill or AI subscription.
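
For scale, walking that 2024 figure down the unit ladder from the top of this piece (simple arithmetic; no inputs beyond the figures already quoted):

```python
# Walking the 2024 figure of 200 TWh down the unit ladder
# (1 TWh = 1,000 GWh, and so on down to Wh).
twh_2024 = 200
gwh = twh_2024 * 1_000       # 200,000 GWh
mwh = gwh * 1_000            # 200,000,000 MWh
kwh = mwh * 1_000            # 200,000,000,000 kWh
wh  = kwh * 1_000            # 2e14 Wh

# At 0.34 Wh per question, that is the energy of:
print(f"{wh / 0.34:,.0f} queries")   # ~588 trillion
```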

15 September 2025
