AI, but at What Cost? Breakdown of AI’s Carbon Footprint
Is it efficient? Is it green? Far from it.
Artificial Intelligence is exciting—the technological breakthrough that took over our lives shows no signs of slowing down.
While I acknowledge the copyright issues that stem from AI usage, I cannot deny that it is a powerful tool when placed in the right hands for the right purpose. But with great power comes great responsibility.
Amid the excitement and tech moguls' gold rush to the next model, have we stopped to consider the environmental cost of running these models—with their hundreds of billions of parameters—for hours on end, scaled across multiple instances, serving millions of users simultaneously?
As a long-time advocate of the "minimal web" concept, I've noticed a contradiction: on one hand, all of us are eager to run towards technological advancements such as AI; on the other hand, everyone is concerned about the planet's health. Yet only a few consider the conflict between the two: the massive energy cost of AI. Is it efficient? Is it green? Far from it.
AI Energy Consumption
Can we calculate the energy cost of AI models available as online services? The truth is, we can't, not without access to their data centers. But we can arrive at a reasonably realistic estimate.
On my older GPU (roughly equivalent to a GTX 1060), generating an image with Stable Diffusion took 10 minutes. My online research showed that the Nvidia A100, which consumes up to 400 W under load, is the most commonly used GPU for AI/ML in the cloud.
The Nvidia A100 is a very powerful GPU designed specifically for AI/ML workloads. Based on online benchmarks I've found, it can generate a Stable Diffusion image in under 5 seconds.
Using the formula E = P * T, running an Nvidia A100 at full load for 5 seconds consumes 400 W × 5 s = 2,000 joules, which works out to roughly 0.56 watt-hours, or approximately 0.5 Wh per image.
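To make the arithmetic explicit, here is a minimal Python sketch of that per-image calculation. The 400 W power draw and the 5-second generation time are the assumptions above, not measured values:

```python
# Rough per-image energy estimate for Stable Diffusion on an Nvidia A100.
# Assumptions: ~400 W power draw under load, ~5 seconds per image.
GPU_POWER_WATTS = 400
SECONDS_PER_IMAGE = 5

# E = P * T, converted from watt-seconds (joules) to watt-hours
energy_wh_per_image = GPU_POWER_WATTS * SECONDS_PER_IMAGE / 3600
print(f"~{energy_wh_per_image:.2f} Wh per image")  # ~0.56 Wh, i.e. roughly 0.5 Wh
```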
Midjourney and similar AI image generation services typically generate 4 images per prompt. Assuming these services use an array of Nvidia A100s, we can estimate they consume 2 watt-hours of energy per prompt, per user. As of January 2025, Midjourney has nearly 20 million daily active users.
2 Wh × 20,000,000 users × 24 hours = 960,000,000 Wh = 960,000 kWh
Based on these calculations, services like Midjourney consume around 960,000 kWh per day, assuming each user executes one prompt per hour on average. The actual energy consumption is likely even higher.
This amount of energy alone could power over 25,000 average-sized households for a day, assuming a typical household consumes around 30 kWh daily.
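The same fleet-level estimate, expressed as a short Python sketch. The per-prompt figure, the usage pattern, and the 30 kWh per day household consumption are assumptions rather than measured data:

```python
# Rough daily energy estimate for a Midjourney-scale image generation service.
# Assumptions: 2 Wh per prompt (4 images at ~0.5 Wh each), ~20M daily users,
# one prompt per user per hour, and ~30 kWh/day for an average household.
WH_PER_PROMPT = 2
DAILY_USERS = 20_000_000
PROMPTS_PER_USER_PER_DAY = 24        # one prompt per hour, around the clock
HOUSEHOLD_KWH_PER_DAY = 30           # assumed average household consumption

daily_kwh = WH_PER_PROMPT * DAILY_USERS * PROMPTS_PER_USER_PER_DAY / 1000
households_powered = daily_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"{daily_kwh:,.0f} kWh per day")            # 960,000 kWh
print(f"~{households_powered:,.0f} households")   # ~32,000 households
```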
Keep in mind that this example only covers image generation services. Let's not forget other AI services like ChatGPT, Gemini, Claude, and DeepSeek, along with all their versions and variations. The list, and the energy waste, continues to grow.
How can we, as developers, researchers, and entrepreneurs, collaborate to address this emerging issue? What steps can we take to promote efficient and responsible energy use when it comes to AI? I'd love to hear your thoughts on this challenge and potential solutions in the comments below.