LinkedIn post for The Sustainable AI Paradox
AI has a carbon footprint. If you work in sustainability, you already feel the tension: you're deploying AI to track emissions while generating new ones.
Here’s what the numbers actually say:
A single ChatGPT query: ~0.34 Wh, comparable to a Google search.
A complex reasoning query: up to 50x more.
Training GPT-4: ~50 GWh.
Global data centers: 415 TWh in 2024, projected to double by 2030.
Water matters too. A 100-word AI response costs roughly 500ml of cooling water.
And carbon depends on where you run it. The same model trained on France’s nuclear grid: 25 tCO2e. On the US average grid: 502 tCO2e. A 20x difference.
But here’s the other side of the ledger:
The IEA (independent, not an AI company) estimates AI’s emissions reduction potential at 1,400 Mt CO2 by 2035. AI’s own data center emissions: 300-500 Mt. That’s a 3-4x net positive.
The concrete wins are already here. DeepMind cut data center cooling energy by 40%. BrainBox AI saved 7.98M kWh across 600 stores in a year. AI compressed battery research from 500+ days to 16 days.
The question isn’t whether AI has an environmental cost. It does. The question is whether you deploy it deliberately, track its impact, and direct it toward outcomes that exceed that cost.
Five things you can do now:
- Right-size models (MoE uses ~1/3 the energy of dense models)
- Track your footprint (CodeCarbon, SCI for AI standard)
- Choose low-carbon compute regions (10-20x variation)
- Optimize inference (it dominates energy at scale)
- Connect your FinOps to GreenOps (cost tracking is carbon tracking)
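The regional spread above is simple arithmetic: emissions = energy consumed × grid carbon intensity. A minimal Python sketch of that conversion, using illustrative round-number intensities (~56 gCO2e/kWh for a nuclear-heavy grid, ~380 gCO2e/kWh for a mixed grid) and a hypothetical training-run energy figure, not the exact data behind the numbers in this post:

```python
def emissions_tco2e(energy_kwh: float, grid_gco2e_per_kwh: float) -> float:
    """Convert energy use and grid carbon intensity to tonnes of CO2e."""
    return energy_kwh * grid_gco2e_per_kwh / 1_000_000  # grams -> tonnes

# Hypothetical workload: a 450 MWh training run.
TRAINING_ENERGY_KWH = 450_000

low_carbon_grid = emissions_tco2e(TRAINING_ENERGY_KWH, 56)    # ~nuclear-heavy
average_grid = emissions_tco2e(TRAINING_ENERGY_KWH, 380)      # ~mixed grid

print(f"Low-carbon grid: {low_carbon_grid:.1f} tCO2e")
print(f"Average grid:    {average_grid:.1f} tCO2e")
print(f"Spread:          {average_grid / low_carbon_grid:.1f}x")
```

The same formula is what makes FinOps data reusable for GreenOps: if you already meter kWh (or can derive it from spend), multiplying by the intensity of each region you run in gives a first-pass carbon number per workload.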
I wrote up the full analysis with all the sources. Link in comments.
#sustainability #AI #ESG #greencomputing #carbonfootprint #FinOps

