The linked article is a good, accessible summary of where we stand on AI’s energy footprint. The tl;dr: AI’s current energy footprint is modest (comparable to streaming video), but demand is growing fast, reasoning models use 10–100x more energy than basic queries, and efficiency gains keep getting reinvested into more capability rather than banked as savings. The bigger question is what electricity powers the data centers: clean grid = net climate okay; gas/coal grid = real problem.
Stop feeling guilty about prompts. Your Wh per query is not the lever that matters. You’ll do more climate good by eating one less steak, taking one fewer flight, or voting for better energy policy than by boycotting LLMs. What matters at the individual level is where you direct your attention. Push for faster deployment of clean generation to meet data center load; grid interconnection queues, nuclear licensing, transmission lines, and permitting reform are the bottlenecks, not GPUs.
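The "Wh per query is not the lever" claim is easy to sanity-check with arithmetic. A minimal sketch, using illustrative figures that are assumptions on my part, not from the article (per-query energy, grid carbon intensity, and flight emissions are all rough order-of-magnitude estimates):

```python
# Back-of-envelope: per-query LLM emissions vs. one round-trip flight.
# All figures below are illustrative assumptions, not measurements:
#   - 0.3 Wh per chat query (a commonly cited order-of-magnitude estimate)
#   - 0.4 kg CO2 per kWh (roughly a mixed fossil/clean grid)
#   - 1,000 kg CO2 per passenger for a long-haul round-trip flight

WH_PER_QUERY = 0.3
KG_CO2_PER_KWH = 0.4
FLIGHT_KG_CO2 = 1000.0

# Convert Wh -> kWh, then multiply by grid intensity to get kg CO2 per query.
kg_co2_per_query = (WH_PER_QUERY / 1000) * KG_CO2_PER_KWH
queries_per_flight = FLIGHT_KG_CO2 / kg_co2_per_query

print(f"{kg_co2_per_query * 1000:.2f} g CO2 per query")
print(f"{queries_per_flight:,.0f} queries equal one flight")
```

Under these assumptions a query is on the order of a tenth of a gram of CO2, and skipping one flight offsets millions of queries. Note that the grid-intensity term is the one that varies most, which is exactly the article's point: on a clean grid it drops toward zero.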