My question regrettably left an extrapolation unstated: what happens when we all "just google-ai it" whenever we're bored?
Of course I don't think the energy usage of your individual questions is an issue. I also appreciate you gathering and reporting that data. I didn't mean to come across as critical of your personal use of AI to do so.
Basically: when something useful gets cheaper, people tend to use far more of it, so total energy usage can still increase materially.
I do think there's a risk of that happening here: an AI-assisted search is inevitably more expensive than a single web search (since it runs potentially dozens of additional searches on top of the LLM inference cost).
I could counter that by noting that I can now resolve a complex question in 30 seconds with a single typed prompt, where beforehand I might have spent 10 minutes running my laptop and performing dozens of searches by hand, likely spending more energy overall.
... but actually, for trivial curiosity questions I just wouldn't have bothered looking for an answer at all.
I value curiosity deeply though, so maybe it's worth the world burning a few more gigawatts if the result is a more curious population?
I can accept that these questions are more energy-intensive than simpler prompts: an AI-assisted search runs dozens of prompts in a chain to answer a single question.
Best estimates I've seen are that a single prompt is equivalent to running an oven for a few seconds.
I'm OK with my curiosity running an oven for a full minute!
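A quick back-of-envelope sketch of that comparison. The wattage and per-prompt figures here are my assumptions for illustration (a typical oven element and a mid-range published per-prompt estimate), not measurements:

```python
# Back-of-envelope check of the oven comparison.
# Assumed figures, not measurements: a 2.4 kW oven element and
# roughly 2 Wh of energy per LLM prompt.
OVEN_WATTS = 2400
WH_PER_PROMPT = 2.0

def oven_seconds(prompts: int) -> float:
    """Seconds of oven time matching the energy of `prompts` prompts."""
    energy_wh = prompts * WH_PER_PROMPT
    return energy_wh * 3600 / OVEN_WATTS

print(f"1 prompt   = {oven_seconds(1):.0f}s of oven time")
print(f"20 prompts = {oven_seconds(20):.0f}s of oven time")
```

Under those assumptions, one prompt is about three seconds of oven time, and a chain of twenty prompts lands right around the one minute I said I'm comfortable with.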
Here are my collected notes on AI energy usage: https://simonwillison.net/tags/ai-energy-usage/