Basically: if something useful gets cheaper, people may end up using it so much more that total energy usage increases materially.
I do think there's a risk of that happening here: an AI-assisted search is inevitably more expensive than a single web search, since it potentially runs dozens of additional searches on top of the LLM inference cost.
I could counter that and say that now I can resolve a complex question in 30s with a single typed prompt, where beforehand I might have spent 10 minutes running my laptop and running dozens of searches by hand, for a net decrease in spent energy.
... but actually, for trivial curiosity questions I just wouldn't have bothered looking for an answer at all.
I value curiosity deeply though, so maybe it's worth the world burning a few more gigawatts if the result is a more curious population?