IMHO, you wouldn't need to write an article like this if people actually felt that AI adds irreplaceable value to their lives.
Example:
As of now, AI is a big toy whose use you try to justify. A Google search / full-text search is much more efficient than an AI summary, which by definition you should verify afterwards anyway.
You try to justify spending more electricity on a technology where we already have working solutions, solutions we will still need in the future.
PS: I personally think the fundamental flaw in your article is that you assume something can be replaced when that's often not the case, or you don't compare it to the currently most-used solution. Example: most books aren't printed anymore but only published digitally. The books that are still printed need to exist in print, as reference copies, for long-term archiving, or for book lovers. So you can't claim 3000W of savings just because something is no longer printed.
@themurphy @rigatti There is one difference ... LLMs can't simply be made more efficient; there is an inherent limitation in the technology.
https://blog.dshr.org/2021/03/internet-archive-storage.html
In 2021 they used 200 PB, and they certainly didn't copy the complete internet. Now ask yourself whether all that information can fit into a 1 TB model without losing any of it. (Side note: DeepSeek R1 is 404 GB, so not even 1 TB, and local LLMs are usually < 16 GB.)
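The size gap can be sketched as a quick back-of-the-envelope calculation. The figures below are the ones quoted in this thread (200 PB archive, 404 GB DeepSeek R1, < 16 GB local models); treat them as order-of-magnitude assumptions, not measurements:

```python
# Rough ratio between the Internet Archive's 2021 storage footprint
# and typical LLM weight sizes (figures from the thread above).
archive_bytes = 200 * 10**15      # ~200 PB stored by the Internet Archive in 2021
deepseek_r1_bytes = 404 * 10**9   # ~404 GB of DeepSeek R1 weights
local_llm_bytes = 16 * 10**9      # a typical local model, < 16 GB

# How many times smaller the model is than the archived data.
print(f"R1 vs archive:        ~{archive_bytes / deepseek_r1_bytes:,.0f}x smaller")
print(f"local LLM vs archive: ~{archive_bytes / local_llm_bytes:,.0f}x smaller")
```

Even the largest open model is roughly half a million times smaller than the 2021 archive, so whatever it stores is necessarily lossy.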
This technology has never been, and will never be, able to replicate the original information 100%.
It has its uses (machine learning has been around much longer) but it is not what people want it to be (imho).