joonazan


My guess would be that using a desktop computer to type the queries and read the results consumes more power than the LLM itself, at least for models that answer quickly.
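A quick back-of-envelope sketch of that comparison, using illustrative figures I'm assuming here (roughly 100 W for a desktop plus monitor, a commonly cited ballpark of ~0.3 Wh per chat query, and a minute spent typing and reading):

```python
# Back-of-envelope energy comparison (all numbers are assumptions, not measurements).
DESKTOP_WATTS = 100.0      # assumed: desktop + monitor draw
SESSION_SECONDS = 60.0     # assumed: time spent typing the query and reading the answer
QUERY_WH = 0.3             # assumed: often-cited ballpark for one chat-style inference

desktop_wh = DESKTOP_WATTS * SESSION_SECONDS / 3600.0  # watt-hours used at the desk

print(f"desktop side: {desktop_wh:.2f} Wh")  # ~1.67 Wh
print(f"one query:    {QUERY_WH:.2f} Wh")
print(f"ratio:        {desktop_wh / QUERY_WH:.1f}x")
```

Under those assumptions the desk-side energy is a few times the per-query figure; the point of the sketch is just that the numbers are in comparable ranges, not a precise accounting.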

The expensive part is training a model; inference is most likely not sold at a loss, so a single query can't consume an unreasonable amount of energy.

Instead of this ridiculous energy argument, we should focus on the fact that AI (and other products money is thrown at) isn't actually that useful, but companies control the narrative. AI is particularly successful here: every CEO wants in on it, and people are afraid it is so good it will end the world.