this post was submitted on 28 Apr 2026
Fuck AI

[–] jj4211@lemmy.world 5 points 1 day ago

> Stuff like Cerebras API, anyone hosting Deepseek v4, self hosting Qwen 27B or RAG models or whatever all use less energy than your computer will use as you read this comment.

This is just not the case. A couple of minutes of an idling end-user device does not use as many joules as a few seconds of inference on a self-hosted model. There are other tasks that can be that intensive, but reading static text in a browser won't do it. That's not to say it's an unforgivable waste of resources on a personal level or anything, just that the comparison is a bit busted.
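A rough back-of-envelope check makes the point. Every number here is an assumption for illustration, not a measurement: roughly 10 W for an idle laptop, roughly 350 W of GPU board power, and a few seconds of generation.

```python
# Illustrative figures only; real hardware varies widely.
IDLE_DEVICE_WATTS = 10   # assumed laptop draw while reading a page
READ_SECONDS = 120       # assumed two minutes of reading
GPU_WATTS = 350          # assumed board power of one inference GPU
INFER_SECONDS = 5        # assumed generation time for one reply

# Energy (joules) = power (watts) * time (seconds)
reading_joules = IDLE_DEVICE_WATTS * READ_SECONDS    # 1200 J
inference_joules = GPU_WATTS * INFER_SECONDS         # 1750 J

print(f"reading: {reading_joules} J, inference: {inference_joules} J")
```

Even with these generous assumptions, a few seconds of GPU inference edges out two full minutes of idle reading, which is the opposite of the quoted claim.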

The hosted models, in pursuit of going faster, take disproportionately more energy, analogous to how an engine at redline burns far more fuel per mile than one at a modest cruise.
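The redline point can be framed as joules per token: energy cost per token is power divided by throughput, so if pushing hardware harder raises power faster than it raises tokens/sec, each token gets more expensive. The operating points below are made-up numbers for illustration, not benchmarks of any real service.

```python
def joules_per_token(watts: float, tokens_per_second: float) -> float:
    """Energy cost of one generated token at a given power draw and speed."""
    return watts / tokens_per_second

# Assumed operating points (hypothetical):
modest = joules_per_token(watts=300, tokens_per_second=60)    # 5.0 J/token
redline = joules_per_token(watts=700, tokens_per_second=100)  # 7.0 J/token

print(f"modest: {modest} J/token, redline: {redline} J/token")
```

Here the "redline" setup is faster in wall-clock terms but pays 40% more energy per token, which is the disproportionality the comment describes.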