this post was submitted on 01 Dec 2025
302 points (93.4% liked)

Fuck AI

[–] Lojcs@piefed.social 7 points 6 days ago* (last edited 6 days ago) (1 children)
  1. People are using it as a replacement for another person, as a search engine, or to generate media. These aren't things you used to use a PC for.
  2. It's factually not.
  3. Obviously. I'm not saying price increases are a good thing. My point is that higher prices won't lead to people flocking to LLMs; they'll keep using what they have even if it's slow.
  4. Yet for OOP's theory to be viable, prices need to stay high for longer than current computers remain usable.
  5. I don't question that the AI boom is the cause of the price increases. The problem is that even if you alter the theory to be about cloud computing in general, the fact that we still have PCs disproves it.
  6. I didn't see anything about this being about local LLMs. I'm sorry for the people whose primary PC use case was self-hosted LLMs but who didn't have the memory to run them.

Maybe this'll slow down the adoption of self-hosted LLMs, but most people either need a computer for something they can't use an LLM for, or they already use an online LLM for it.

[–] Grimy@lemmy.world -1 points 6 days ago (1 children)

> Maybe this'll slow down the adoption of self-hosted LLMs

That is what the tweet is about...?

The whole point is that LLMs need resources that the current average computer doesn't have, and they are locking away the higher-end computers behind price hikes.

They are making it so online LLMs are the only option.

> I didn't see anything about this being about local LLMs

What? He talks about selling computation as a service and specifically mentions ChatGPT. It's not about Stadia.

[–] Postimo@lemmy.zip 3 points 6 days ago

They mention "all of your computation as a service" and "ChatGPT to do everything for you". Several other comments in the thread, the person you're replying to, and I all read this as being about pushing people towards cloud computing. I don't think that's an unreasonable read, especially considering that the major hardware price spike is in RAM, not VRAM or graphics cards, which would be more relevant to self-hosted LLMs.

Further, local hosting of LLMs is already pretty far outside the mindset of any regular user, and will likely never be comparable to cloud LLMs. The idea that they are intentionally driving up RAM prices to hundreds of dollars as a direct method of boxing out the self-hosted-LLM Linux nerds who want DRAM-bound models is, if anything, even more absurd.
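For a sense of scale on what "DRAM-bound" means here, a minimal back-of-envelope sketch (the model sizes and quantization widths below are my own illustrative assumptions, not anything from the tweet): a model's weights alone take roughly parameter count × bytes per parameter, plus some headroom for the KV cache and the runtime itself.

```python
# Back-of-envelope memory estimate for running an LLM locally in DRAM.
# All model sizes and quantization choices below are illustrative assumptions.

def weights_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate memory for the model weights alone, in gigabytes."""
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return bytes_total / 1e9

# Hypothetical configurations: (label, params in billions, bits per weight)
configs = [
    ("7B model,  4-bit quant", 7, 4),
    ("7B model,  16-bit fp16", 7, 16),
    ("70B model, 4-bit quant", 70, 4),
    ("70B model, 16-bit fp16", 70, 16),
]

for label, params, bits in configs:
    gb = weights_gb(params, bits)
    # Add ~20% headroom for KV cache, activations, and the runtime.
    print(f"{label}: ~{gb:.1f} GB weights, ~{gb * 1.2:.0f} GB with overhead")
```

So a 7B model at 4-bit fits comfortably on a typical 16 GB machine, while a 70B model at fp16 wants on the order of 140 GB and up, which is exactly the tier of DRAM that just got expensive.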