this post was submitted on 04 Feb 2026
619 points (98.6% liked)

Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

[–] pseudo@jlai.lu 52 points 5 days ago (4 children)

Why use an LLM to solve a problem you could solve with an alarm clock and a Post-it?

[–] enbiousenvy@lemmy.blahaj.zone 34 points 5 days ago* (last edited 5 days ago) (2 children)

programming nitpicks (for lack of a better word) that I used to hear:

  • "don't use u32, you won't need that much data"
  • "don't use using namespace std"
  • "sqrt is expensive, if necessary cache it outside loop"
  • "I made my own vector type because the one from standard lib is inefficient"

then this person implements time checking via an LLM over the network at a cost of $0.75 per check lol
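
For anyone who hasn't run into the sqrt one: it usually amounts to something like this hypothetical distance check (names and numbers made up for illustration), where the "optimization" is hoisting the expensive work out of the loop, or skipping sqrt entirely by comparing squared values:

```cpp
#include <vector>

struct Point { double x, y; };

// Count points within `radius` of the origin.
int count_within(const std::vector<Point>& points, double radius) {
    const double radius_sq = radius * radius;  // hoisted: computed once, outside the loop
    int count = 0;
    for (const auto& p : points) {
        // The nitpicked version: if (std::sqrt(p.x*p.x + p.y*p.y) <= radius)
        // The "optimized" version: compare squared values, no sqrt at all.
        if (p.x * p.x + p.y * p.y <= radius_sq) {
            ++count;
        }
    }
    return count;
}
```

Micro-savings on that scale are what people used to argue over, which is what makes a $0.75 network round trip per time check so funny.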

[–] cecilkorik@piefed.ca 20 points 5 days ago (1 children)

We used to call that premature optimization. Now we complain that tasks don't have enough AI de-optimization. We must all redesign things that we have done in traditional, boring, not-AI ways, and create new ways to do them that are slower, millions or billions of times more computationally intensive, more random, and less reliable! The market demands it!

[–] very_well_lost@lemmy.world 15 points 5 days ago* (last edited 4 days ago)

I call this shit zero-sum optimization. In order to "optimize" for the desires of management, you always have to deoptimize something else.

Before AI became the tech craze du jour, I had a VP get obsessed with microservices (because that's what Netflix uses, so it must be good). We had to tear apart a mature and very efficient app and turn it into hundreds of separate microservices... all of which took ~100 milliseconds to interoperate across the network. Pages that used to take 2 seconds to serve now took 5 or 10 because of all the new latency required to do things they used to be able to do basically for free. And it's not like this was a surprise. We knew this was going to happen.

But hey, at least our app became more "modern" or whatever...

[–] AnyOldName3@lemmy.world 9 points 5 days ago

using namespace std is still an effective way to shoot yourself in the foot, and, if anything, it's a bigger problem than it was in the past now that std has decades' worth of extra stuff in it that could have a name collision with something in your code.
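
A hypothetical example of the kind of collision meant here: std::byte only arrived in C++17, and plenty of older codebases already had their own byte alias, so the using-directive suddenly turns working code into a compile error.

```cpp
#include <cstddef>  // defines std::byte since C++17

using namespace std;         // pulls every std name into this scope

typedef unsigned char byte;  // perfectly legal pre-C++17 alias

int main() {
    byte b = 0xFF;  // error: reference to 'byte' is ambiguous
                    // (::byte vs std::byte); qualify the name or drop
                    // the using-directive and it compiles again.
    return static_cast<int>(b);
}
```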

[–] Rooster326@programming.dev 5 points 4 days ago

Nooo, you don't understand. It needs to be wrong up to 60% of the time. He would need a broken clock, a window, and a Post-it note.

[–] rumba@lemmy.zip 2 points 5 days ago

For the clicks.

[–] Prior_Industry@lemmy.world 1 points 5 days ago (1 children)

Or, if you're being fancy, poll a time server.
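
Which is only a few dozen lines if you really want it; here's a minimal SNTP sketch (assuming a POSIX system and that pool.ntp.org answers, with basically no timeout or error handling):

```cpp
#include <arpa/inet.h>   // ntohl
#include <netdb.h>       // getaddrinfo
#include <sys/socket.h>
#include <unistd.h>      // close
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <ctime>

int main() {
    // 48-byte SNTP request: LI=0, VN=3, Mode=3 (client) packed into byte 0.
    std::uint8_t pkt[48] = {0x1B};

    addrinfo hints{}, *res = nullptr;
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_DGRAM;
    if (getaddrinfo("pool.ntp.org", "123", &hints, &res) != 0) return 1;

    int sock = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (sock < 0) return 1;

    sendto(sock, pkt, sizeof pkt, 0, res->ai_addr, res->ai_addrlen);
    if (recv(sock, pkt, sizeof pkt, 0) < 48) return 1;  // blocks until the server replies

    // Transmit timestamp (seconds since 1900) sits at bytes 40-43, big-endian.
    std::uint32_t secs_be;
    std::memcpy(&secs_be, pkt + 40, 4);
    time_t unix_time = ntohl(secs_be) - 2208988800u;  // 1900 -> 1970 epoch offset

    std::printf("%s", std::ctime(&unix_time));

    freeaddrinfo(res);
    close(sock);
    return 0;
}
```

Still a lot more machinery than the alarm clock, which I suppose is the point.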

[–] pseudo@jlai.lu 2 points 5 days ago (1 children)

That would work great as well, but the alarm clock is a technology developed in the Middle Ages.

[–] Prior_Industry@lemmy.world 2 points 5 days ago* (last edited 5 days ago) (1 children)

Or go off-grid style and leave your curtains open 😂

[–] pseudo@jlai.lu 1 points 4 days ago (1 children)

You just need a bit of mud to draw a reminder on the window.

[–] Prior_Industry@lemmy.world 2 points 3 days ago

Tactile touch interface