this post was submitted on 15 Feb 2026
1606 points (99.6% liked)

Fuck AI

6809 readers
722 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago

link to archived Reddit thread; original post removed/deleted

(page 3) 50 comments
[–] ladicius@lemmy.world 8 points 2 months ago

Nice. Really, I like it when management is dumb as fuck. It's a world of never ending joy.

[–] Ghostie@lemmy.zip 8 points 2 months ago

Burn, baby! Burn

[–] BlameTheAntifa@lemmy.world 7 points 2 months ago
[–] BroBot9000@lemmy.world 6 points 2 months ago

Bwahahahahahahha 😂

[–] kokesh@lemmy.world 6 points 2 months ago

I must say I love this very much. Only this may push the idiots leading the companies that use this crap into ditching it.

[–] MoonManKipper@lemmy.world 5 points 2 months ago (10 children)

If true, they’re all idiots, but I don’t believe the story anyway. All the question-answering data LLMs I’ve seen use the LLM to write SQL queries against your databases and then wrap the output in a summary, so the summary is easy to check and very unlikely to be significantly wrong. AI/ML/statistics and code are tools: use them for what they’re good at, don’t use them for what they’re not, and treat hype with skepticism.
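For what it's worth, the pattern described above looks roughly like this. A minimal sketch, assuming an in-memory SQLite table; `fake_llm_sql` is a hypothetical stand-in for the model call that would translate the question into SQL:

```python
import sqlite3

def fake_llm_sql(question: str) -> str:
    # Hypothetical: a real system would prompt an LLM here to
    # translate the natural-language question into SQL.
    return "SELECT SUM(amount) FROM sales"

def answer(question: str, conn: sqlite3.Connection) -> str:
    sql = fake_llm_sql(question)             # step 1: LLM writes the query
    (total,) = conn.execute(sql).fetchone()  # step 2: run it on real data
    # step 3: wrap the result in a summary; the number comes from the
    # database, not the model, so the summary is easy to audit.
    return f"Total sales: {total} (query: {sql})"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?)", [(10.0,), (5.0,)])
print(answer("What are total sales?", conn))
```

The point being: in this architecture the figures in the summary are pulled from the database by the query, so a fabricated number would require the SQL itself to be wrong, which is checkable.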

[–] jj4211@lemmy.world 6 points 2 months ago

I'm on the fence, but I'll say that if, for whatever reason, it was never actually connected to the data, or the connection had some flaw, I could totally believe it would just fabricate a report that looks consistent with what was asked for. Maybe it failed to convey that an error occurred. Maybe it did convey the lack of data, and the user, without trying to understand the problem himself, just told the AI to fix it, triggering it to generate a narrative consistent with having fixed the problem without actually being able to fix it.

Sure, a sanity check should make it fall apart, but that assumes anyone bothers. Some people have crazy confidence in LLMs and never even check.

[–] sp3ctr4l@lemmy.dbzer0.com 5 points 2 months ago

Clearly you've never worked as a data analyst, or you would know that the vast majority of upper management and C Suite are, in fact, all fucking idiots.

They're generally where they are because of mutual secrets, nepotism, and who else is on their contact list.

[–] Cactusfighter@lemmy.org 5 points 2 months ago
[–] Avicenna@programming.dev 4 points 2 months ago* (last edited 2 months ago) (1 children)

I mean, it hallucinates numbers when you ask it to extract numeric data that's publicly available online, so yeah...
