[–] mech@feddit.org 78 points 4 hours ago (5 children)

It's so weird how these chatbots always pretend they learnt something after they fuck up.
They literally can't.

[–] frongt@lemmy.zip 16 points 2 hours ago (1 children)

They're not even pretending. The algorithm says the most likely response to "you fucked up" is "I'm sorry", so that's what it prints. There's zero psychological simulation going on, only statistical text generation.
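
To make that concrete, here's a toy sketch in Python (purely illustrative, not how any real model is implemented; the corpus, contexts, and responses are all made up). It reduces "statistical text generation" to picking the most frequent continuation ever seen for the current context. A real LLM does the same kind of thing per token over billions of learned weights, but there's no reflection anywhere in the loop.

```python
from collections import Counter, defaultdict

# Hypothetical "training data" of (context, response) pairs; every string
# here is invented for illustration.
corpus = [
    ("you fucked up", "I'm sorry, you're absolutely right."),
    ("you fucked up", "Apologies, I made a mistake."),
    ("you fucked up", "I'm sorry, you're absolutely right."),
    ("great job", "Thanks! Glad it helped."),
]

# Count how often each response follows each context.
table = defaultdict(Counter)
for context, response in corpus:
    table[context][response] += 1

def respond(context: str) -> str:
    # No remorse, no lesson learned: just the statistically most likely reply.
    return table[context].most_common(1)[0][0]

print(respond("you fucked up"))  # -> I'm sorry, you're absolutely right.
```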

[–] Hacksaw@lemmy.ca 8 points 1 hour ago

I actually didn't believe you, but it's literally true. First post, immediate apology.

[–] ech@lemmy.ca 19 points 3 hours ago

The program can't pretend any more than it can tell the truth. It's all just impressive regurgitation. Querying it as to why it "chose" to take any action is about as useful as interrogating a boulder on why it "chose" to roll through a house.

[–] SkaveRat@discuss.tchncs.de 18 points 4 hours ago

I mean, they probably do, until it gets purged from the context window. Then it just yolos again.
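
Roughly what that purge looks like, as a toy sketch (made-up window size and turn data, not any real chat product): the model's only "memory" is the transcript it gets re-sent each turn, and once that transcript outgrows the window, the oldest turns, apology included, simply fall out.

```python
# Hypothetical tiny context window, counted in words for simplicity.
MAX_TOKENS = 20

history = []  # list of (role, text) turns re-sent to the model each time

def add_turn(role: str, text: str) -> None:
    history.append((role, text))
    # Drop the oldest turns until the transcript fits the window again.
    while sum(len(t.split()) for _, t in history) > MAX_TOKENS:
        history.pop(0)

add_turn("user", "You deleted the production database.")          # 5 words
add_turn("assistant", "I am sorry, I will never do that again.")  # 9 words
add_turn("user", "Summarise this unrelated document: " + "word " * 12)

# The incident and the apology have already been purged; only the newest
# request remains, so next time it just yolos again.
print(history)
```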

[–] thisbenzingring@lemmy.today 2 points 4 hours ago

The next ingestion cycle will probably pick it up, but how do we know it'll use the information in any relevant way? 😶

[–] nymnympseudonym@piefed.social -4 points 3 hours ago (1 children)

"They literally can’t."

Only because we are still using vanilla LLMs instead of Mamba or JEPA.

Of course. If you shot yourself in the foot with a gun, the solution is surely a bigger gun.