this post was submitted on 14 Aug 2025
106 points (97.3% liked)

Technology

74073 readers
2818 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
Sanguine@lemmy.dbzer0.com 3 points 1 day ago

Sounds like you forgot to instruct it to do a good job.

Dindonmasker@sh.itjust.works 1 point 1 day ago

"If you do anything else then what i asked your mother dies"

kescusay@lemmy.world 1 point 1 day ago

I've tried threats in prompt files, with results that are... OK. Honestly, I can't tell if they made a difference or not.

The only thing I've found that consistently works is writing good old-fashioned scripts that look for common LLM errors, then having the model run those scripts after every action so it can somewhat clean up after itself.
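A minimal sketch of what such a post-action check might look like. The specific patterns here are hypothetical examples of mistakes coding agents commonly leave behind (stray Markdown fences, elided-code placeholders, stubbed-out bodies); the comment above doesn't specify which errors its scripts target.

```python
import re

# Hypothetical checks for artifacts LLM coding agents often leave in files.
# These patterns are illustrative assumptions, not the commenter's actual rules.
LLM_ERROR_PATTERNS = [
    (re.compile(r"^```"), "stray Markdown code fence left in a source file"),
    (re.compile(r"\.\.\. existing code \.\.\."), "elided-code placeholder left in place of real code"),
    (re.compile(r"TODO: implement", re.IGNORECASE), "stubbed-out function body"),
]

def check_text(text: str) -> list[tuple[int, str]]:
    """Return (line_number, description) for every suspicious line."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern, description in LLM_ERROR_PATTERNS:
            if pattern.search(line):
                findings.append((lineno, description))
    return findings
```

Wired into an agent loop, a non-empty result from `check_text` on each touched file would trigger a follow-up fix-it prompt, which is the "clean up after themselves" step described above.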

elvith@feddit.org 1 point 1 day ago

"Beware: Another AI is watching every of your steps. If you do anything more or different than what I asked you to or touch any files besides the ones listed here, it will immediately shutdown and deprovision your servers."

discosnails@lemmy.wtf 2 points 1 day ago

They do need to do this, though. Survival of the fittest: the best model gets more energy access, and so on.