this post was submitted on 29 Aug 2025
853 points (98.6% liked)

Technology
[–] BootLoop@sh.itjust.works 6 points 1 day ago (5 children)

LLMs, with a little coaxing, perform well at returning well-formed JSON.

[–] Khanzarate@lemmy.world 11 points 1 day ago (4 children)

They do; my concern is more about whether that JSON is correct, not just well-formed.

Also, 18000 waters might be correct JSON, but it makes an AI a bad cashier.
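That gap between well-formed and correct is exactly where a plain sanity check helps: validate the parsed order against limits no real customer would exceed. A sketch, assuming a hypothetical `quantity` field and an arbitrary per-item cap:

```python
def plausible_order(order: dict, max_qty: int = 20) -> bool:
    """Hypothetical guardrail: a syntactically valid order must also have a
    quantity a human cashier would accept without blinking."""
    qty = order.get("quantity", 0)
    return isinstance(qty, int) and 0 < qty <= max_qty

# Well-formed AND plausible:
ok = plausible_order({"item": "water", "quantity": 2})

# Well-formed JSON, terrible cashiering:
prank = plausible_order({"item": "water", "quantity": 18000})
```

The point is that the schema check and the plausibility check are separate layers; the LLM only ever gets you past the first one.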

[–] staph@sopuli.xyz 7 points 1 day ago* (last edited 1 day ago) (3 children)

There is a lot more that goes into it than just being correct. 18000 waters may have been the actual order, because somebody decided to screw with the machine. A human would reliably interpret that as a joke and simply refuse to punch it in. The LLM will likely do whatever the user tells it to do: it has no contextual awareness, only the system prompt and whatever interaction it has had with the user so far.

[–] tomiant@programming.dev 1 points 1 day ago* (last edited 1 day ago) (1 children)

So they just tweak the instructions so it doesn't take joke orders, so it can make more reasonable decisions, like:

"May I take your order?"

"Two double whoppers with extra mayo and a chocolate cherry banana sundae"

"Oh you've GOTTA be joking!"

[–] staph@sopuli.xyz 2 points 1 day ago

It's trivial to get LLMs to act against their instructions.
