this post was submitted on 07 Mar 2026
875 points (97.4% liked)

[–] thebestaquaman@lemmy.world 17 points 15 hours ago (1 children)

I mean, there's a good reason the first rules of firearm safety are to always treat a weapon as loaded, and to never direct the weapon at something you aren't prepared to destroy. The key point being that you never know when some freak accident can happen with a loose pin, bad ammo, a broken spring, or just a person tripping and shaking the gun a bit too hard.

A gun should never go off by itself. You still treat it as if it can, because in the real world freak accidents happen.

[–] artyom@piefed.social 0 points 15 hours ago (1 children)

Sure. The point is that it's entirely possible to use a firearm safely. There is no safe way to use LLMs, because they "make decisions", for lack of a better phrase, on their own, without any user input.

[–] etchinghillside@reddthat.com 8 points 15 hours ago (1 children)

That is not at all how LLMs work. It's the software written around the LLM that aids it in constructing and running commands and "making decisions". That same software can also prompt the user to confirm an action, or sandbox the action in some way.
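
The architecture described above can be sketched in a few lines. This is a minimal illustration, not any particular product's implementation; `llm_complete` is a hypothetical stand-in for a real model call. The key point is that the model only emits text — the wrapper code decides whether a proposed command actually runs, and that is exactly where a confirmation prompt or sandbox can be inserted.

```python
from typing import Callable


def llm_complete(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM call.

    A real model returns free-form text; here we hard-code a proposed
    shell command to keep the sketch self-contained.
    """
    return "rm -rf ./build"


def agent_step(prompt: str, confirm: Callable[[str], bool]) -> str:
    """One step of an agent loop.

    The LLM proposes a command as plain text; the surrounding software
    (this function) decides whether to execute it. `confirm` is the
    gate — in practice an interactive "allow? [y/N]" prompt, a policy
    check, or a sandbox that only permits whitelisted actions.
    """
    command = llm_complete(prompt)
    if not confirm(command):
        # The wrapper refused; the model itself never ran anything.
        return "refused: " + command
    # A real agent would execute here, ideally inside a sandbox
    # (container, chroot, restricted user) rather than directly.
    return "executed: " + command
```

With a `confirm` that always declines, the "decision" the model made has no effect at all — which is the point: safety lives in the wrapper, not the model.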