this post was submitted on 05 Mar 2026
827 points (98.2% liked)

Technology
[–] architect@thelemmy.club 3 points 13 hours ago (6 children)

I can’t be the only one who thinks that if you do stupid illegal shit because your crazy uncle told you to, or voices in your head told you to, or an AI mirror told you to, you don’t get to use the excuse that you were just following orders from any of those sources.

[–] dream_weasel@sh.itjust.works 3 points 7 hours ago* (last edited 7 hours ago)

The difference is that when an LLM tells you, it's news.

Besides, what are you gonna do if you ask AI how many rocks to eat? NOT eat rocks? People can't handle responsibility like that.

[–] Snowclone@lemmy.world 6 points 10 hours ago* (last edited 10 hours ago)

That's not the problem. The problem is having a "let's turn Chris's mental illness, which has harmed no one so far, into everyone's violent problem!" machine.

That's a bad machine.

[–] Objection@lemmy.ml 2 points 10 hours ago

This is such an individualist framing.

[–] TheTimeKnife@lemmy.world 0 points 9 hours ago (1 children)

So you think it's simpler to solve mental illness than to regulate a few tech bros making suicide-assistance chat bots?

[–] Hazor@lemmy.world 0 points 7 hours ago

Not just suicide-assistance chat bots, but suicide-promotion chat bots.

[–] AeonFelis@lemmy.world 2 points 10 hours ago

Floridaman is not making any excuses here. He can't. Because he's dead.

[–] moonshadow@slrpnk.net 2 points 12 hours ago

Power imbalance is what validates that excuse. Orders from a crazy uncle are a great excuse, at least until you're 10 or so. A billion+ dollar LLM company has far more resources and capability, and therefore responsibility, than the poor bastards engaging with it.