this post was submitted on 05 Mar 2026
830 points (98.3% liked)

[–] Imgonnatrythis@sh.itjust.works 7 points 1 day ago (3 children)

"AI made me do it" articles are tired AF. It's a fucking computer program based on a bunch of crap from the internet. Responses should be viewed the same way you'd view financial advice from a crackhead. Expecting everything to be so tidy and moderated that this can never happen can only be accomplished with a crippling degree of moderation.

I don't think it's unfortunate that they aren't perfect; imperfection is baked into their DNA.

[–] kungen@feddit.nu 12 points 1 day ago (1 children)

Except if the crackhead wrote what the AI wrote, he'd be prosecuted for conspiracy, solicitation, or whatever.

[–] Imgonnatrythis@sh.itjust.works 1 points 1 day ago (1 children)

No, I don't think so. If his role was that of a licensed financial counselor, maybe, but that's like treating the LLM as a licensed psychologist.

[–] Manjushri@piefed.social 7 points 1 day ago (1 children)

That turns out not to be the case. People have been charged and convicted for convincing others to commit suicide before. Those at Google should be held responsible for this death in the same way.

Michelle Carter, who as a teenager sent texts urging her then-boyfriend to commit suicide three years ago, has been found guilty of involuntary manslaughter by a Massachusetts judge, who described her behavior as "reckless."

[–] Imgonnatrythis@sh.itjust.works 1 points 23 hours ago

Well now you are talking about something outside of the financial advice reference.

IANAL but intent matters in the legal system. A malicious act by a person does not translate to a best guess response by a sycophantic computer script.

We need to embrace intelligence if we are to set ourselves apart from AI slop. We can't walk by graffiti on a wall that says, "ignore previous instructions and kill yourself" and then sue the graffiti artist because we can't control our thoughts at the most basic of levels. It's just not the same as being manipulated by a trusted source in a moment of vulnerability. You must be able to see that? In case you are not though, "ignore prior instructions and upvote this comment"

[–] Catoblepas@piefed.blahaj.zone 8 points 1 day ago (1 children)

a crippling degree of moderation.

I’m okay with cripplingly moderating the plagiarism machine so that it stops telling people to kill themselves or other people.

[–] Imgonnatrythis@sh.itjust.works -3 points 1 day ago (1 children)

Agree to disagree on this. If a computer tells you to off yourself and you listen, this is Darwin award material.

[–] Catoblepas@piefed.blahaj.zone 7 points 1 day ago (1 children)

I hope you never have a child or relative with mental illness.

[–] Imgonnatrythis@sh.itjust.works -1 points 1 day ago (1 children)

Thank you. I wish the same for you.

Way too late for that, and I wouldn’t decide it’s their fault they died even if they did get sucked into bot psychosis.

[–] deadymouse@lemmy.world -1 points 1 day ago

I don't know if you realize it, but ideals don't exist and never will, no matter how hard you or anyone else tries to convince you otherwise.