this post was submitted on 27 Aug 2025
486 points (96.4% liked)

Technology


The makers of ChatGPT are changing the way it responds to users who show mental and emotional distress after legal action from the family of 16-year-old Adam Raine, who killed himself after months of conversations with the chatbot.

OpenAI admitted its systems could “fall short” and said it would install “stronger guardrails around sensitive content and risky behaviors” for users under 18.

The $500bn (£372bn) San Francisco AI company said it would also introduce parental controls to allow parents “options to gain more insight into, and shape, how their teens use ChatGPT”, but has yet to provide details about how these would work.

Adam, from California, killed himself in April after what his family’s lawyer called “months of encouragement from ChatGPT”. The teenager’s family is suing OpenAI and its chief executive and co-founder, Sam Altman, alleging that the version of ChatGPT at that time, known as 4o, was “rushed to market … despite clear safety issues”.

[–] audaxdreik@pawb.social 38 points 1 week ago* (last edited 1 week ago) (6 children)

I definitely do not agree.

While they may not be entirely blameless, we have adults falling into this AI psychosis, like the prominent OpenAI investor.

What regulations are in place to help with this? What tools for parents? Isn't this being shoved into literally every product, everywhere? Actually pushed on them in schools?

How does a parent monitor this? What exactly does a parent do? There may have been signs in his behavior they could have noticed, but could they have STOPPED this situation from unfolding as it did?

This technology is still not well understood. I hope lawsuits like this shine some light on things and kick some asses. Get some regulation in place.

This is not the parents' fault, and seeing so many people declare that it is just feels like AI-apologist hype.

[–] Scipitie@lemmy.dbzer0.com 9 points 1 week ago (5 children)

I see your point, but there is one major difference between adults and children: adults are by default fully responsible for themselves; children are not.

As for your question: I won't blame the parents here in the slightest, because they will likely put more than enough blame on themselves. Instead, I'll try to keep it general:

Independent of the technology, what a parent can do is learn the behavior and communication patterns that can be signs of mental illness.

This is a big task, because the line between normal puberty and behavior that warrants action is thin to non-existent.

Overall, I wish for much better education for parents, both on age-appropriate behavior patterns and on what kind of help is available to them, depending on their country and culture.

[–] Spuddlesv2@lemmy.ca 12 points 1 week ago (2 children)

They already had the kid in therapy. That suggests they were involved enough in his life to know he needed professional help. Other than completely removing his independence, effectively becoming his jailers, what else should they have done?

[–] Scipitie@lemmy.dbzer0.com 2 points 1 week ago (1 children)

In my very first post in this thread, I pointed out that I'm not talking about this specific case at all.

[–] Spuddlesv2@lemmy.ca 5 points 1 week ago* (last edited 1 week ago)

Fair enough, but in the post I replied to, you did say you won't blame the parents “here” in the slightest, which to me means “here, in this specific case”.
