Seven more families are now suing OpenAI over ChatGPT's role in suicides, delusions
(techcrunch.com)
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.
ChatGPT has a million people talking about suicide on it daily. It's more dangerous than cardiovascular disease in the US and completely dwarfs every single traffic and gun death. It needs to get Old Yeller'd.
That's not how it works. Talking about suicide does not equate to being encouraged to do it, nor does it equate to actual deaths.
By your logic, if a group acted out their violent fantasies in GTA 5 and then committed a shooting, I could say video games dwarf everything else by sheer number of users.
There do seem to be cases where ChatGPT can be tricked or bugged into encouraging suicide. That has to be looked into, but what you're advancing is pure, unadulterated exaggeration. For one, you are mixing up talking about suicide with being told to do it.
A mind vulnerable enough to openly contemplate suicide is a mind that should be nowhere near a stochastic parrot. It is wildly dangerous.