this post was submitted on 15 Nov 2025
311 points (93.3% liked)

Technology


This is a most excellent place for technology news and articles.


LLMDeathCount.com (llmdeathcount.com)
submitted 2 weeks ago* (last edited 2 weeks ago) by brianpeiris@lemmy.ca to c/technology@lemmy.world
(page 2) 18 comments
[–] chunes@lemmy.world -4 points 2 weeks ago

LLM bad, upvotes to the left please

[–] Fedditor385@lemmy.world -4 points 2 weeks ago (4 children)

I guess my opinion will be hugely unpopular, but it is what it is - I'd argue it's natural selection and not an issue with LLMs in general.

Healthy and (emotionally) intelligent humans don't get killed by LLMs. They know it's a tool, they know it's just software. It's not a person and it does not guarantee correctness.

If someone gets killed because an LLM told them so, that person was already in mental distress and ready to harm themselves. The LLM is basically just the straw that broke the camel's back. Same with physical danger: if you believe drinking bleach helps with back pain, there is nothing that can save you from your own stupidity.

LLMs are like a knife: a tool to prepare food or a weapon. It's up to the one using it.

[–] Sims@lemmy.ml -5 points 2 weeks ago (5 children)

I don't think "AI" is the problem here. Watching the watchers doesn't hurt, but I think the AI-haters are grasping at straws. In fact, compared to the actual suicide numbers, this "AI is causing suicide!" framing seems a bit contrived/hollow, tbh. Were the haters as active in noticing the 49 thousand suicide deaths every year, or did they only just now find it a problem?

Besides, if there's a criminal here, it would be the private corp that provided the AI service, not a broad category of technology like "AI". People who hate AI seem to really just hate the effects of capitalism.

https://www.cdc.gov/suicide/facts/data.html (this is for the US alone)

CDC overview: Over 49,000 people died by suicide in 2023 - one death every 11 minutes. Many adults think about or attempt suicide: 12.8 million seriously thought about suicide, 3.7 million made a plan, and 1.5 million attempted suicide.
