this post was submitted on 16 Aug 2025
873 points (99.0% liked)
People Twitter
you are viewing a single comment's thread
The current limitations of LLMs are built into how they fundamentally work. We would need something completely new. That is a fact.
Honestly, the thought of med students using them to pass exams scares me.
Sure, use them to replace the CEOs of some unimportant companies like facebook. But they are not for jobs where other people's lives are at stake. They inherently hallucinate (like many CEOs); it is built into how they work.
I don't think the bar will be where you're setting it.
Suppose a new cancer drug or something comes out that significantly improves patients' life expectancy and quality of life. In rare cases, however, it can cause serious liver complications that may be fatal. Should this drug be used, or not?
It's not trivial, but there's a chance that it would in fact be used.
My point about AI hallucinations is that they're the same kind of trade-off. If at some point it's proven that using these models leads to better patient outcomes, but they can have side effects, should they be outright discarded?
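To make that trade-off concrete, here's a toy expected-value sketch. All the numbers are invented for illustration (not real clinical data); the only point is that a rare, serious harm doesn't automatically outweigh a large benefit in the net outcome.

```python
# Purely illustrative expected-value sketch with made-up numbers;
# real drug-approval and AI-deployment decisions weigh far more than this.

# Hypothetical baseline: 5-year survival without the new drug.
p_survive_without = 0.40

# Hypothetical: the drug raises survival substantially but carries a
# rare risk of a fatal liver complication.
p_survive_with_drug = 0.55
p_fatal_side_effect = 0.01

# Net survival if the drug is used: the improved survival rate,
# discounted by the rare fatal side effect.
p_survive_net = p_survive_with_drug * (1 - p_fatal_side_effect)

print(f"Without drug: {p_survive_without:.1%} survive")
print(f"With drug (net of side effects): {p_survive_net:.1%} survive")
# With these made-up numbers the drug still comes out ahead (~54.5% vs 40%),
# which is the same net-benefit question raised about an AI tool that
# sometimes hallucinates but might improve outcomes overall.
```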