this post was submitted on 09 Feb 2026
558 points (98.9% liked)
Technology
You're over-egging it a bit. A well-written SOAP note, HPI, etc. should distill to a handful of possibilities, that's true. That's the point of them.
The fact that the LLM can interpret those notes 95% as well as a medically trained individual (per the article) and come up with the correct diagnosis is being a little undersold.
That's not nothing. Actually, that's a big fucking deal (tm) if you think through the edge-case applications. And remember, these are just general LLMs, and pretty old ones at that (ChatGPT-4 era). We're not even talking about a medical domain-specific LLM.
Yeah, I think there's more here to think about.
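To make the point concrete: the workflow the article describes, handing a SOAP note to a general-purpose LLM and asking for a ranked differential, can be sketched in a few lines. This is a minimal, hypothetical sketch; the helper names and the commented-out API call (client, model id, message format modeled on the common chat-completions style) are assumptions for illustration, not anything from the article.

```python
import re

def build_differential_prompt(soap_note: str, n: int = 5) -> list[dict]:
    """Turn a SOAP note into a chat-style prompt asking for a ranked differential.

    Hypothetical helper; the message format mirrors the common
    {"role": ..., "content": ...} chat-completions convention.
    """
    system = (
        "You are assisting with clinical reasoning. Given a SOAP note, "
        f"return the {n} most likely diagnoses, one per line, most likely first."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": soap_note},
    ]

def parse_differential(reply: str) -> list[str]:
    """Parse a newline-separated model reply into a ranked list of diagnoses,
    stripping leading bullets or numbering like '1.' or '-'."""
    ranked = []
    for line in reply.splitlines():
        line = re.sub(r"^\s*(?:[-•*]|\d+[.)])\s*", "", line).strip()
        if line:
            ranked.append(line)
    return ranked

# The actual call would look something like this (client and model names assumed):
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4", messages=build_differential_prompt(note))
#   ranked = parse_differential(resp.choices[0].message.content)
```

The interesting part isn't the plumbing, which is trivial, it's that a well-structured note already does most of the distillation, so even a general model only has to rank a narrow candidate set.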
If you think a word predictor is the same as a trained medical professional, I am so sorry for you...
Feel sorry for yourself. Your ignorance and biases are on full display.