this post was submitted on 13 May 2026
154 points (97.5% liked)

LinkedinLunatics

6826 readers

A place to post ridiculous posts from linkedIn.com

(Full transparency: a mod for this sub happens to work there, but that doesn't influence his moderation, or his laughter at a lot of the posts.)

founded 2 years ago
[–] inari@piefed.zip 4 points 3 hours ago (2 children)

I have no problem with them using search engines. There they can vet the results and choose answers from reliable sources. With an LLM, it's anybody's guess whether anything it pulled up is correct, and a less experienced doctor could be misled into making a dangerous mistake.

[–] Eheran@lemmy.world -3 points 3 hours ago (1 children)

When was the last time you used them? They can provide sources for pretty much everything they say, and those sources usually do contain the claim in question.

But even if not, even 2 years ago they were already useful because you got a second look, a different perspective. A medical professional can either know a little about everything or a lot about next to nothing. It should be really obvious how such a tool can help, even if it cannot reach expert level.

"Don't worry, when you ask it for sources it gives you some. Sometimes they are even real! And sometimes the real ones even say the thing they were supposed to have said from the AI!"

Fucking lunacy.