It's called RAG, and it's the only "right" way to get any accurate information out of an LLM. And even that is far from perfect.
You can use the retrieval part without an LLM; it's basically a search step (keyword or semantic). You still have to know what you're asking, so you have to study. Study without an imprecise LLM that can feed you false information that sounds plausible.
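For anyone curious what that retrieval half actually looks like, here's a minimal sketch. The toy corpus, the scoring function, and the prompt format are all made up for illustration; real pipelines use BM25 or embedding similarity and feed the prompt to an actual model, which is out of scope here.

```python
import math
import re
from collections import Counter

# Toy corpus standing in for a real document store.
DOCS = [
    "Metformin is a first-line treatment for type 2 diabetes.",
    "RAG retrieves relevant documents and injects them into the prompt.",
    "Keyword search ranks documents by how many query terms they contain.",
]

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z0-9]+", text.lower())

def score(query_terms: list[str], doc: str) -> float:
    # Simple term-frequency overlap with an IDF-style weight.
    # Real systems use BM25 or embeddings; the principle is the same.
    doc_terms = Counter(tokenize(doc))
    n_docs = len(DOCS)
    s = 0.0
    for term in set(query_terms):
        df = sum(1 for d in DOCS if term in tokenize(d))
        if df:
            s += doc_terms[term] * math.log(1 + n_docs / df)
    return s

def retrieve(query: str, k: int = 2) -> list[str]:
    terms = tokenize(query)
    ranked = sorted(DOCS, key=lambda d: score(terms, d), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    # The "augmented" part of RAG: retrieved text is pasted into the
    # prompt so the model answers from it instead of from memory alone.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("what does RAG do"))
```

Note that `retrieve()` alone is already useful without any LLM attached, which is the point above: the search step is where the accuracy comes from.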
There are other issues with current LLMs that make them problematic. Sure, you will catch on to those problems if you use them, but you still have to know more about the topic than they do.
They are a fun toy and fine for low-stakes knowledge (e.g. cooking recipes). But as a tool for serious work they are a rubber duck at best.
PS: What the person a couple of comments above said about sources is probably about web search. Even when an LLM reads the sources, it can easily misinterpret them, like how Apple removed their summaries because they were often just wrong.
Let's not move the goalposts. The OP's post is about med students using GPT to pass their exams successfully. As another comment put it, it's not about Karen using GPT to diagnose pops; it's about trained professionals using an AI tool to assist them.
And yet, all we get is a bunch of people spewing vague FUD and spitballing opinions as if they're proven facts, or as if AI has stopped evolving and the current limitations are never going to be surpassed.
The problem with trained professionals cheating their way through exams is that they tend to end up less trained and less professional in the process.