this post was submitted on 15 Jan 2026
43 points (64.8% liked)
you are viewing a single comment's thread
That's not what I meant. When you say "it makes stuff up", you're describing how the model statistically predicts the expected output.
You know that. I know that.
That's the asterisk: the more in-depth explanation that a lot of people won't bother digging far enough to learn about. Someone who doesn't read that far into it can see that same phrase and assume we're discussing what type of personality LLMs exhibit, that they are "liars". But they'd be wrong. Neither of us is attributing intention to the model or discussing what kind of "person" it is; in reality we're referring to the fact that it's "just" a really complex probability engine that can't "know" anything.
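(To make "statistically predicts the expected output" concrete, here's a minimal toy sketch of a single next-token step. The vocabulary and probabilities are made up for illustration; a real model computes a distribution like this over tens of thousands of tokens, but the point is the same: nothing in the process checks whether the output is *true*.)

```python
import random

# Toy "model": just a probability distribution over a tiny
# made-up vocabulary (hypothetical numbers, not from any real LLM).
next_token_probs = {
    "Paris": 0.62,
    "Lyon": 0.21,
    "Berlin": 0.12,
    "pancake": 0.05,
}

# Sampling picks a token in proportion to its probability.
# There is no fact-checking step anywhere; "making stuff up"
# is just a draw from this same process that happens to be wrong.
tokens, weights = zip(*next_token_probs.items())
choice = random.choices(tokens, weights=weights, k=1)[0]
print(f'"The capital of France is {choice}"')
```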
No matter what word we use, if it's pre-existing it will come with pre-existing meanings that are kinda right but also not quite, which means everyone involved in a discussion has to know things that won't be explained every time the term or phrase is used.
The language isn't "inaccurate" between you and me because you and I know the technical definition, and therefore what aspect of LLMs is being discussed.
Terminology that is "accurate" without this context does not and cannot exist, short of coming up with completely new words.
You could say "the model's output was inaccurate" or something like that, but it would be much more stilted.