Programmer Humor
LLMs are part of AI, so I think you may be confused. You can call anything "just fancy" anything else; that framing doesn't really carry any weight. You're familiar with autocomplete, so you're contextualizing LLMs within that narrow understanding of the tech. That's fine, but you should actually read up on it, because the whole field is really neat.
LLMs are literally extensions of the techniques developed for autocomplete on phones; there's a direct lineage. It's the same fundamental idea under the hood, just at a vastly larger scale.
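If it helps, here's roughly what phone-style autocomplete boils down to: a table of which word tends to follow which. A minimal sketch (the corpus, the `suggest` helper, and the plain bigram approach are my own toy illustration, not any actual keyboard's code):

```python
from collections import Counter, defaultdict

# Toy bigram "autocomplete": count which word follows which,
# then suggest the most frequent successor. Phone keyboards
# historically used n-gram models in this spirit; the corpus
# here is invented for illustration.
corpus = "the cat sat on the mat the cat ate the fish".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def suggest(word: str) -> str | None:
    """Return the most likely next word, or None if unseen."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # -> 'cat' (the most frequent successor of 'the')
```

An LLM swaps the count table for a neural network and a far longer context window, but the interface is the same: context in, distribution over next tokens out.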
That's not true.
How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
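A toy illustration of that per-token objective (the vocabulary and probabilities here are made up; a real model computes this over billions of positions):

```python
import math

# Sketch of the generative pre-training objective: given a context,
# the model outputs a probability distribution over the vocabulary,
# and training minimizes the negative log-probability of the token
# that actually came next in the text.
vocab = ["cat", "mat", "fish", "sat"]

# Pretend the model, after seeing "the cat sat on the", predicts:
predicted_probs = {"cat": 0.1, "mat": 0.6, "fish": 0.2, "sat": 0.1}

actual_next_token = "mat"

# Cross-entropy / negative log-likelihood for this single position.
loss = -math.log(predicted_probs[actual_next_token])
print(f"loss = {loss:.3f}")  # lower when the model put more mass on 'mat'

# Pre-training repeats this over every position of a huge corpus,
# adjusting the network's weights to push these losses down.
```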
That's not what an LLM is. That's part of how it works, but it's not the whole process.
They never claimed it was the whole thing, only that it was part of it.