this post was submitted on 16 Feb 2026
438 points (89.4% liked)

Programmer Humor


Not sure if this is the best community to post in; please let me know if there's a more appropriate one. AFAIK Aii@programming.dev is meant for news and articles only.

[–] zd9@lemmy.world 7 points 2 days ago (1 children)

LLMs are part of AI, so I think you may be confused. You can say anything is just fancy anything; that doesn't really hold any weight. You're familiar with autocomplete, so you try to contextualize LLMs within your narrow understanding of that tech. That's fine, but you should actually read up, because the whole field is really neat.

[–] AppleTea@lemmy.zip 1 points 2 days ago (1 children)

Literally, LLMs are extensions of the techniques developed for autocomplete in phones. There's a direct lineage. Same fundamental mathematics under the hood, but given a humongous scope.

[–] CeeBee_Eh@lemmy.world 0 points 2 days ago (1 children)

> LLMs are extensions of the techniques developed for autocomplete in phones. There's a direct lineage

That's not true.

[–] howrar@lemmy.ca 2 points 1 day ago (1 children)

How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
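To make the point concrete, here's a minimal, purely illustrative sketch of the "predict the next token" objective the comment describes. It's a toy bigram count model (the kind of thing early phone autocomplete used); LLM pre-training swaps the counting for a transformer trained by gradient descent, but the target is the same: the next token given what came before. All names here are made up for the example.

```python
# Toy next-token predictor: count which token follows each token in a corpus,
# then suggest the most frequent follower -- autocomplete in miniature.
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count next-token frequencies for each preceding token."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.split()
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most likely next token, like an autocomplete suggestion."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

corpus = [
    "the cat sat on the mat",
    "the cat ate the fish",
]
model = train_bigram(corpus)
print(predict_next(model, "the"))  # prints "cat" (follows "the" most often)
```

Generative pre-training optimizes exactly this conditional "what comes next" distribution, just over the whole preceding context instead of one previous word.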

[–] CeeBee_Eh@lemmy.world 1 points 1 day ago (1 children)

That's not what an LLM is. That's part of how it works, but it's not the whole process.

[–] howrar@lemmy.ca 2 points 1 day ago

They never claimed that it was the whole thing. Only that it was part of it.