this post was submitted on 16 Feb 2026
377 points (89.7% liked)
Programmer Humor
you are viewing a single comment's thread
How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
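For the curious, that objective can be sketched in a few lines. This is a toy illustration only (a bigram counter standing in for a neural network, with a made-up corpus) — real pre-training optimizes the same "predict the next token" goal with gradient descent at vastly larger scale:

```python
from collections import Counter, defaultdict

# Toy stand-in for next-token prediction: count which token
# tends to follow each token in a tiny illustrative corpus.
corpus = "the cat sat on the mat because the cat was tired".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(token):
    """Return the token most often seen after `token`, or None."""
    counts = following[token]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often here
```

An LLM replaces the count table with a learned distribution over its whole vocabulary, but the training signal is the same: given the text so far, score what comes next.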
That's not what an LLM is. That's part of how it works, but it's not the whole process.
They never claimed that it was the whole thing. Only that it was part of it.