this post was submitted on 26 Jan 2026
75 points (92.1% liked)

Futurology

[–] pulsewidth@lemmy.world 0 points 1 day ago

Correct spelling is a fundamental component of words, and without words there is no vocabulary. Without understanding words, LLMs have no real understanding of vocabulary. They can certainly spew out things they've tokenized and weighted from ingested inputs, though - like when people trick them into believing false definitions simply by repeating them as correct, thereby manipulating (or poisoning) the weights. ChatGPT and other LLMs regularly fail to interpret common parts of vocabulary - e.g. idioms, word spellings, action-reaction consequences in a sentence. They're fancy autocomplete, filled with stolen (and occasionally licensed) data.
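The tokenization point can be sketched with a toy example (pure Python; the vocabulary and the greedy longest-match rule here are illustrative only - real models use learned BPE or unigram vocabularies, but the effect is the same: the model receives chunk IDs, not letters):

```python
# Toy greedy longest-match subword tokenizer - purely illustrative.
# The point: after tokenization the model never "sees" the letters
# inside a chunk like 'straw', which is why character-level tasks
# (spelling, letter counting) are genuinely hard for LLMs.

VOCAB = ["straw", "berry", "str", "aw", "ber", "ry"]

def tokenize(word):
    tokens = []
    i = 0
    while i < len(word):
        # take the longest vocabulary entry matching at position i,
        # falling back to the raw character if nothing matches
        match = max(
            (v for v in VOCAB if word.startswith(v, i)),
            key=len,
            default=word[i],
        )
        tokens.append(match)
        i += len(match)
    return tokens

print(tokenize("strawberry"))  # ['straw', 'berry'] - two opaque chunks, not ten letters
```

So a question like "how many r's are in strawberry" asks the model about characters it was never shown as characters - it can only pattern-match from training data about those chunks.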

Sure seems like the problem isn't that me or the other guy 'don't know how to use LLMs', but rather that they keep getting sold as something they're not.

Congrats though, you just used a 100-billion-dollar machine array to more or less output the exact content of a Wikipedia article - you really proved your point that it's very good when you know what to ask it, and us plebs are just dumb at questions, or something 👍 https://en.wikipedia.org/wiki/Platitude