this post was submitted on 04 Feb 2026
502 points (98.3% liked)

Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

[–] jaredwhite@humansare.social 4 points 1 day ago (1 children)

You know, you're entitled to your opinions, but you are most certainly not entitled to your facts.

The term "hallucinate" as used by people in AI research: https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)

P.S. A layperson's objection to the term's usage in popular media is entirely warranted: it is unnecessary anthropomorphizing. In general, this tendency to ascribe the language of human mental states to the outputs of statistical computer models is deeply problematic. See: https://firstmonday.org/ojs/index.php/fm/article/view/14366

[–] pixxelkick@lemmy.world -1 points 1 day ago

Nothing you linked there contradicts what I said. It just expands on it in more detail.

LLMs are heuristic statistical token prediction engines.

"Hallucination" is shorthand for a set of phenomena that arise from the way the statistical prediction works: the model strings together sentences that are grammatically correct and sound right, but an LLM has no concept of right/wrong, only of the statistically likely next token given the prior ones.
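
To make that concrete, here's a toy sketch of "statistically likely next token given the prior." The probability table is made up purely for illustration; a real LLM derives these numbers from a trained network over a vocabulary of tens of thousands of tokens, but the key point is the same: nothing in the loop checks whether the output is true.

```python
import random

# Toy next-token table: "given these prior tokens, here are the plausible
# continuations and how often they follow." The numbers are invented for
# illustration only.
NEXT_TOKEN_PROBS = {
    ("The", "capital", "of", "Australia", "is"): {
        "Canberra": 0.55,   # statistically likely and happens to be true
        "Sydney": 0.40,     # statistically likely, fluent, and false
        "Melbourne": 0.05,
    },
}

def next_token(prior_tokens):
    """Sample the next token purely by likelihood given the prior tokens.

    There is no "is this true?" step anywhere in here: the sampler only
    knows which continuations tend to follow this context.
    """
    probs = NEXT_TOKEN_PROBS[tuple(prior_tokens)]
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

prompt = ["The", "capital", "of", "Australia", "is"]
# Roughly 4 runs in 10 produce a grammatical, confident, wrong sentence.
print(" ".join(prompt + [next_token(prompt)]))
```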

That wiki article goes into much more depth on the "why," but it does support my statement.

I dunno what it is with people linking wiki articles that support the other person's statement and then claiming it's the opposite.

... learn to read I guess? I dunno lol.