this post was submitted on 23 Feb 2026
30 points (100.0% liked)
Chapotraphouse
No, it can absolutely cook up novel hallucinations
Sure. But typically that's not what's happening. Gemini on the Google search page is literally just being fed the Top X search results and being told to summarize the results in the context of the search query. It may come to some wild conclusions in the process but, more often than not, it's platforming the search result information above its own training.
The results are also cached and served by a smaller model for speed/cost reduction, so it definitely does hallucinate.
The very act of summarizing is technically a hallucination; that's where the "creativity" comes from. Is it possibly doing so? Yes. Is it much more likely that the inaccuracy is coming from the shitty results that we know are being elevated above its own training, as we've seen literally since they started rolling it out? Probably. Leveraging this crap is basically modern day SEO.
A 'hallucination' in my mind is specifically when it fabricates direct lies that are contradicted by evidence (that's usually in the same search results).
E.g. confidently giving you incorrect ISBN or DOI numbers for things. Which it does a lot. It also just makes up endpoints for real, existing, documented APIs because it assumes patterns that don't exist in that specific project.
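Fabricated ISBNs are actually cheap to catch some of the time, because ISBN-13 has a built-in check digit. A quick sanity check like the sketch below (just the standard 1/3-weighted checksum, nothing official) won't catch a wrong-but-valid ISBN, but it flags a lot of made-up ones. DOIs have no check digit, so this trick doesn't transfer to them.

```python
def isbn13_ok(isbn: str) -> bool:
    """Validate the ISBN-13 check digit (digits weighted 1, 3, 1, 3, ...
    must sum to a multiple of 10). Hyphens/spaces are ignored."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    return sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits)) % 10 == 0

print(isbn13_ok("978-0-306-40615-7"))  # True: checksum holds
print(isbn13_ok("978-0-306-40615-8"))  # False: last digit is wrong
```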
The whole 'slop squatting' exploit is based around those types of hallucinations. People have noticed the same fake npm packages being imported over and over in LLM-generated code, then registered those names so they would actually install...