Before we had woke, we had DEI. Before we had DEI, we had PC culture.

Before we had doxxing, we had invasion of privacy.

Has the term AI just repurposed the concept of... PageRank?

Look, I am but a layman, but I swear to fucking christ AI is just a glorified search engine, and they have made Google Search unusable on purpose.

[–] JoeByeThen@hexbear.net 7 points 1 month ago

If it's on the search page, the AI is just summarizing the shitty search results. doomer

[–] jack@hexbear.net 15 points 1 month ago

No, it can absolutely cook up novel hallucinations

[–] JoeByeThen@hexbear.net 8 points 1 month ago

Sure. But typically that's not what's happening. Gemini on the Google search page is literally just being fed the top X search results and told to summarize them in the context of the search query. It may come to some wild conclusions in the process, but more often than not it's platforming the search-result information above its own training.
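
That "feed it the top results, tell it to summarize" setup is easy to picture in code. A rough sketch of the pattern in Python; the helper name and prompt wording are invented for illustration, this is not Google's actual pipeline:

```python
# Illustrative sketch only: the function and prompt wording are made up,
# not Google's real pipeline.

def build_summary_prompt(query: str, top_results: list[dict]) -> str:
    """Stuff the top N search results into the model's context window."""
    snippets = "\n\n".join(
        f"[{i + 1}] {r['title']}\n{r['snippet']}"
        for i, r in enumerate(top_results)
    )
    return (
        "Summarize the following search results in the context of the "
        f"query {query!r}:\n\n{snippets}"
    )

prompt = build_summary_prompt(
    "is AI just pagerank",
    [{"title": "Some SEO slop", "snippet": "Ten AI tools you need right now"}],
)
# Whatever lands in `snippets` dominates the answer, which is why garbage
# results come out the other side as a confident summary.
```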

[–] invalidusernamelol@hexbear.net 1 points 1 month ago (last edited 1 month ago)

The results are also cached and generated with a smaller model for speed and cost reduction, so it definitely does hallucinate.

[–] JoeByeThen@hexbear.net 1 points 1 month ago

jesse-wtf

The very act of summarizing is technically a hallucination; that's where the "creativity" comes from. Is it possibly doing so? Yes. Is it much more likely that the inaccuracy is coming from the shitty results that we know are being elevated above its own training, as we've seen literally since they started rolling it out? Probably. Leveraging this crap is basically modern-day SEO.

[–] invalidusernamelol@hexbear.net 1 points 1 month ago (last edited 1 month ago)

A 'hallucination' in my mind is specifically when it fabricates direct lies that are contradicted by evidence, which is usually in the same search results.

E.g. confidently giving you incorrect ISBN or DOI numbers for things, which it does a lot. It also just makes up calls to real, existing, documented APIs because it assumes patterns that don't exist for that specific project.
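
(For anyone who wants to vet those: ISBN-13 carries a built-in checksum, digits weighted 1,3,1,3,... summing to a multiple of 10, so a lot of fabricated numbers fail even offline validation. Quick sketch; the function name is mine, and a passing checksum still doesn't prove the number belongs to the book it was cited for:)

```python
def isbn13_checksum_ok(isbn: str) -> bool:
    """ISBN-13 check: digits weighted 1,3,1,3,... must sum to a multiple of 10."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    return sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits)) % 10 == 0

print(isbn13_checksum_ok("978-0-306-40615-7"))  # True: valid check digit
print(isbn13_checksum_ok("978-0-306-40615-9"))  # False: the kind of miss a fabricated number makes
```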

The whole 'slop squatting' exploit is built around those types of hallucinations. People have noticed fake npm packages being imported repeatedly in LLM-generated codebases and then registered those names so they would actually install...
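
One cheap defense if you're vetting LLM-suggested dependencies: npm's public registry returns a 404 for names that were never published, and includes a first-publish date for ones that were. A sketch using only the standard library; the helper is mine, and the date check is just a heuristic (a freshly registered name with no history is exactly what a squatter creates):

```python
import json
import urllib.error
import urllib.request

def npm_package_created(name: str) -> str | None:
    """Return the package's first-publish date, or None if it was never published."""
    url = f"https://registry.npmjs.org/{name}"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            meta = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None  # never published: a free squatting target
        raise
    return meta.get("time", {}).get("created")

# A hallucinated import that returns None here was up for grabs; one with a
# suspiciously recent creation date may already have been squatted.
print(npm_package_created("left-pad"))  # real package, prints its creation date
```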