this post was submitted on 25 Feb 2026

Technology

[–] Skullgrid@lemmy.world 2 points 5 days ago (1 children)

In this paper, we conduct a systematic investigation into hallucination-associated neurons (H-Neurons)

no, they have to be the nodes responsible for the creation of hallucinations

[–] XLE@piefed.social 8 points 5 days ago* (last edited 5 days ago) (1 children)

And a "hallucination" is also an inaccurate humanization of the actual meaning: "statistical relationship that we AI folks don't like."

"Hallucinations" even include accurate data.

It is a trash marketing buzzword.

[–] Skullgrid@lemmy.world 4 points 5 days ago (3 children)

Did you know that there is no sex going on in a breeder reactor?

https://en.wikipedia.org/wiki/Breeder_reactor

They're analogies to help us communicate ideas.

[–] snooggums@piefed.world 2 points 5 days ago (1 children)

A breeder reactor creates something, which is like the outcome of breeding. That name fits.

[–] Skullgrid@lemmy.world 1 points 5 days ago (2 children)

a hallucination is seeing something that's not there, which also fits.

[–] snooggums@piefed.world 1 points 5 days ago

Hallucination requires perception. LLMs are just statistical models and do not perceive anything.

It was a cute name early on; now it is used to deflect when the output is just plain wrong.

[–] XLE@piefed.social 1 points 5 days ago

In AI, a "hallucination" is just as much "there" as a non-"hallucination." It's a way for scientists to stomp their foot and say that the wrong output is the computer's fault and not a natural consequence of how LLMs work.
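The point above can be illustrated with a toy sampler (hypothetical logits for a made-up prompt, not a real model): the model scores candidate next tokens, turns the scores into probabilities with a softmax, and samples. The same mechanism produces both the factually right and the factually wrong continuation; nothing in the process marks a token as a "hallucination".

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: v / total for tok, v in exps.items()}

# Hypothetical next-token scores after a prompt like
# "The capital of Australia is". The wrong answer gets the
# higher score purely from statistical association in the
# training data; the sampler has no notion of truth.
logits = {"Sydney": 2.1, "Canberra": 1.4, "Melbourne": 0.3}

probs = softmax(logits)

# Sample a token: right and wrong answers travel the exact
# same code path, weighted only by probability.
random.seed(0)
tokens = list(probs)
weights = [probs[t] for t in tokens]
sample = random.choices(tokens, weights=weights, k=1)[0]
```

With these made-up scores, the incorrect "Sydney" is simply the most probable token; calling that outcome a hallucination describes our judgment of the output, not anything different happening inside the model.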

[–] athairmor@lemmy.world 1 points 4 days ago

Nuclear energy companies aren’t trying to make people think that their reactors reproduce.

AI companies are trying to make people think that their software is intelligent.

The context matters.

[–] Bronzebeard@lemmy.zip 1 points 5 days ago

I don't think anyone is confusing radiation propagation with being alive though.

The issue is that these things "communicate" with us, so granting them even more leeway to seem like they're thinking (they're not) only further muddies how people perceive them.