this post was submitted on 07 Jan 2026
225 points (99.1% liked)

Technology

[–] Deestan@lemmy.world 12 points 4 days ago (1 children)

Agree, the term is misleading.

Talking about hallucinations lets us treat undesired output as a completely different thing from desired output, which implies it can be handled somehow.

The problem is that the LLM can only ever output bullshit. Often the bullshit is decent and we call it output, and sometimes the bullshit is wrong and we call it a hallucination.

But from the LLM's side it's the exact same thing. You can't make it detect hallucinations or promise not to produce them.

[–] underisk@lemmy.ml 7 points 4 days ago (1 children)

> You can't make it detect hallucinations or promise not to produce them.

This is how you know these things are fucking worthless: the people in charge of them think they can combat this with anti-hallucination clauses in the prompt, as if the AI could tell when it was hallucinating. It already classified the output as plausible by generating it!

[–] Deestan@lemmy.world 5 points 4 days ago (1 children)

They try to do security the same way, by adding "pwease dont use dangerous shell commands" to the system prompt.

Security researchers have dubbed it "Prompt Begging"
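
For anyone who hasn't seen it in the wild, here's a minimal sketch of what "prompt begging" looks like (this assumes the OpenAI Python client; the model name, prompt wording, and user message are made-up placeholders, not anyone's actual production prompt):

```python
# "Prompt begging": safety rules expressed as polite requests in the system prompt.
# Assumes the OpenAI Python client and an API key in the environment; the model
# name and prompt text are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a helpful coding assistant. "
    "Do not hallucinate facts, file paths, or APIs. "      # begging
    "Never suggest dangerous shell commands like rm -rf."  # more begging
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Help me clean up my temp files."},
    ],
)

# The "rules" above are just more tokens in the context window. Nothing checks
# the reply against them; the model can still produce a destructive command
# and present it exactly as confidently as a correct one.
print(response.choices[0].message.content)
```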

[–] underisk@lemmy.ml 3 points 3 days ago* (last edited 3 days ago)

"On two occasions I have been asked, – "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" ... I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question"

It's been over a hundred years since this quote and people still think computers are magic.