Google Search AI hallucinations push Google to hire "AI Answers Quality" engineers
(www.bleepingcomputer.com)
This is how you know these things are fucking worthless: the people in charge of them think they can combat this by putting anti-hallucination clauses in the prompt, as if the AI could tell when it was hallucinating. It already classified the output as plausible by generating it!
They try to do security the same way, by adding "pwease dont use dangerous shell commands" to the system prompt.
Security researchers have dubbed this "Prompt Begging."
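The contrast the comments above are getting at can be sketched in a few lines. This is a hypothetical illustration (the prompt text, allowlist, and function names are all made up): "prompt begging" asks the model to behave, while actual enforcement validates the model's output in code, outside the model, before anything runs.

```python
import shlex

# "Prompt begging": a plea in the system prompt. Nothing prevents the
# model from emitting a dangerous command anyway.
PROMPT_BEGGING = "Please don't use dangerous shell commands."

# Actual enforcement: an allowlist checked in code before execution.
# (Hypothetical policy; real systems would also sandbox the process.)
ALLOWED_COMMANDS = {"ls", "cat", "grep", "echo"}

def is_command_allowed(command_line: str) -> bool:
    """Return True only if the command's binary is on the allowlist."""
    tokens = shlex.split(command_line)
    return bool(tokens) and tokens[0] in ALLOWED_COMMANDS

print(is_command_allowed("ls -la"))    # True
print(is_command_allowed("rm -rf /"))  # False
```

The point is that the check happens regardless of what the model was told or what it "intended"; the prompt clause is a suggestion, the allowlist is a guarantee.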
It's been over a hundred years since this quote, and people still think computers are magic.