this post was submitted on 11 Feb 2026
27 points (96.6% liked)
Technology
you are viewing a single comment's thread
This is interesting. I think it suggests that the AI psychosis problem is bigger than we realize. So far all we know of is a handful of high-profile cases, without any real statistics on the issue. But Sharma seems to know enough to be disturbed by its scale. Anecdotally this matches my own experience, as I think I've come across at least one person suffering from it.
I suspect that AI psychosis will become a widely acknowledged problem within the next decade, similar to how it's now widely acknowledged that Instagram and TikTok can trigger eating disorders.
It's kind of an obvious outcome if you combine the concept of automation bias with the hallucinations produced by LLMs. They really are perfect misinformation and confusion machines. Even calling these failures "hallucinations" already attributes more intelligence and agency to LLMs than they deserve.