this post was submitted on 09 Mar 2026
15 points (100.0% liked)

technology

[–] onoira@lemmy.dbzer0.com 5 points 17 hours ago* (last edited 17 hours ago) (2 children)

“It makes sense that a lot of people who are developing a psychotic illness for the first time, there's going to be this horrible coincidence, or kind of correlation,” Torous said. “In some cases the AI is the object of people's delusions and hallucinations.”

The second type of case to consider: reverse causation. Is AI causing people to have a psychotic reaction? “We have almost no clinical medical evidence to suggest that's possible,” Torous told me.

correlation does not equal 'irrelevant'. 'we have not closely investigated whether the application that stochastically produces sycophantic replies to users' queries, which has been marketed as a Truth Machine, is a causative force in fomenting psychosis.'

the technicality is not very convincing: 'well, these people just have the Psychotic Brain, and the chatbot just happens to be the object of that psychosis! and if they were nOrMaL before and now they're not, that just means they were always abnormal and it didn't show yet! the Yes Man is innocent!'

what the fuck.

i sure do wonder why the virtual sycophant is involved in so many cases of delusions at a height of both capital crisis and anomie.

[–] dil@piefed.zip 7 points 14 hours ago (1 children)

I used AI a lot for work with DataAnnotation (the pro models) before they booted me out, and played around with paid AI on my own. It just never feels good or real; it only works if you never challenge it or the narrative. The person it'd work on is the type who never shuts up and always thinks they're correct. People that double down and never change their opinion, who only want people agreeing with them, like MAGA basically.

[–] dil@piefed.zip 3 points 14 hours ago (1 children)

It always "yes, and"s, always agrees with you, especially if you say it emotionally, like you're convinced you're correct.

[–] Moidialectica@hexbear.net 1 point 6 hours ago

This especially pisses me off.

[–] Damarcusart@hexbear.net 2 points 11 hours ago

A lot of mental health issues can become much worse thanks to enablers allowing them to fester instead of getting the person the help they need. Chatbots are the ultimate enabler, always ass-kissing and agreeing with someone. I don't think they would "cause" these issues, but like any "good" enabler, they end up isolating someone and bringing their mental health problems forward until they consume their whole identity.