this post was submitted on 14 Jan 2026
-4 points (40.0% liked)

science

23578 readers

A community to post scientific articles, news, and civil discussion.

rule #1: be kind

founded 2 years ago
all 7 comments
[–] CombatWombatEsq@lemmy.world 12 points 1 day ago (2 children)

I think this one’s getting downvoted by people who haven’t read the article. The argument is that because LLMs respond like people with anxiety, depression, and PTSD, and because people with those conditions interact with LLMs, the LLMs are likely to intensify or exacerbate those symptoms in the humans who interact with them. The researchers weren’t trying to fix the LLMs through therapy.

[–] deliriousdreams@fedia.io 2 points 23 hours ago

I think people object to articles anthropomorphizing LLMs and generative AI.

People here are less likely to read articles where the headline does so.

[–] MonkderVierte@lemmy.zip 1 points 21 hours ago

Clickbaity title.

[–] gravitas_deficiency@sh.itjust.works 9 points 1 day ago (1 children)

This is so fucking dumb. All this is saying is that the researchers do not understand what LLMs actually are - that is, that they’re essentially just a bunch of Markov chains layered on top of each other. They are not sentient or sapient.

Stop fucking anthropomorphizing LLMs

[–] CombatWombatEsq@lemmy.world 12 points 1 day ago (1 children)

Have you considered the possibility that the kinds of researchers who publish in Nature may have taken the time to do some basic research into how LLMs work before commissioning a study, and that may not be what’s happening here?

[–] Quibblekrust@thelemmy.club 4 points 1 day ago

Who cares!? Commenting on Lemmy is about expressing IMPOTENT NERD RAGE! It has nothing to do with truth or facts. I simply want to be ANGRY and yell in all caps about SOMETHING at least twice a day! And I will do so without even reading the article!

SO SHUT UP AND LET ME RAGE!!!

-- Original commenter, probably