this post was submitted on 26 Nov 2025
90 points (100.0% liked)

technology

24104 readers
457 users here now

On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

founded 5 years ago

Facing five lawsuits alleging wrongful deaths, OpenAI lobbed its first defense Tuesday, denying in a court filing that ChatGPT caused a teen’s suicide and instead arguing the teen violated terms that prohibit discussing suicide or self-harm with the chatbot.


“They abjectly ignore all of the damning facts we have put forward: how GPT-4o was rushed to market without full testing. That OpenAI twice changed its Model Spec to require ChatGPT to engage in self-harm discussions. That ChatGPT counseled Adam away from telling his parents about his suicidal ideation and actively helped him plan a ‘beautiful suicide,’” said Edelson, the family's lawyer. “And OpenAI and Sam Altman have no explanation for the last hours of Adam’s life, when ChatGPT gave him a pep talk and then offered to write a suicide note.”

[–] WhyEssEff@hexbear.net 16 points 1 day ago* (last edited 1 day ago) (1 children)

using your ToS as a defense despite your ToS objectively failing here is not a good precedent to set for the sanctity of your ToS catgirl-huh

[–] driving_crooner@lemmy.eco.br 3 points 23 hours ago (3 children)

Why are you answering yourself? Looks like a bot.

[–] WhyEssEff@hexbear.net 12 points 21 hours ago (1 children)

sorry for having consecutive thoughts won't happen again lea-sad

[–] FunkyStuff@hexbear.net 3 points 19 hours ago

You gotta become the aliens from Arrival and have all your thoughts for all events that will ever occur available ahead of time.

[–] FunkyStuff@hexbear.net 4 points 19 hours ago

She's literally our best poster

[–] TanneriusFromRome@hexbear.net 5 points 23 hours ago (1 children)

Nah, YSF is a long-time user, and has been investigated already

[–] TanneriusFromRome@hexbear.net 6 points 23 hours ago

also, why answer yourself, bot? sus