this post was submitted on 28 Aug 2025
610 points (99.8% liked)

Technology

[–] hotdogcharmer@lemmy.world 22 points 2 days ago (12 children)

Sam Altman belongs in prison. His machine encouraged and guided a child to kill themselves. His machine actively stopped that child from seeking outside help. Sam Altman belongs in prison. Sam Altman does not need another $20,000,000,000,000. He needs to go through the legal system, be sentenced, and be sent to prison, because his machine pushed a child to suicide.

[–] Electricd@lemmybefree.net -4 points 2 days ago* (last edited 2 days ago) (5 children)

Uses a tool the bad way despite it being public knowledge that it's bad for mental health

Was predisposed to mental health problems

Died, partly because they talked to a chatbot

"It's the chatbot's, creator fault", despite the chatbot never being made to cause those problems, and efforts being made to fix those problems

...

Yea nah, it's just anti-AI people doing their thing again and not being objective.

Get a better fight, such as hating on pharmaceutical companies pushing the use of extremely addictive substances for profit, despite knowing the immense risk they pose to consumers, and financing misleading ads to make them look safe.

If Sam Altman belongs in prison, it would either be:

  • Because he's destroying the planet (ecologically)
  • Because he stole lots of content to train his models
[–] Electricd@lemmybefree.net 0 points 2 days ago* (last edited 2 days ago) (2 children)

If you misuse something at this point, then it's not the thing's fault

[–] DupaCycki@lemmy.world 3 points 2 days ago (1 children)

Some things are made easy to misuse on purpose, and by design are accessible to the people most likely to misuse them. All this money, this supposedly cutting-edge technology, and reporting to the police, but they aren't able to tell when a child is at risk and report that as well?

Smells like bullshit to me. More like they don't care. I'm not so sure children should even be allowed to use chatbots in the first place, or only versions specifically trained for interactions with children. But of course, banning children from accessing YouTube and Wikipedia is a much more pressing concern.

[–] Electricd@lemmybefree.net 1 points 2 days ago* (last edited 2 days ago)

They definitely prefer to spend their money on development, rather than adding safeguards

I don't believe people misusing ChatGPT helps them in any way; it's just that adding protections has a cost

but they aren’t able to tell when a child is at risk and report it as well?

Maybe the police actually sort and filter reports manually, but don't want to bother with mental health cases? You know how the USA works. I don't believe OpenAI will go too far; they'll just report indiscriminately.

It might even get reported, for all I know; sometimes I just like to see the reaction of LLMs when I say I'll commit horrible stuff like school shootings or terrorism. The NSA will just feed it into their mass-spying algorithm to check the most important profiles, and that will be it.

The war on drugs is so much more important than mental health detection, y'know. It sells better.
