submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[-] CaptainEffort@sh.itjust.works 1 point 11 months ago

child sex abuse material is only illegal when children were abused in making it

This is literally why it’s illegal though. Because children are abused, permanently traumatized, or even killed in its making. Not because it disgusts us.

There are loads of things that make me want to be sick, but unless they actively hurt someone they shouldn’t be illegal.

this post was submitted on 08 Dec 2023
392 points (93.2% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

founded 1 year ago