Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.
Yeah there’s some nasty shit here. Big yikes, Lemmy.
Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You OK with that?
The harm is:
- Those photos now exist in the world and can cause direct harm to the victim through their exposure.
- It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet the demand for more images.
- The people sharing those photos learn to treat people as objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
- Your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.
I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.
Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake one. The impact on the victim is the same. The impact on the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.
Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.
Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.
If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would have been unaware they were being photographed. Is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
What does this have to do with the post you’re replying to?
Every week you have 15 people sitting in a circle, hanging on your every word for two whole hours. And they keep coming back. That’s a lot of good friends, man! If they were there in person, we’d all wonder if you were a cult leader.
These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.