[-] kardum@lemmy.blahaj.zone 0 points 1 year ago

I thought about this some more and I can feel a lot more sympathy for your decision now.

It must be horrible to get a user report about CSAM and then see a picture that could really be CSAM at first glance.

Even if every user report were wrong from now until infinity, that initial suspicion of CSAM, triggered by the false report, probably makes moderating a soul-crushing activity.

It is great that admins from other instances are willing to handle these horrific reports just to give their users a bigger platform, but this service is not something that can be taken for granted.

I'm sorry for coming across as ignorant; I just hadn't really considered your perspective.

[-] kardum@lemmy.blahaj.zone 1 points 1 year ago

Context always matters. I always check whether adult material has actually been made by consenting adults. I would feel sick if not enough information were provided for that, but fortunately I have never encountered CSAM.

I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least they were; I'm not on the platforms where they're popular).

Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.

That's why I didn't use Pornhub, for example, until every user had to verify themselves before posting. Before that I only read erotica or looked at suggestive drawings.

I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that could, without context, be CSAM could make this volunteer work very mentally taxing. That's just how NSFW works tho.

Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.

[-] kardum@lemmy.blahaj.zone 2 points 1 year ago* (last edited 1 year ago)

i had no problem distinguishing the models in the community from children.

maybe it's more difficult in some cases without looking up the model's onlyfans link or sth similar somewhere in the post, but that's just human anatomy.

that's why the guy at the gas station asks for my ID card: it is not always super clear. but apparently it's clear enough for reddit admins and PR people from ad companies.

i agree that playing into the innocent-baby aspect is probably not great for sexual morals, and i wouldn't recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.

i get it tho, if this instance wants to be sexually pure and removed from evil carnal desires. that's kind of cool too, for sure.

[-] kardum@lemmy.blahaj.zone 4 points 1 year ago* (last edited 1 year ago)

the same community (adorableporn) is also on reddit btw with 2.2m subscribers.

i have no grand moral opinion on this type of content. for me it is the same as femboy content, for example, where people also push for a youthful, girly aesthetic.

as long as the content is made by consenting verified adults, i don't care.

it's like adults cosplaying in japanese school uniforms or calling their partner "mommy" or "daddy".

probably not the best move in terms of sexual morals, for sure. in the grand scheme of things tho, this is just how people express their sexuality, i guess.
