This is entirely the fault of the IWF and Microsoft, who build "exclusive" proprietary CSAM-detection software and then license it only to big tech companies.

[-] BootlegHermit@kbin.social 1 points 11 months ago

To me it seems like a push towards the whole "own nothing" idea. Whether it's something like CSAM detection or even mundane SaaS, things are slowly shifting away from the end user having control over their "own" devices.

I'm torn, because on the one hand, pedophiles and child abusers deserve the severest of consequences in my opinion; on the other hand, I also think that people should be able to do and/or say whatever they want so long as it's not causing actual harm to another.

[-] elscallr@kbin.social 1 points 11 months ago

It's much more likely a matter of preventing their detection technology from falling into the hands of people who would want to circumvent it.
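
For context, the detection technology in question (Microsoft's PhotoDNA, with hash lists curated by the IWF) works by comparing perceptual hashes of images against a list of hashes of known material. The real algorithm is deliberately unpublished, so the sketch below uses a toy average-hash; the hash function, the blocklist value, and the threshold are all illustrative assumptions, not the actual system:

# A minimal sketch of perceptual-hash matching, the general technique behind
# tools like PhotoDNA. The toy average-hash, the example blocklist entry, and
# the distance threshold here are assumptions for illustration only; the real
# algorithm and parameters are proprietary.

from typing import List

def average_hash(pixels: List[List[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of known images (an illustrative value).
BLOCKLIST = {0b1011001011110000}

def is_match(pixels: List[List[int]], threshold: int = 2) -> bool:
    """Flag an image whose hash is within `threshold` bits of a listed hash."""
    h = average_hash(pixels)
    return any(hamming(h, listed) <= threshold for listed in BLOCKLIST)

if __name__ == "__main__":
    img = [[10, 200, 30, 220]] * 4  # a 4x4 grayscale "image"
    print(is_match(img))  # False for this example input

The sketch also shows why secrecy is part of the design: an adversary who knows the exact hash function, blocklist, and threshold can perturb an image just enough to push its hash past the threshold while leaving it visually unchanged, which is one argument for restricting who gets access.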
