submitted 1 year ago* (last edited 1 year ago) by corb3t@lemmy.world to c/technology@lemmy.ml

Not a good look for Mastodon - what can be done to automate the removal of CSAM?

Dubious_Fart@lemmy.ml 3 points 1 year ago

No, because then you end up with a case like the guy who lost 15+ years of emails, his phone number, all his photos, his contacts, and everything else tied to his Google account, because Google's automated detection flagged a photo of his naked baby that he sent to the doctor during COVID, at the doctor's request, about a rash in the baby's diaper area. No amount of common sense would stay their hand or reverse the automated judgment that this man was a child pornographer, and they even called the police on him.
