Not the best news in this report. We need to find ways to do more.

[–] etrotta@kbin.social 5 points 1 year ago (1 children)

It is not "in the whole fediverse"; it is out of approximately 325,000 posts analyzed over a two-day period.
And that is just for known images that matched a hash.

Quoting the entire paragraph:

Out of approximately 325,000 posts analyzed over a two day period, we detected 112 instances of known CSAM, as well as 554 instances of content identified as sexually explicit with highest confidence by Google SafeSearch in posts that also matched hashtags or keywords commonly used by child exploitation communities. We also found 713 uses of the top 20 CSAM-related hashtags on the Fediverse on posts containing media, as well as 1,217 posts containing no media (the text content of which primarily related to off-site CSAM trading or grooming of minors). From post metadata, we observed the presence of emerging content categories including Computer-Generated CSAM (CG-CSAM) as well as Self-Generated CSAM (SG-CSAM).
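
To illustrate what "matched a hash" means in practice, here is a minimal sketch of hash-set matching against a list of previously identified images. The `KNOWN_HASHES` set and `is_known_image` function are hypothetical stand-ins, not the report's actual tooling; production pipelines use perceptual hashes such as PhotoDNA, which tolerate resizing and re-encoding, rather than the exact-match cryptographic hash used below.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of previously
# identified material. Real scanners use perceptual hashes
# (e.g. PhotoDNA) that survive re-encoding; SHA-256 here only
# catches byte-identical files, which keeps the sketch simple.
KNOWN_HASHES: set[str] = {
    # entries would come from an industry-maintained hash list
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this exact file appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

The relevant design point: this kind of matching can only flag content that has already been identified and hashed, which is why the 112 figure is a floor rather than an estimate of everything present.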

[–] Rivalarrival@lemmy.today 6 points 1 year ago

How are the authors distinguishing between posts made by actual pedophiles and posts by law enforcement agencies known to be operating honeypots?