You can set up Cloudflare as your CDN and turn on its CSAM detection. It will automatically block links to known CSAM based on the managed global CSAM hash lists.
If you want something in addition to that, you can use db0's plugin, which adds a similar capability.
From your description it's unclear: does this also block CSAM that's physically stored on your infrastructure, or just links to external content?
Cloudflare is currently attempting to block LLM bots and doing a shit job at it. I'm guessing that any CSAM blocking would be incomplete at best.
What happens if some "gets through", or if non-CSAM content is wrongly blocked? Both materially (as in, what actually happens) and legally, since I doubt that Cloudflare would ever assume liability for content on your infrastructure.
Edit: I thought I'd also point out that this is not the only type of content that could land you in a legal black hole. For example, a post that breaches a court ruling, say a suppression order issued by a court in Melbourne, Australia. Or defamatory content, etc.
It blocks access to the link on your site. For example, people have uploaded CSAM to programming.dev. The links are immediately blocked (no one can get to them except an instance owner looking directly in the pictrs database), and then in the Cloudflare dashboard you get a notification with a link to the page where it occurred.
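Conceptually, the effect is like a small middleware sitting in front of the image server: the file stays in storage, but the URL returns an error to everyone. Here's a minimal sketch of that idea; the blocked-path set, URL prefix, and handler are hypothetical illustrations, not Cloudflare's or Lemmy's actual implementation.

```python
# Illustrative sketch only: a minimal WSGI middleware that returns HTTP 451
# for image paths that have been flagged. The file remains in pictrs storage,
# but no visitor can reach it through this URL. BLOCKED_PATHS and the
# /pictrs/image/ prefix are assumptions for illustration.
from wsgiref.simple_server import make_server

BLOCKED_PATHS = {
    "/pictrs/image/example-flagged-id.png",  # hypothetical flagged upload
}

def block_flagged(app):
    def middleware(environ, start_response):
        if environ.get("PATH_INFO") in BLOCKED_PATHS:
            # 451 Unavailable For Legal Reasons: content exists on disk,
            # but the link is dead for everyone except the admin who
            # inspects the backing store directly.
            start_response("451 Unavailable For Legal Reasons",
                           [("Content-Type", "text/plain")])
            return [b"Blocked"]
        return app(environ, start_response)
    return middleware

def upstream(environ, start_response):
    # Stand-in for the real image-serving application.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"image bytes would be served here"]

if __name__ == "__main__":
    make_server("127.0.0.1", 8080, block_flagged(upstream)).serve_forever()
```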
CSAM blocking works from a shared hash list of known material, agreed upon and maintained by a consortium of large tech companies. If novel CSAM is uploaded to your instance, then yes, it will fail to catch that; db0's plugin might catch it, though. LLM blocking doesn't have the benefit of a bunch of multi-billion-dollar companies trying to stop it (in fact, they're doing the exact opposite), so yes, LLM blocking sucks.
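For anyone curious what matching against a hash list looks like in principle, here's a minimal sketch assuming a plain text file of SHA-256 digests, one per line. That format is an assumption for illustration: real CSAM lists use proprietary fuzzy/perceptual hashes (e.g. PhotoDNA) distributed under strict access controls, so near-duplicates also match, whereas an exact cryptographic hash like this only catches byte-identical files.

```python
# Minimal sketch of hash-list matching against a hypothetical local list.
# Real systems use restricted perceptual-hash databases, not SHA-256 files.
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set[str]:
    # One lowercase hex digest per line; blank lines ignored.
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def is_known_match(file_path: str, hash_list: set[str]) -> bool:
    # Exact match only: any single changed byte produces a different digest.
    digest = hashlib.sha256(Path(file_path).read_bytes()).hexdigest()
    return digest in hash_list

# Example usage (paths and handler are hypothetical):
# blocked = load_hash_list("known_hashes.txt")
# if is_known_match("upload.png", blocked):
#     quarantine_and_notify()
```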
For your edit: I would expect you to have an email set up where you'd receive such notices. Pretty much globally, you are not responsible for this kind of content until you have been notified, so pay attention to your email.