How can one configure their Lemmy instance to reject illegal content? I mean the truly bad stuff, not just NSFW material. There are online services that will check images for you, but I'm unsure how they could integrate with Lemmy.

As Lemmy gets more popular, I'm worried nefarious users will post illegal content that I could be held liable for.

snowe@programming.dev 9 points 1 day ago

You can set up Cloudflare as your CDN and turn on its CSAM scanning tool. It will automatically block links to known CSAM using managed, shared hash lists.

If you want something in addition to that, you can use db0's plugin, which adds a similar capability.
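
As a rough illustration of the general shape such a plugin takes (this is not db0's actual code; the endpoint path, the pict-rs address, and the is_flagged() check are all hypothetical placeholders), here is a minimal sketch of a scanning proxy that sits in front of pict-rs and drops flagged uploads before they reach storage:

```python
# Hypothetical sketch of a scanning proxy in front of pict-rs.
# Assumes Flask and requests are installed; all names are placeholders.
import hashlib

import requests
from flask import Flask, Response, request

app = Flask(__name__)

PICTRS_UPSTREAM = "http://127.0.0.1:8080/image"  # assumed pict-rs upload URL
KNOWN_BAD_SHA256: set[str] = set()  # would be populated from a shared list

def is_flagged(data: bytes) -> bool:
    # Exact-digest lookup; real scanners use fuzzy/perceptual hashes
    # or ML classifiers so re-encoded copies are still caught.
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_SHA256

@app.post("/image")
def upload() -> Response:
    data = request.get_data()
    if is_flagged(data):
        # Reject before the file ever touches pict-rs storage.
        return Response("rejected", status=403)
    upstream = requests.post(
        PICTRS_UPSTREAM,
        data=data,
        headers={"Content-Type": request.content_type or "application/octet-stream"},
        timeout=30,
    )
    return Response(upstream.content, status=upstream.status_code)
```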

vk6flab@lemmy.radio 4 points 1 day ago (last edited 1 day ago)

From your description it's unclear: does this also block CSAM that's physically stored on your infrastructure, or just links to external content?

Cloudflare is currently attempting to block LLM bots and doing a shit job of it. I'm guessing that any CSAM blocking would be incomplete at best.

What happens if something "gets through", or if non-CSAM content is wrongly blocked? Both materially (as in, what actually happens) and legally, since I doubt Cloudflare would ever assume liability for content on your infrastructure.

Edit: I thought I'd also point out that CSAM is not the only type of content that could land you in a legal black hole. For example, a post could breach a suppression order made by a court in Melbourne, Australia, or someone could publish defamatory content, etc.

snowe@programming.dev 1 point 11 hours ago

It blocks access to the link on your site. For example, people have uploaded CSAM to programming.dev. Those links are blocked immediately (i.e. no one can reach them except an instance owner looking directly in the pict-rs database), and then you get a notification in the Cloudflare dashboard with a link to the page where it occurred.

CSAM blocking works off an agreed-upon, shared hash list maintained by a consortium of large tech companies. If novel CSAM is uploaded to your instance then, yes, it will fail to catch that; db0's plugin might catch it, though. LLM blocking doesn't have the benefit of a bunch of multi-billion-dollar companies trying to stop it (in fact, they're doing the exact opposite), so yes, LLM blocking sucks.
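
To make the hash-list idea concrete, here's a toy sketch (the consortium lists use far more robust fuzzy hashes; this simple "average hash" is only illustrative). Matching by Hamming distance means a minor re-encode of a known image still hits the blocklist, while genuinely novel material does not:

```python
# Toy perceptual-hash matcher, assuming Pillow is installed.
# Illustrative only: real hash lists use much stronger algorithms.
from PIL import Image

def average_hash(path: str) -> int:
    # Shrink to 8x8 grayscale, then set one bit per pixel:
    # 1 if the pixel is brighter than the mean, else 0.
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of bits at which the two 64-bit hashes differ.
    return bin(a ^ b).count("1")

def matches_blocklist(path: str, blocklist: set[int], threshold: int = 5) -> bool:
    # A small Hamming distance means "same known image, minor alteration".
    h = average_hash(path)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```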

For your edit: I would expect you to have an admin email set up where you'd receive such notices. Pretty much globally, you are not liable for this kind of content until you have been notified, so pay attention to your email.