
There have been users spamming CSAM content in !lemmyshitpost@lemmy.world causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.

I deleted every image from the past 24 hours personally, using the following command: sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
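One caveat: shred on its own only overwrites the file's contents and leaves the scrambled file in place; add -u if you also want it unlinked. And since find -exec runs with no confirmation, it's safer to preview the matches first. A sketch using the same path as above:

    # Preview which files match (status changed within the last 24 hours)
    sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -print

    # Overwrite each match, then unlink it (-u removes the file after shredding)
    sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \;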

Note: Your local jurisdiction may impose a duty to report or other obligations. Check those requirements, but always prioritize ensuring that the content is no longer being served.

Update

Apparently the Lemmy Shitpost community has been shut down for now.

all 35 comments
[–] mlfh@lemmy.ml 95 points 1 year ago (3 children)

If you aren't going to fully wipe your drive in horrible events like this, at the very least use shred instead of rm. rm simply removes references to the file in the filesystem, leaving the data behind on the disk until other data happens to be written there.

Do not ever allow data like that to exist on your machines. The law doesn't care how it got there.
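If some of the data was already removed with plain rm, one blunt option is to fill the remaining free space so the stale blocks get overwritten. A rough sketch, assuming ext4 (or another non-copy-on-write filesystem) and a volume that can tolerate briefly filling up; the filler path is made up:

    # Fill the remaining free space with random data, then drop the filler file.
    # dd stops with "No space left on device" once the disk is full; that's expected.
    sudo dd if=/dev/urandom of=/srv/lemmy/fillfile bs=1M status=progress
    sync
    sudo rm /srv/lemmy/fillfile

On an SSD this still won't reach over-provisioned cells; that's what the Secure Erase discussion below is about.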

[–] Mic_Check_One_Two@reddthat.com 31 points 1 year ago* (last edited 1 year ago) (1 children)

Was going to say the same. Windows and Linux both delete things "lazily", because there's usually no need to actually wipe the data: overwriting takes much longer, and on an SSD it costs valuable write cycles. Instead, the filesystem just marks the space as usable again and removes its references to the file. The data itself still sits on the drive.

There are plenty of programs that will be able to read that “deleted” content, because (again) it still exists on the drive. If you just deleted it and haven’t used the drive a lot since then, it’s entirely possible that the data hasn’t been overwritten yet.

You need a form of secure delete, which doesn't just mark the space as usable. A secure delete overwrites the data with junk (essentially random 1s and 0s), so the data is actually gone instead of simply being marked as writeable.
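You can see the lazy-delete behavior for yourself on a throwaway loopback image; everything below is disposable and the filenames are made up:

    # Make a tiny ext4 image, "delete" a file on it, and find the bytes anyway
    truncate -s 64M demo.img
    mkfs.ext4 -qF demo.img
    mkdir -p mnt
    sudo mount -o loop demo.img mnt
    echo "super secret" | sudo tee mnt/secret.txt > /dev/null
    sudo rm mnt/secret.txt            # lazy delete: only the reference is gone
    sudo umount mnt
    grep -a "super secret" demo.img   # the bytes are still sitting in the image
    rm demo.img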

[–] lazynooblet@lazysoci.al 3 points 1 year ago (2 children)

Would rm be okay if you regularly fstrim?

[–] alanceil@lemmy.world 4 points 1 year ago* (last edited 1 year ago) (1 children)

No, fstrim just tells the drive which blocks the filesystem no longer uses, so it doesn't need to preserve that data when writing over it. Depending on your drive, direct access to the flash chips might still reveal the original data.

If you want to ensure data deletion, as OP said, you'll need to zero out the whole drive and then fstrim to regain performance. Also see ATA Secure Erase. Some drives encrypt by default and have Secure Erase simply generate a new key, which disables access to the old data without having to touch every bit.

Or physically destroy the whole drive altogether.
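For reference, ATA Secure Erase is usually issued with hdparm. A hedged sketch; the device name is a placeholder, and this irreversibly wipes the entire drive, so check everything twice:

    # Confirm the drive supports the ATA security feature set and is not "frozen"
    # (if it reports frozen, a suspend/resume cycle usually unfreezes it)
    sudo hdparm -I /dev/sdX | grep -A8 "Security:"

    # Set a temporary user password (required by the spec), then issue the erase
    sudo hdparm --user-master u --security-set-pass Eins /dev/sdX
    sudo hdparm --user-master u --security-erase Eins /dev/sdX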

[–] Zacryon@feddit.de 2 points 1 year ago

TRIM tells the SSD to mark an LBA region as invalid and subsequent reads on the region will not return any meaningful data. For a very brief time, the data could still reside on the flash internally. However, after the TRIM command is issued and garbage collection has taken place, it is highly unlikely that even a forensic scientist would be able to recover the data.

From: https://en.m.wikipedia.org/wiki/Trim_(computing)#Operation

So: probably yes.
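To check whether TRIM is actually supported and being run on your setup:

    # Nonzero DISC-GRAN / DISC-MAX means the device advertises discard support
    lsblk --discard

    # Manually trim all mounted filesystems that support it (verbose)
    sudo fstrim -av

    # See whether periodic trim is already scheduled (systemd distros)
    systemctl status fstrim.timer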

[–] Anaralah_Belore223@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

The only 100% foolproof way is to physically destroy the server disk where that image is stored. And don't leave the drive fragments at a recycling center or a landfill.

[–] lea@feddit.de 63 points 1 year ago (3 children)

I nuked my personal instance because of this :(

Dealing with pictrs is just frustrating right now, since there are no tools for its database format and no frontend for the API. I half-expected this outcome, but I hope it gets better in the future.

[–] skullgiver@popplesburger.hilciferous.nl 33 points 1 year ago* (last edited 1 year ago) (2 children)

[This comment has been deleted by an automated system]

[–] Toribor@corndog.social 1 points 1 year ago (1 children)

Pict-rs has been the single largest pain of self-hosting a tiny Lemmy instance. I really hope things improve. I like hosting it myself but I can't do it as a second job, having to figure out my own hacks and workarounds just to keep it running and not serving up illegal crap.

[–] skullgiver@popplesburger.hilciferous.nl 1 points 1 year ago* (last edited 1 year ago) (1 children)

[This comment has been deleted by an automated system]

[–] Toribor@corndog.social 2 points 1 year ago

Thank you! I was looking into running this a week or two ago when I was doing some maintenance but I gave up and shelved the project for later due to the complexity. My Lemmy instance is running in AWS and I'm going to have to put some work into my network setup on both ends to be able to connect to a computer with a GPU at home.

I'm glad the community is working to resolve some of these issues. Hopefully some of this will get easier and more cost-effective.

[–] zahel@cosmere.xyz 20 points 1 year ago (1 children)

yeah this has got me second guessing hosting my own instance as well.

[–] clearedtoland@lemmy.world 21 points 1 year ago (1 children)

That finalized my decision to not self-host. I’m savvy enough to set it up but not enough to keep up with maliciousness like this. I’d never even considered a deliberate CSAM attack as a possibility - I thought it was just something (atrocious) users might inadvertently post.

[–] SkyeStarfall@lemmy.blahaj.zone 10 points 1 year ago

You always gotta prepare for the worst case. It's certainly why I'm never going to bother with hosting something like this unless I'm serious about it, like a job. If there's even a remote chance of CSAM getting on your machine, you gotta assume it will, and be prepared to fight to prevent it/remove it.

[–] fmstrat@lemmy.nowsci.com 10 points 1 year ago (1 children)

Agreed, pict-rs is not ready for this. Not having an easy way to map URL to file name is a huge issue. I still don't understand why non-block storage doesn't just use the UUID it generates for the URL as a filename. There is zero reason to not have a one-to-one mapping.

[–] ohai@subsubd.com 2 points 1 year ago* (last edited 1 year ago)

yeah, I just spent the last hour writing some python to grab all the mappings via the pict-rs api. Didn't help that the env var for the pictrs api token was named incorrectly (I should probably make a PR to the Lemmy ansible repo). EDIT: Nevermind, seems there is one already! https://github.com/LemmyNet/lemmy-ansible/pull/153
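For anyone else doing this: if I'm reading the pict-rs docs right, purging a known alias through the internal API looks roughly like the snippet below. Verify the endpoint and header against the README for your pict-rs version; the alias, port, and token variable here are made-up examples:

    # Purge an image and its variants by alias via the pict-rs internal API
    curl -X POST \
      -H "X-Api-Token: $PICTRS_API_TOKEN" \
      "http://127.0.0.1:8080/internal/purge?alias=6e28dd9af4a4.webp"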

[–] UntouchedWagons@lemmy.ca 28 points 1 year ago

I'm not surprised. It was quite common for shitheads on reddit to make an account, post a few comments on /r/againsthatesubreddits, then post CP on other subreddits to spin the narrative that AHS was trying to shut down hate subs.

[–] state_electrician@discuss.tchncs.de 23 points 1 year ago (3 children)

What's a CSAM attack? Sounds so serious, but I've never heard of it.

[–] nachtigall@feddit.de 21 points 1 year ago

Spamming pornographic depictions of minors

[–] IIIIII@sh.itjust.works 17 points 1 year ago (1 children)

I had to google it but that stands for child sexual abuse material

[–] Cypher@lemmy.world 14 points 1 year ago (1 children)

It is where scum spam a site with illegal images, which can result in the site being taken down and in some instances the site owners being prosecuted.

Depending on where you live you may have a legal obligation to report the incidents and to prove actions taken to remove the content.

[–] cactusupyourbutt@lemmy.world 3 points 1 year ago

related in the US: safe harbor laws

[–] Pratai@lemmy.ca 22 points 1 year ago (1 children)

What kind of depraved piece of shit does this?

[–] argv_minus_one@beehaw.org 16 points 1 year ago

Pedophiles ruin everything.

[–] thisisawayoflife@lemmy.world 5 points 1 year ago

Naive question here: would it be valuable to generate hashes of those images and provide them as a public database? Seems like it would be valuable to reject known images using some mechanism to prevent this from happening broadly. It wouldn't stop someone from on-the-fly systematically editing/saving/uploading CSAM, but hashes are cheap to store and it would at least provide one barrier to entry.
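As a sketch of the mechanism (production systems like PhotoDNA or PDQ use perceptual hashes so re-encoded copies still match; a plain cryptographic hash only catches byte-identical files):

    # Reject an upload whose SHA-256 appears in a known-bad list.
    # blocklist.txt (one lowercase hex digest per line) and $upload_path
    # are both hypothetical names for illustration.
    hash=$(sha256sum "$upload_path" | cut -d' ' -f1)
    if grep -qxF "$hash" blocklist.txt; then
        echo "upload rejected: matches known-bad hash" >&2
        exit 1
    fi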

[–] Atalocke@lemmy.basedcount.com 2 points 1 year ago

If anyone is looking for the NCMEC reporting registration.

[–] jaz@mastodon.iftas.org 1 points 1 year ago

@Jamie some recommended reading here for hosting ActivityPub services: https://github.com/FediFence/fedifence/blob/main/LegalRegulatory.md

Cloudflare has a free CSAM filter: https://developers.cloudflare.com/cache/reference/csam-scanning/

IFTAS is working on an opt-in CSAM scanner for service providers; follow this account to be notified.

Lemmy moderators should fill out this needs assessment: https://cryptpad.fr/form/#/2/form/view/thnEBypiNlR6qklaQNmWAkoxxeEEJdElpzM7h2ZIwXA/