this post was submitted on 28 Aug 2023
1029 points (97.2% liked)

Memes

45657 readers
1324 users here now

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months if you have to.

founded 5 years ago
[–] rockSlayer@lemmy.world 118 points 1 year ago (1 children)

Tell your friend to log the IP address and report it to the authorities. They might need to turn over the entire modlog as well

[–] YIj54yALOJxEsY20eU@lemm.ee 55 points 1 year ago

This is likely in reference to the federation of such images posted elsewhere

[–] db2@sopuli.xyz 76 points 1 year ago (1 children)

There's always someone who doesn't mind ruining it for everyone else. Probably safest to just delete all the images, that way there's no need to look.

[–] Szymon@lemmy.ca 63 points 1 year ago (1 children)

Bad actors will try to nuke the entire platform to maintain a monopoly on this format of communication and community.

[–] andrew@lemmy.stuart.fun 35 points 1 year ago (1 children)

Who could you posspezibly be referring to?

[–] Etienne_Dahu@jlai.lu 3 points 1 year ago

Is it the android? The lone skum? Or someone else entirely?

[–] acastcandream@beehaw.org 65 points 1 year ago

Once again reaffirming why I refuse to host an instance. If I ever do, I’m not federating with any of you degenerates lol

[–] 01189998819991197253@infosec.pub 20 points 1 year ago (1 children)

I'm glad s/he was able to nuke the CSAM, even if other material was nuked with it. This crap is why I'm not hosting.

Please, call it CSAM (child sexual abuse material) and not CP (child pornography). The children in these photos/videos can't make pornography, they're sexually abused into making this material. CP insinuates that it's legitimate porn with children. CSAM, on the other hand, calls it what it is: sexual abuse of children.

[–] Tranus@programming.dev 32 points 1 year ago (10 children)

That is needlessly pedantic. I have never heard of anyone using the word pornography to imply legality or moral acceptability. There is no such thing as "legitimate" CP, so there is no need to specify that it's not ok every time it is mentioned. No one in their right mind would presume he's some kind of CP supporting monster for failing to do so.

[–] TheFrirish@jlai.lu 12 points 1 year ago

If we spent more time fixing things rather than naming them the world would be a better place.

[–] 01189998819991197253@infosec.pub 4 points 1 year ago* (last edited 1 year ago) (2 children)

No one in their right mind would assume that OP is. But the term was created to legitimize the material. So, while you're correct that it's picky, it's picky for a reason. Words are powerful. We should fight the legitimacy that term lends, among other things.

[–] Andrew15_5@mander.xyz 15 points 1 year ago (1 children)
[–] neeeeDanke@feddit.de 9 points 1 year ago

I know that guy Tobias Fünke, although he also is an analyst. He had some clever abbreviation for that as well!

[–] A10@kerala.party 13 points 1 year ago

Bless you ❤️

[–] pinkdrunkenelephants@sopuli.xyz 12 points 1 year ago (2 children)

I'm not gonna lie, I'm surprised it took this long for some dipshit to try something like this. Lemmy's security has more holes in it than a piece of Swiss cheese and we're fools if we think it's viable enough for it to serve as a long-term home for new social media.

We really, really need a better social structure than federation.

[–] KairuByte@lemmy.dbzer0.com 16 points 1 year ago (1 children)

Lemmy’s security has more holes in it than a piece of Swiss cheese

This has very little to do with security. There's nothing inherently "insecure" about posting CSAM, since the accounts and images were likely created and posted just like any other.

What really needs to happen is some sort of detection of that kind of content (which would likely require a large change to the code) or additional moderation tools.
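One common form of such detection is screening uploads against a blocklist of known-bad image hashes. Below is a minimal sketch (all names hypothetical): it uses exact SHA-256 digests for simplicity, and the one digest shown is just the hash of an empty file as a placeholder. Real deployments use perceptual hashes (e.g. PhotoDNA or PDQ) so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images.
# The entry below is just the digest of zero bytes, used as a stand-in.
BLOCKED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if an upload's digest appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKED_HASHES
```

An exact-hash check like this is trivially defeated by re-encoding the image, which is exactly why the industry-standard systems match on perceptual similarity instead.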

[–] pinkdrunkenelephants@sopuli.xyz 5 points 1 year ago (1 children)

The lack of those tools is what I was talking about

[–] KairuByte@lemmy.dbzer0.com 11 points 1 year ago (1 children)

Ah okay, those aren't generally considered security issues, but I can understand why you went that route I suppose.

[–] pinkdrunkenelephants@sopuli.xyz 3 points 1 year ago (1 children)

Does anyone know why they were never put in?

[–] KairuByte@lemmy.dbzer0.com 6 points 1 year ago

Software development is a balancing act. You need to pick and choose not only what features to add, but when to add them. Sometimes, mistakes are made in the planning and you get a situation like this.

What likely happened is that these kinds of features were deemed less likely to be needed, since the majority of lemmy users will never run into the need for them, and there is technically a way to handle the situation (nuking your instance's image cache). But you'll likely see a reshuffling of priorities if these kinds of attacks become more prevalent.

[–] lemann@lemmy.one 9 points 1 year ago (1 children)

Lemmy's security

I think you misspelled "moderation tools". A nice quick fix would have been to block posts from new users on instance X and pin a post briefly covering why; they'll eventually run out of instances that don't have open signups IMO, or just give up.

Another mod tools option would be rate limiting of posts, i.e. users can only make a new shitpost every 10-15min, rather than unlimited times per minute

[–] csolisr@communities.azkware.net 10 points 1 year ago (1 children)

Meanwhile, my YunoHost-based instance, which still hasn't managed to get Pict-RS working and therefore couldn't store images even if it wanted to, is doing juuuuust fine

[–] Etienne_Dahu@jlai.lu 6 points 1 year ago

Come to think of it, if you're the only user, it's kinda protecting you, isn't it? (hello fellow Yunohost user!)
