[–] ranzispa@mander.xyz 9 points 1 day ago (3 children)

Whether they are responsible for what gets posted is exactly the point under discussion. Different countries and different people hold different opinions on it.

Generally, websites are not in favor of this idea - and it's not only big tech I'm talking about.

My personal opinion is that I don't really like this idea. Making websites responsible for what gets posted means they necessarily have to do some censoring. I'm not necessarily against censoring, but I don't like the idea of a large private company deciding what to censor. I'd much rather have the government decide and impose any bans on companies.

Moreover, forcing websites to censor things leads to a very centralised internet. The random guy setting up a forum cannot afford to patrol his site that thoroughly, while big companies can have entire teams of people doing just that.

[–] JackbyDev@programming.dev 4 points 1 day ago

My understanding is that this is partly why the DMCA works the way it does. If the site takes content down then they're protected from claims made by the copyright holder. It's a way to acknowledge "hey, you're sort of responsible for what gets put up, but we know mistakes happen." (Though, more often than not, it's just used as a way to silence people. Not saying DMCA is perfect or good.)

[–] Ashtear@piefed.social 1 points 1 day ago (1 children)

I don't think it's necessarily true that this is so onerous that it must lead to centralization. I'm on the staff of a medium-sized Discord server, and we have more than enough coverage to handle objectionable content. As another example, Fediverse instances here have proactively established rules and norms for NSFW content that keep their communities running, and most are still going a few years after the Reddit exodus exploded their populations.

It absolutely does make scaling up more expensive, but I've gone from a fairly libertarian stance on this to asserting that proper community moderation is part of the social responsibility corporations take on when they grow these spaces to massive reach. And yes, I don't think big corps do enough on this topic, and it creates another inequity, because it's really starting to look like new organizations will have to be more responsible for what content they allow. But I'm all for coming down hard on the big corps. Everyone got by just fine in the 90s and 2000s, when they had much, much larger customer support and moderation staffs. "It's too expensive" is the same garbage excuse used for other forms of enshittification today, while these platforms make money hand over fist.

[–] TheOctonaut@mander.xyz 1 points 1 day ago (1 children)

Do you understand that Discord is a massively centralised web app?

[–] Ashtear@piefed.social 1 points 1 day ago (1 children)

In terms of content moderation? Not in the least. Discord is extremely laissez-faire about it. They intervene when compelled to by law enforcement. I can't speak to the large-server experience, but we've run our server for nearly six years and not a single member of staff has ever spoken with anyone at Discord. Every single one of the moderation tools we use is third party.

As a practical matter, Discord provides the communication infrastructure for us and that's literally it. Irrelevant to the topic at hand. We could pack up and move it all to Matrix tomorrow and our content moderation experience would be just as centralized (that is to say, it would not be).

[–] TheOctonaut@mander.xyz -1 points 1 day ago* (last edited 1 day ago)

You don't understand the concepts of liability or of server-side administration and monitoring, which are two very different areas to speak about so confidently and so incorrectly.

[–] pelespirit@sh.itjust.works -4 points 1 day ago (1 children)

So you're saying that if a mod posted child porn, the website should leave it up and let the chips fall where they may? That the website isn't responsible if it stays up?

[–] ranzispa@mander.xyz 5 points 1 day ago

What I am saying is that it should not be up to a website company to decide whether something is legal or not. In every other business, it has always been a judge who decides whether something is legal. A newspaper is a related case: the editor bears some responsibility if he lets something clearly illegal slip through, but the responsibility falls on the journalist, not on the newspaper itself.

Frankly, I do not want social media - currently the main source of information for many, likely most, people - to be entrusted with deciding what should be allowed and what should not. If someone uses such platforms to do something illegal, there are legal mechanisms to deal with that.