
Features I can think of:

  • a system for stricter content moderation, especially something that would automatically delete NSFW/NSFL posts,
  • no direct messaging,
  • some kind of tool for moderators to efficiently review content,
  • multi-layered access to an account to allow for parental control,
  • a time-management tool that is not client-based, with session duration instead calculated server-side from interaction timestamps (see the sketch after this list).
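On that last point, here is a minimal sketch of how session duration could be derived from interaction timestamps alone, without trusting the client. The 10-minute inactivity gap and the function name are assumptions for illustration, not anything an existing client implements:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: estimate time spent from server-side interaction
# timestamps (votes, posts, page fetches) instead of a client-side timer.
# Two interactions closer together than GAP are treated as one session.
GAP = timedelta(minutes=10)  # assumed inactivity threshold, tune as needed

def session_duration(interactions: list[datetime]) -> timedelta:
    """Sum time spent, splitting sessions at gaps longer than GAP."""
    if not interactions:
        return timedelta(0)
    events = sorted(interactions)
    total = timedelta(0)
    for prev, curr in zip(events, events[1:]):
        delta = curr - prev
        if delta <= GAP:
            total += delta
    return total

# Example: three interactions in one session, then a fourth hours later.
t0 = datetime(2026, 1, 31, 14, 0)
times = [t0, t0 + timedelta(minutes=3), t0 + timedelta(minutes=8),
         t0 + timedelta(hours=4)]
print(session_duration(times))  # 0:08:00 -- the 4-hour gap is not counted
```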
[–] biofaust@lemmy.world 3 points 5 days ago (3 children)

The question is: is anyone even suggesting building a moderation tool for, say, Mastodon that is aimed at this specific need?

[–] Warl0k3@lemmy.world 11 points 5 days ago* (last edited 5 days ago)

Right now the moderation tools available appear to be on the level of "thog bang rock on other rock - make smaller rock, easy to eat" and efforts are primarily aimed at introducing Thog to the concept of fire so he can at least cook his rocks.

The few automoderators I've seen attempted have been "ban you over a couple downvotes" bad, so AI content moderation seems like it may be a bit ambitious right now. It's a good idea, but more work needs to be done before we're at the point where it's feasible to start working on it. Like teaching Thog the concept of nuance, and getting him to stop trying to eat the keyboard.
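To make that "couple downvotes" failure mode concrete, here is a hypothetical comparison between such a rule and a marginally less naive one that requires both a minimum sample size and a downvote ratio. The thresholds are invented for illustration, not taken from any real automoderator:

```python
# Illustration of why "ban over a couple downvotes" fails: a raw downvote
# count flags any mildly controversial comment. Requiring a minimum sample
# size and a downvote *ratio* is still crude, but less so.
def naive_flag(downvotes: int) -> bool:
    return downvotes >= 2  # the "couple downvotes" rule mocked above

def less_naive_flag(upvotes: int, downvotes: int,
                    min_votes: int = 20, max_ratio: float = 0.8) -> bool:
    total = upvotes + downvotes
    if total < min_votes:  # too little data to judge either way
        return False
    return downvotes / total > max_ratio

print(naive_flag(2))           # True  -- flags almost anything
print(less_naive_flag(15, 5))  # False -- mixed reception, leave it alone
print(less_naive_flag(2, 30))  # True  -- broadly rejected content
```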

[–] Scipitie@lemmy.dbzer0.com 9 points 5 days ago

I'm not talking about usability, just about the foundation. Beyond what others have already said about why it's not a good idea, my answer to your specific question regarding moderation tooling is:

Your requirements are incompatible with decentralization. Every moderation tool has to act through the network itself, which means a moderation event arrives with a significant delay, during which the content has a "head start".

There is no way to have an instant kill switch for content or a centralized gated release of content.
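As a rough illustration of why: in ActivityPub, removing federated content means delivering a Delete activity to every known remote inbox, one request at a time. Here is a minimal sketch of that fan-out; all URLs are hypothetical placeholders, and real servers additionally sign requests and retry failed deliveries:

```python
import json

# Sketch of why there is no instant kill switch: until each remote instance
# receives and processes the Delete activity, its cached copy of the content
# stays visible there.
def build_delete(actor: str, object_id: str) -> dict:
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": actor,
        "object": {"type": "Tombstone", "id": object_id},
    }

known_inboxes = [
    "https://instance-a.example/inbox",
    "https://instance-b.example/inbox",
    "https://instance-c.example/inbox",
]

activity = build_delete("https://home.example/users/mod",
                        "https://home.example/posts/123")
for inbox in known_inboxes:
    # Delivery is asynchronous and best-effort, one inbox at a time,
    # so the "head start" described above is unavoidable.
    print(f"POST {inbox}\n{json.dumps(activity, indent=2)}\n")
```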

And in the end, anyone can spin up an instance and decide on the moderation rules there. That causes an even bigger delay until a malicious instance is blacklisted by others.

[–] surewhynotlem@lemmy.world 2 points 5 days ago

Moderation is the wrong answer. Whitelisting is the right answer.
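For comparison, a minimal sketch of what allow-list federation could look like: instead of reactively blocking bad instances, activities are dropped unless the sending instance was explicitly approved beforehand. The hook name and domains are invented for illustration:

```python
from urllib.parse import urlparse

# Hypothetical allow-list federation check, the inverse of the usual
# block-list model: deny by default, accept only pre-approved instances.
ALLOWED_INSTANCES = {"trusted-a.example", "trusted-b.example"}

def accept_activity(actor_url: str) -> bool:
    """Accept only activities whose actor lives on an approved instance."""
    return urlparse(actor_url).hostname in ALLOWED_INSTANCES

print(accept_activity("https://trusted-a.example/users/alice"))    # True
print(accept_activity("https://random-new.example/users/mallory")) # False
```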