this post was submitted on 25 Mar 2026
234 points (98.8% liked)
Technology
"The design feature changes the state is seeking include 'enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors.'"
Oh fuck right off.
I'm sorry, but this is a bad "think of the children" decision. There are limits to what Meta or any platform at that scale can do about bad actors without structural changes.
What might actually help: only show people content from the groups and people they follow, preferably in chronological order, rather than constantly suggesting new groups and pages algorithmically, which increases the likelihood of children interacting with strangers on the Internet.
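The followed-only, chronological feed described above is simple enough to sketch. This is a minimal illustration with invented types, not how Facebook's feed actually works:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical data model for illustration only.
@dataclass
class Post:
    author_id: str
    created_at: datetime
    text: str

def followed_only_feed(posts: list[Post], followed_ids: set[str]) -> list[Post]:
    """Keep only posts from accounts the viewer follows, newest first.
    No algorithmic suggestions means no surfacing strangers' content."""
    visible = [p for p in posts if p.author_id in followed_ids]
    return sorted(visible, key=lambda p: p.created_at, reverse=True)
```

The point is what's absent: there is no ranking model and no recommendation step, so nothing outside the follow graph ever reaches the feed.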
And improve parental controls for children's accounts. I'm sure there's nothing currently giving a "parent" account high level control over a "child" account, but I'm happy to be corrected if I'm wrong.
But also: require interoperability with other platforms and a standardized form of profile-data export, so people can leave Facebook but stay in touch with the people who still use it.
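A standardized export could be as plain as a versioned JSON document. The field names and format tag below are invented for illustration; an actual standard would need an agreed schema (ActivityPub's actor model is an existing example of this idea):

```python
import json

def export_profile(profile: dict) -> str:
    """Serialize a profile to a portable JSON document another
    platform could import. All field names here are hypothetical."""
    return json.dumps({
        "format": "profile-export/0.1",    # invented version tag
        "display_name": profile["display_name"],
        # Stable, platform-independent handles rather than internal IDs,
        # so the contact list survives the move.
        "contacts": profile["contacts"],
    }, indent=2)
```

With a mandated format like this, "leaving" stops meaning "losing everyone you know there."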
Parental controls already exist in every major OS, they suffice to restrict & monitor social media, and they go unused.
A better solution might be for laws to give parents resources & incentives to supervise their children's online activity (including training to use the tools they already have) & to give children education in online safety & literacy. Decades ago, federal courts, citing commission findings & studies, recommended these alternatives as superior: more effective, better at meeting the government's duty to minimize the impact on civil liberties, better in the allocation of law enforcement resources, etc. In the permanent injunction against COPA, the judge wrote:
Adult supervision, child education in online safety & literacy, and parental controls & filters are more effective at less expense to fundamental rights. Governments know this & conveniently forget it.
What might actually help: hold the people who design these tools criminally liable. Everyone knows what they're doing, but you can't really say no to your employer when the answer is "don't worry, you're not liable," so everyone keeps building the Torment Nexus.
Unfortunately, you can’t codify how platforms work specifically into law.
But you could possibly make companies explicitly liable for promoting “detrimental” content, then define “promoting” as something like “surfacing content to a user beyond the reach of the user’s immediate network,” i.e. algorithmic suggestions or advertising.
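The proposed definition of "promoting" is essentially a two-part predicate, which can be made concrete. This is a hypothetical sketch of the rule as stated, not anything in actual law:

```python
def is_promoted(viewer_follows: set[str], author_id: str,
                surfaced_by_platform: bool) -> bool:
    """Hypothetical test for the proposed liability rule: content counts
    as 'promoted' only when (a) the platform chose to surface it
    (recommendation or ad) and (b) it comes from beyond the viewer's
    immediate network. Posts a user sought out, or from accounts they
    follow, would not trigger liability."""
    return surfaced_by_platform and author_id not in viewer_follows
```

Drawing the line this way targets the recommendation machinery rather than hosting or speech itself: a platform that only delivers what users asked for never satisfies the predicate.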
You would simply get big groups like "I ❤️ New Mexico" where people comment on the same posts and interact. If you limited all content, including comments and likes, to users someone personally follows, with no ability to discover other users, you would basically turn Facebook into WhatsApp. That would definitely solve the issue, but it would also make the platform look empty and kill it. Which would not necessarily be bad, but sadly killing Facebook is too radical for anyone to support.