this post was submitted on 07 Dec 2025
308 points (97.8% liked)
Technology
What does that even mean?
That all still seems like catastrophizing over videos, images, text on a screen that can't compel action or credible harm. I expect that lawsuit to go nowhere.
Videos, images, and text can absolutely compel action or credible harm.
For example, Facebook was aware that Instagram was giving teen girls depression and body image issues, and subsequently made sure their algorithm would continue to show teen girls content of other girls/women who were more fit/attractive than them.
https://www.congress.gov/117/meeting/house/114054/documents/HHRG-117-IF02-20210922-SD003.pdf
https://www.reuters.com/business/instagram-shows-more-eating-disorder-adjacent-content-vulnerable-teens-internal-2025-10-20/
Many girls have committed suicide or engaged in self-harm, at least partly driven by body image issues stemming from Instagram's algorithmic choices, even if that content is "just videos and images."
They also continued to recommend dangerous content that they claimed was blocked by their filters, including sexual and violent content to children under 13. This type of content is known to have a lasting effect on kids' wellbeing.
https://time.com/7324544/instagram-teen-accounts-flawed/
In the instance you're specifically highlighting, Meta would recommend teen girls' accounts to men exhibiting behaviors that could very easily lead to predation. For example, if a man liked sexual content and content of teen girls, it would recommend him content from underage girls attempting to compensate for their newly created body image issues by posting sexualized photos.
They then waited 2 years before implementing a private-by-default policy, which wouldn't recommend these teen girls' accounts to strangers unless they explicitly turned on the feature. Most didn't. Meta waited that long because internal research showed it would decrease engagement.
https://techoversight.org/2025/11/22/meta-unsealed-docs/
If I filled your social media feed with endless posts specifically algorithmically chosen to make you spend more time on the app while simultaneously feeling worse about yourself, then exploited every weakness the algorithm could identify about you, I don't think you'd look at that and say it's "catastrophizing over videos, images, text on a screen that can’t compel action or credible harm" when you develop depression, or worse.
If you seriously think that "videos, images, text on a screen can't compel action" then you've just revoked every single right you had to be part of this discussion.
Hold on, is your view that video/text can not be directly harmful, or that nothing on Meta platforms is?