Researchers say extreme content being pushed on young people and becoming normalised

top 15 comments
[-] postnataldrip@lemmy.world 59 points 9 months ago

It's well-known that these algorithms push topics to drive engagement, and naturally things that make people sufficiently angry, frightened, disgusted and so on are more likely to be engaged with, regardless of what the topic is.

[-] kat_angstrom@lemmy.world 25 points 9 months ago

When outrage is the prime driver of engagement it's going to push some people right off the platform entirely, and the ones who stay are psychologically worse off for it.

[-] Imgonnatrythis@sh.itjust.works 13 points 9 months ago

Social media execs: "We've done the math and it's worth it."

[-] kat_angstrom@lemmy.world 2 points 9 months ago

Worth it for them, for short-term profits. Good thing nobody is considering the net effect this has on society or political discourse.

[-] JoBo@feddit.uk 5 points 9 months ago* (last edited 9 months ago)

They could certainly do with a control group or three. The point they're trying to make is that over 5 days of watching recommended videos, the proportion that was misogynistic grew from 13% on day 1 to 52% on day 5. That suggests a disproportionate algorithmic boost, but it's hard to tell how much of that was caused by the videos they chose to view.

A real world trial ought to be possible. You could recruit thousands of kids to just do their own thing and report back. It's a very hard question to study in the lab because it's nothing like the real world.

[-] small44@lemmy.world 22 points 9 months ago

They push everything negative. I always pick the chronological feed

[-] snooggums@kbin.social 11 points 9 months ago

They push the stuff that people spend more time interacting with. People tend to interact more with negative stuff.
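
In ranking terms, that roughly means scoring every candidate post by its predicted engagement and sorting the feed by that score. A minimal sketch of the idea, with made-up field names and weights rather than any platform's actual formula:

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    predicted_dwell_seconds: float  # how long the model expects you to watch/read
    predicted_interactions: float   # expected likes + comments + shares

def engagement_score(post: Post) -> float:
    # Illustrative weights only; real systems learn these from user behaviour.
    return 0.7 * post.predicted_dwell_seconds + 0.3 * post.predicted_interactions

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Highest predicted engagement first. Content that provokes strong
    # reactions (including anger) tends to win under a metric like this.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Nothing in a scorer like that cares whether the reaction is joy or outrage, only that you stayed and reacted.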

[-] small44@lemmy.world 4 points 9 months ago

Facebook could modify the algorithm to detect whether a post is negative and discard it.
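
Naively, the detection part is easy enough to prototype. A rough sketch using an off-the-shelf sentiment model (purely illustrative, obviously not anything Facebook actually runs):

```python
from transformers import pipeline

# Generic off-the-shelf sentiment classifier; illustrative only.
classifier = pipeline("sentiment-analysis")

def drop_negative(posts: list[str], threshold: float = 0.9) -> list[str]:
    """Discard posts the model labels NEGATIVE with high confidence."""
    kept = []
    for post, result in zip(posts, classifier(posts)):
        if result["label"] == "NEGATIVE" and result["score"] >= threshold:
            continue  # "discard" the strongly negative post
        kept.append(post)
    return kept

print(drop_negative([
    "What a lovely day at the park!",
    "Everything about this is infuriating and everyone involved should be ashamed.",
]))
```

The hard part, as pointed out below, is that a classifier like this would flag an Onion headline just as readily as genuine rage bait.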

[-] bionicjoey@lemmy.ca 4 points 9 months ago

Why would they do that?

[-] snooggums@kbin.social 3 points 9 months ago* (last edited 9 months ago)

They could in theory, but that would drive down engagement and they would make less money.

It is pretty hard to identify negative posts separately from hyperbolic exaggeration though. How do you tell ridiculous rage bait from a good Onion article when the only real difference in context is who posted it?

[-] GarytheSnail@programming.dev 1 points 9 months ago

Just like this sub. The only shit getting posted on it is articles about shitty things happening.

[-] autotldr@lemmings.world 5 points 9 months ago

This is the best summary I could come up with:


Researchers said they detected a four-fold increase in the level of misogynistic content suggested by TikTok over a five-day period of monitoring, as the algorithm served more extreme videos, often focused on anger and blame directed at women.

Meanwhile, the mother of murdered teenager Brianna Ghey called for social media apps to be banned on smartphones for under-16s after hearing evidence about the online activities of her daughter’s killers.

Geoff Barton, general secretary of the Association of School and College Leaders, which collaborated on the research, said: “UCL’s findings show that algorithms – which most of us know little about – have a snowball effect in which they serve up ever-more extreme content in the form of entertainment.

“This is deeply worrying in general but particularly so in respect of the amplification of messages around toxic masculinity and its impact on young people who need to be able to grow up and develop their understanding of the world without being influenced by such appalling material.

“We call upon TikTok in particular and social media platforms in general to review their algorithms as a matter of urgency and to strengthen safeguards to prevent this type of content, and on the government and Ofcom to consider the implications of this issue under the auspices of the new Online Safety Act.”

“It couldn’t be clearer that the regulator Ofcom needs to take bold and decisive action to tackle high-risk algorithms that prioritise the revenue of social media companies over the safety and wellbeing of teens.”


The original article contains 963 words, the summary contains 252 words. Saved 74%. I'm a bot and I'm open source!

[-] xc2215x@lemmy.world 3 points 9 months ago

That is a shame.

[-] afunkysongaday@lemmy.world 1 points 9 months ago

I invite everyone to have a critical look at the study. https://www.ascl.org.uk/ASCL/media/ASCL/Help%20and%20advice/Inclusion/Safer-scrolling.pdf

Personally, they lost me on page 12.

this post was submitted on 06 Feb 2024
222 points (94.4% liked)
