this post was submitted on 27 Dec 2025
596 points (98.2% liked)

World News

A community for discussing events around the World


More than 20% of the videos that YouTube’s algorithm shows to new users are “AI slop” – low-quality AI-generated content designed to farm views, research has found.

The video-editing company Kapwing surveyed 15,000 of the world’s most popular YouTube channels – the top 100 in every country – and found that 278 of them contain only AI slop.

Together, these AI slop channels have amassed more than 63bn views and 221 million subscribers, generating about $117m (£90m) in revenue each year, according to estimates.

The researchers also created a new YouTube account and found that 104 of the first 500 videos recommended in its feed were AI slop. One-third of the 500 videos were “brainrot”, a broader category that includes AI slop and other low-quality content made to monetise attention.
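For what it's worth, here is a quick, purely illustrative sanity check (my own arithmetic, not the researchers' method) showing how those sample figures line up with the headline "more than 20%" claim, as a minimal Python sketch:

    # Illustrative arithmetic only; the counts come from the article, everything else is assumption.
    slop_recommended = 104      # AI slop videos among the first 500 recommendations
    total_recommended = 500

    slop_share = slop_recommended / total_recommended
    print(f"AI slop share of recommendations: {slop_share:.1%}")   # 20.8%, i.e. "more than 20%"

    brainrot_share = 1 / 3                                         # "one-third of the 500 videos"
    print(f"Brainrot share: {brainrot_share:.1%} (~{round(brainrot_share * total_recommended)} videos)")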

The findings are a snapshot of a rapidly expanding industry that is saturating big social media platforms – from X to Meta to YouTube – and defining a new era of content: decontextualised, addictive and international.

[–] dual_sport_dork@lemmy.world 29 points 1 day ago* (last edited 23 hours ago) (1 children)

A telling thing is that YouTube's algorithm apparently knows these videos are AI slop. I suspect this because at the outset I was aggressively disrecommending any of these that YouTube suggested to me, and basically nothing like them shows up in my feed anymore. Every once in a while one still slips through, usually some manner of synthetic music thing, and I hit the ol' three dots, choose "not interested" and then "don't recommend this channel", and I never see it again.

What's much more concerning is that the average user (i.e. non-tech people, i.e. practically everybody) is being handed this shit by default. In my lifetime of watching people already struggle to distinguish truth from blatant manipulative fantasy, the prevalence of false/misleading/nonsense/fabricated AI bullshit constantly peddled inches from their eyeballs is absolutely eroding people's already limited ability to think.

[–] brbposting@sh.itjust.works 1 points 20 hours ago* (last edited 20 hours ago)

Interesting, I wonder if my watch history including some Penn & Teller: BS!, for example, is why they’ve never dared push me slop… OK, brainrot slop at least; maybe something’s slipped through.

(Haven’t touched the “don’t recommend” stuff that I can recall over these years… er, decades perhaps, whew)

Obvy could be any one of a million reasons. Could’ve been A/B Test Group B while you were Group A!