this post was submitted on 29 Mar 2026
174 points (97.3% liked)

Children’s media specialists are sounding the alarm over AI YouTube videos that are supposed to be “educational” but are harmful.

[–] lauha@lemmy.world 39 points 2 days ago (3 children)

Specialists should be sounding the alarm about people letting their children watch YouTube unattended. Even YouTube Kids is a dangerous shitshow.

[–] clubb@lemmy.dbzer0.com 20 points 2 days ago (2 children)

YouTube Kids is far more dangerous than regular YouTube.

[–] Hacksaw@lemmy.ca 5 points 2 days ago (3 children)

I hear this a lot, but you have to put it in context. It used to be you could let your kids play outside in a nice neighbourhood. Your job as a parent was to make sure they went to play in a nice neighbourhood and at the houses of decent people. You could easily keep them away from bad places physically because they were separate places. Your neighbours would also tell you if they saw your kid in a bad place or being up to no good.

The Internet destroys that concept. The good and the bad are one link away, you need constant vigilance and you have almost no help. It's not healthy to micromanage your children's media consumption. It's like helicopter parents who never let their kids free. Setting this as the expectation isn't healthy.

I mean, we don't really have a choice, but acting like it's okay for YouTube to lure my young kids into red-pill content or weird AI nonsense is pretty weird. Why are we just accepting this reality? Should we not have some control over our algorithms? They're basically what our neighbourhood used to be. Why are we saying it's okay for YouTube to lure kids into dangerous content, and that it's every parent's job to constantly micromanage their kids' media consumption as if that's healthy parenting? It's SURVIVAL parenting, not healthy parenting!

We should be able to control our algorithms and help our kids control their algorithms because the solution isn't constant fear and vigilance lest we get taken by the billionaire class and their dangerous ideology.

It's not normal that we created a space so fundamentally unsafe for kids. Very few physical spaces are like this in real life. Think about what would happen if a kid walked into a "non-kid" space like a sex shop or whatever. It's not that the kid gets unlimited access to porn and kink while we blame the parents. Usually a human worker helps the kid get back to safety (usually back to their parents).

[–] TubularTittyFrog@lemmy.world 1 points 1 day ago (1 children)

a kid wandering into a sex shop isn't going to be warped and traumatized for life. jesus.

you are looking for demons where there are none. Not to mention 'good' vs 'bad' people is going to be loaded with racism and classism. My parents thought the 'bad kids' were the ones who were black and brown, but had no issue with me hanging out with the white kids who were doing vandalism and ended up with arrest records...

[–] Hacksaw@lemmy.ca 2 points 1 day ago* (last edited 1 day ago)

I never said a sex shop was going to be bad; I mostly said the opposite. I used it as an example: when a child walks into a physical space meant for adults, the community helps. When a child walks into a digital space meant for adults, it's expected that their parent was watching at all times and failed.

I also completely agree with your assessment of good and bad places/neighborhoods being often used as a cover for classism and racism.

That being said, there are places near here I would not let young kids play unsupervised because of the crime rate, homelessness and open drug use. There are places near here I do let kids of an appropriate age play unsupervised because they are nice, safe parks and areas, even if they're in poor neighbourhoods. In fact, some of these feel safer because there are more kids playing and parents aren't shy about telling other kids off when they misbehave, the way they are in "rich" neighbourhoods.

Same with people's houses. Obviously looking for "shared values" can be a cover for racism, but I'm not a cis-white-straight-nt-male looking for a socio-normative house. I'm not looking for them to be white and rich; I'm looking for parents who care about their kids without being too helicopter-y.

I think you read my message backwards. I meant to say that physical spaces are usually safe for kids, even the spaces meant to be adults-only. In digital spaces we accomplished the opposite, where most spaces are dangerous, even "kids' spaces". But instead of seeing this as a problem caused purposefully by the companies creating and curating these spaces to maximize profit and right-wing ideology, we blame only parents for not micromanaging their kids. I see digital hypervigilant supervision as a parenting survival strategy rather than a good long-term solution. We need more control over our algorithms and digital spaces so that they're safe-by-default, like physical spaces are.

[–] GenosseFlosse@feddit.org 1 points 1 day ago

The Internet destroys that concept. The good and the bad are one link away, you need constant vigilance and you have almost no help. It’s not healthy to micromanage your children’s media consumption. It’s like helicopter parents who never let their kids free. Setting this as the expectation isn’t healthy.

I agree, but with all the brand recognition, corporate TOS, rules, "kids" section, design, bleeping-out of swear words and restrictions on YouTube, it gives most parents the appearance of a safe place.

[–] harmbugler@piefed.social 1 points 1 day ago (1 children)

I don't really understand your point here. I have to micromanage my kid's media consumption because on YouTube it's Google's algorithm in charge, and my kid is not their customer.

[–] TubularTittyFrog@lemmy.world 1 points 1 day ago

The point is paranoia and fear that the 'bad people' will hurt your kids.

Same stupid crap as saying you can't let your kids play on your lawn because there are always pedos in white vans patrolling, trying to kidnap your kids.

[–] degen@midwest.social 3 points 1 day ago

This but 10 years ago for real