Men's Liberation
This community is first and foremost a feminist community for men and masc people, but it is also a place to talk about men’s issues with a particular focus on intersectionality.
Rules
Everybody is welcome, but this is primarily a space for men and masc people
Non-masculine perspectives are incredibly important in making sure that the lived experiences of others are present in discussions on masculinity, but please remember that this is a space to discuss issues pertaining to men and masc individuals. Be kind, open-minded, and take care that you aren't talking over men expressing their own lived experiences.
Be productive
Be proactive in forming a productive discussion. Constructive criticism of our community is fine, but if you mainly criticize feminism or other people's efforts to solve gender issues, your post/comment will be removed.
Keep the following guidelines in mind when posting:
- Build upon the OP
- Discuss concepts rather than semantics
- No low effort comments
- No personal attacks
Assume good faith
Do not call other submitters' personal experiences into question.
No bigotry
Slurs, hate speech, and negative stereotyping towards marginalized groups will not be tolerated.
No brigading
Do not participate if you have been linked to this discussion from elsewhere. Similarly, links to elsewhere on the threadiverse must promote constructive discussion of men’s issues.
Recommended Reading
- The Will To Change: Men, Masculinity, And Love by bell hooks
- Politics of Masculinities: Men in Movements by Michael Messner
Related Communities
!feminism@beehaw.org
!askmen@lemmy.world
!mensmentalhealth@lemmy.world
I tried YouTube Shorts a few times and it kept recommending very right-wing content, including Andrew Tate. I always swipe away as fast as I possibly can, but it is still very insistent on showing me that type of content. YouTube even knows that I am a woman, and I don't watch anything even remotely similar to what it wants to show me, so I don't really understand why it is so insistent on pushing misogynistic content at me. I guess they just don't want me to use their platform anymore?
I'm pretty certain the shorts algorithm is kind of "its own thing" in a lot of ways. It's a prime "your mileage may vary" system, and because so many right wing creators upload to it, it's basically a numbers game unless you get lucky with the algorithm when it's first getting a handle on your preferences.
While I don't know this for certain, the only really effective way to get the algorithm to stop showing you something is to literally close the app for a while when it puts one in front of you. Combined with searching up shorts for the stuff you want, I think it's possible but it's really persistent if it thinks it should show you specific kinds of content.
At the end of the day, however, these are machine learning models, and while we can gesture at trends, nobody knows the full ins and outs of how a specific model makes its decisions. It's kind of scary that we trust them with the role of curation in the current environment at all, to be honest.
I just checked shorts on my phone and they have like and dislike buttons. Does it still show you that stuff if you press dislike? In the top right they have a ... with options for "not interested" and "block channel" which should fully remove it if it's the same channel being recommended. I don't have any of that stuff on my feed probably because I blocked them the first time I saw them. If you didn't know those options existed then they clearly have a UI problem.
If you don't like a video, you should just swipe away, because the algorithm is looking for engagement, and even though a dislike is negative feedback, it is still engagement, so it can still cause YouTube to show that content to more people. The dislike helps it judge whether the video is a good fit for people similar to you, so it might change who the video is suggested to, but it won't cause it to be suggested less overall. At least that is what I have heard; I don't know if it is actually true.
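To make the claim above concrete, here is a toy sketch of that idea (my own illustration, not anything based on YouTube's actual system): any interaction counts toward a video's overall engagement, while the sign of the signal only shifts which users it gets targeted at, and a plain swipe-away records nothing.

```python
from collections import defaultdict

class ToyRecommender:
    """Toy model of engagement-based ranking (hypothetical, for illustration)."""

    def __init__(self):
        self.engagement = defaultdict(int)  # total interactions per video
        self.affinity = defaultdict(int)    # per-(user, video) preference signal

    def record(self, user, video, action):
        # Any explicit interaction, positive or negative, raises engagement.
        if action in ("like", "dislike", "comment"):
            self.engagement[video] += 1
        # Only the direction of the signal changes per-user targeting.
        if action == "like":
            self.affinity[(user, video)] += 1
        elif action == "dislike":
            self.affinity[(user, video)] -= 1
        # A swipe-away falls through: no engagement, no affinity change.

r = ToyRecommender()
r.record("alice", "v1", "like")
r.record("bob", "v1", "dislike")
r.record("carol", "v1", "swipe")

print(r.engagement["v1"])          # 2: both the like and the dislike counted
print(r.affinity[("bob", "v1")])   # -1: bob is less likely to see v1 again
```

Under this (simplified) model, bob's dislike still boosted the video's total engagement just as much as alice's like did, while carol's swipe left no trace at all, which is exactly the dynamic the comment describes.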
Of course they will try to figure out which demographics like which videos, and that's not always a bad thing: some people aren't interested in video game content, which doesn't mean it's bad, just that it's aimed at a different demographic. They're going to use whatever signals they can find to identify those clusters. If you don't engage but another demographic engages heavily, they'll keep recommending it to that demographic. Giving them less signal won't make the content less popular; it will just give you worse recommendations.
If the goal is to prevent this harmful content from being recommended at all then I think it is important to just ignore it. I don't need to use youtube shorts and if it's going to be terrible then I'm just never going to interact with it anymore.
I don't think that will achieve the goal.