submitted 11 months ago by seasonone@opidea.xyz to c/technology@beehaw.org
top 20 comments
[-] NaoPb@beehaw.org 36 points 11 months ago

But they do fuel polarization in general. I'm sure of it.

[-] bownage@beehaw.org 50 points 11 months ago

Actually, one of the conclusions from both the Science and Nature articles was that they mostly fuel far-right radicalisation, not so much polarisation (which implies both ends of the political spectrum). Which I guess means leftists are generally either more capable of spotting misinformation or less inclined to act on it.

[-] PostmodernPythia@beehaw.org 23 points 11 months ago

Also, there’s not a large, well-funded far-left movement in the US fighting to radicalize people.

[-] Erk@cdda.social 10 points 11 months ago

All I've got to offer are unionisation pamphlets and a brick.

[-] kent_eh@lemmy.ca 17 points 11 months ago

Which I guess means leftists are generally either more capable of spotting misinformation or less inclined to act on it.

Or are less likely to be on Facebook in general.

[-] fmstrat@lemmy.nowsci.com 7 points 11 months ago

The studies were percentage-based, so yes, the volume of posts could play an active role, but likely more as a matter of "activity" than "presence".

[-] kingthrillgore@kbin.social 6 points 11 months ago* (last edited 11 months ago)

So this confirms all the studies and adages of conservative voters being less intelligent, more subject to scams and fraud, and less accepting of social norms.

[-] whelmer@beehaw.org 1 points 11 months ago

You've got studies suggesting that conservatives are less accepting of social norms?

[-] nzodd@beehaw.org 2 points 11 months ago

Does not literally turning traitor and attempting to overthrow the United States of America and murder the vice president count as a social norm?

[-] whelmer@beehaw.org 3 points 11 months ago* (last edited 11 months ago)

Bit of a non sequitur; that would be an anecdote and not a study. But yeah, I would say that those things would violate social norms. I don't know if I would agree that conservative people are more likely to violate those norms, which is presumably your point. Take a look at the history of political assassinations in the United States or in Europe, for example. Political violence does not belong uniquely to conservatives.

I think actually, pretty much by definition, conservatives are MORE concerned with social norms. That's kind of one of the primary traits of conservatism. I think a pretty good argument could be made that the Trumpist people you're referring to do not so much represent a conservative point of view as a fascist or ultra-nationalist one, which explains why they will violate certain norms pertaining to peaceful electoral processes while strongly maintaining other norms, like heterosexual nuclear families, religious observances, or certain expectations of gender expression, etc.

[-] inconel@lemmy.ca 4 points 11 months ago

If it only drives the far right, does that mean Facebook contributed to shifting the window of discourse? (https://en.m.wikipedia.org/wiki/Overton_window)

[-] fmstrat@lemmy.nowsci.com 2 points 11 months ago

Not radicalization, just polarization; they are different. But overall, yes.

[-] StarServal@kbin.social 12 points 11 months ago

I’ve watched someone I know who only gets their news through Facebook descend into qdom over the last 5 years. Whenever I hear about a new thing conservatives are doing or saying, I can be sure that person will be doing or saying it within a week… which then feeds right back into Facebook for others.

[-] NaoPb@beehaw.org 3 points 11 months ago

Thanks for the confirmation. I bet they don't even notice it happening. Though this could happen to anyone on any side of the spectrum. It's sad that this is what the internet has become.

[-] realslef@fedia.io 29 points 11 months ago

"Yes" really isn't complicated.

[-] FlashMobOfOne@beehaw.org 11 points 11 months ago

Yeah, doesn't seem complicated to me at all. Their algorithm is programmed to keep people angry, engaged, and 100% convinced that their opinion is right (no matter what that opinion is).

Keeps people clicking on shitty ads and buying stupid crap.

[-] fmstrat@lemmy.nowsci.com 7 points 11 months ago

Sure it is. Is it Meta's algorithm, is it user reach, is it paid ads, is it channels, is it memes, is it leaning, is it...

Meta is participating in a pretty big study with actual researchers here. I'm no Meta fan, and this is partly for PR I'm sure, but this is a really good thing that more social media companies should do.

[-] realslef@fedia.io 1 points 11 months ago

Those seem like "how" or "why" questions to me. More complicated. The big one is "how do we prevent" and I bet we won't get an honest answer from big social themselves. That's why there should be independent public research.

[-] fmstrat@lemmy.nowsci.com 8 points 11 months ago* (last edited 11 months ago)

The US 2020 Facebook and Instagram Election Study is a joint collaboration between a group of independent external academics from several institutions and Meta

Now we have the first results from this unusual collaboration, detailed in four separate papers—the first round of over a dozen studies stemming from the project.

"We also find that popular proposals to change social media algorithms did not sway political attitudes."

"In other words, pages and groups contribute much more to segregation than users,"

Finally, the vast majority of political news that Meta's third-party fact-checker program rated as false was viewed by conservatives, compared to liberals. That said, those false ratings amounted to a mere 0.2 percent, on average, of the full volume of content on Facebook. And political news in general accounts for just 3 percent of all posts shared on Facebook, so it's not even remotely the most popular type of content.

This last bit is key. Taking the quoted figures at face value, false-rated content was 0.2 percent of all posts while political news was 3 percent of all posts, so up to roughly 7 percent of political posts were misinformation (only "up to" because some of the false-rated content may have been nonpolitical), mostly viewed by conservatives. They do not state which way this misinformation leans.
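A quick back-of-the-envelope check of that upper bound, using only the two percentages quoted above (the variable names are mine, and treating all false-rated content as political overstates the true share):

```python
# Rough upper-bound estimate from the figures quoted above.
# Assumes all false-rated content was political news, which
# overstates the true share (some of it may be nonpolitical).

false_share_of_all_posts = 0.002  # fact-checkers rated 0.2% of all content false
political_share_of_posts = 0.03   # political news is 3% of all posts

# At most this fraction of political posts was rated false
misinfo_share_of_political = false_share_of_all_posts / political_share_of_posts
print(f"{misinfo_share_of_political:.1%}")  # -> 6.7%
```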

[-] storksforlegs@beehaw.org 4 points 11 months ago

Or maybe, we know how they get people, but how do we de-radicalize people on a wider scale?
