cerebralhawks@lemmy.dbzer0.com -4 points 1 day ago

Oh yeah, I remember being asked about this! /s

They didn't ask me, and I'm guessing they didn't ask you, either. So, who DID they ask? Was it random, or was it "random"? Was it readers of a particular magazine or website, or consumers of a particular product?

Acamon@lemmy.world 32 points 1 day ago

I'm assuming you know how surveys work? If you're genuinely interested in their data sampling methodology, you can easily find it on the website of the company that conducted the survey (who are named on the infographic).

I'm not making any big claims about YouGov and their reliability or freedom from bias, but this isn't just some random unsourced poll, so props to whoever made the infographic for bothering to include a source.

cerebralhawks@lemmy.dbzer0.com -1 points 13 hours ago

Yeah, I know how surveys work. They ask a few people and make it sound like they asked everyone. I understand there's a science to it, but if they didn't ask everyone, it's a guess at best.

Let me put it another way: my opinion didn't count enough to ask me. Yours didn't either. So how do they decide who isn't worth asking, and how would the data change if they did ask those people?

Acamon@lemmy.world 3 points 12 hours ago

A survey or poll is different from a vote. You're right that unless we ask every single person in a group, we don't know precisely how that entire group would answer. But that's irrelevant: being able to establish patterns in smaller sample groups and extend them to a larger population is one of the cornerstones of science and knowledge.
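To make that concrete, here's a minimal Python sketch (all numbers made up) simulating a poll: we draw random respondents from a population where the true "yes" rate is 37% and watch the estimate tighten as the sample grows. Note that the error depends on the sample size, not on how big the population is.

```python
import random

TRUE_RATE = 0.37  # hypothetical share of the population that would answer "yes"

random.seed(42)  # fixed seed so the run is reproducible


def run_poll(sample_size):
    """Simulate asking `sample_size` randomly chosen people a yes/no question.

    For a large population, one Bernoulli draw per respondent is a close
    stand-in for true random selection without replacement.
    """
    yes = sum(random.random() < TRUE_RATE for _ in range(sample_size))
    return yes / sample_size


for n in (100, 1_000, 10_000):
    estimate = run_poll(n)
    # Standard error of a sample proportion: sqrt(p * (1 - p) / n)
    se = (TRUE_RATE * (1 - TRUE_RATE) / n) ** 0.5
    print(f"sample of {n:>6}: estimated {estimate:.1%} "
          f"(true {TRUE_RATE:.0%}, typical error ±{se:.1%})")
```

With 1,000 respondents the typical error is already under ±2 points, which is why national polls of roughly that size can be meaningful at all.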

An engineer needs to know how much weight a specific size and shape of lumber can safely bear. They can't test the individual beam to its breaking point and still use it. So they test other similarly sized pieces of wood, under similar conditions, and generalise. This can be done well or done poorly, depending on how well they can isolate confounding effects.

So with a survey, if I just asked 100 people I know, it would be a decent survey of the beliefs of my social circle, but a poor survey of national beliefs, because my friends are not a balanced, representative sample of the wider population. That's why most polling/surveying uses methods to try to achieve a sample that actually is representative. Done well, these methods make the survey respondents correspond to the demographics of the population (gender, education, religion, location, health, etc.).
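One common correction is post-stratification weighting: if your sample skews young, each young respondent counts for a bit less and each older respondent for a bit more, so every group contributes in proportion to its real share of the population. A crude sketch, with entirely hypothetical numbers:

```python
# Crude post-stratification sketch: reweight respondents so the sample's
# demographic mix matches known population shares (all figures hypothetical).

population_shares = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Raw respondents: (age_group, answered_yes) -- an unbalanced sample that
# over-represents younger people.
respondents = (
    [("18-34", True)] * 300 + [("18-34", False)] * 200
    + [("35-54", True)] * 100 + [("35-54", False)] * 150
    + [("55+", True)] * 50 + [("55+", False)] * 200
)

n = len(respondents)
sample_counts = {}
for group, _ in respondents:
    sample_counts[group] = sample_counts.get(group, 0) + 1

# Weight = (population share) / (sample share), so each group counts in
# proportion to its real-world size rather than how many we happened to reach.
weights = {g: population_shares[g] / (sample_counts[g] / n)
           for g in population_shares}

raw_yes = sum(yes for _, yes in respondents) / n
weighted_yes = (sum(weights[g] * yes for g, yes in respondents)
                / sum(weights[g] for g, _ in respondents))

print(f"raw 'yes' rate:      {raw_yes:.1%}")       # 45.0%
print(f"weighted 'yes' rate: {weighted_yes:.1%}")  # 39.0%
```

Here the raw sample says 45% "yes", but once the over-sampled young (and under-sampled older) respondents are reweighted, the estimate drops to 39%. Real pollsters weight on several variables at once, but the principle is the same.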

Obviously this approach has its limitations and can be done poorly, but there's a solid body of research and evidence on which methods achieve more accurate results. Saying "this poll can't be accurate because they didn't ask me" is like saying "I don't know if the sun will rise tomorrow". You're right that we won't know for sure until we actually see it rise, but we can infer from past events and confidently predict the likely outcome.

If you want to say "this survey isn't accurate because it uses an older demographic model that has been shown to be ineffective at representing contemporary attitudinal choices", or "this survey is inaccurate because it only controls for age, race, and gender, but didn't account for patterns of social media usage, which are highly relevant", that's fine; that's engaging with the methodology. But if the problem is "they didn't ask everyone so it's wrong", it really seems like you don't know how surveys work.