submitted 6 months ago by Mubelotix@jlai.lu to c/technology@lemmy.world
[-] 0x0@programming.dev 26 points 6 months ago

A traditional outfit

How traditional? How statistically relevant is it? Most Indians I know don't wear turbans at all.

If these stats are trustworthy (and I think they are), the only Indians who wear turbans are Sikhs (1.7%) and Muslims (14.2%). I'd say 15.9% is not statistically significant.

[-] catsarebadpeople@sh.itjust.works 55 points 6 months ago* (last edited 6 months ago)

I think you're looking at it wrong. The prompt is to make an image of someone who is recognizable as Indian. The turban is indicative clothing of that heritage and therefore will cause the subject to be more recognizable as Indian to someone else. The current rate at which Indian people wear turbans isn't necessarily the correct statistic to look at.

What do you picture when you think, a guy from Texas? Are they wearing a hat? What kind? What percentage of Texans actually wear that specific hat that you might be thinking of?

[-] otp@sh.itjust.works 31 points 6 months ago

I think the idea is that it's what makes a person "an Indian" and not something else.

Only a minority of Indians wear turbans, but more Indians than other people wear turbans. So if someone's wearing a turban, then that person is probably Indian.

I'm not saying that's true necessarily (though it may be), but that's how the AI interprets it...or how pictures get tagged.

It's like with Canadians and maple leaves. Most Canadians aren't wearing maple leaf stuff, but I wouldn't be surprised if an AI added maple leaves to an image of "a Canadian".
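The asymmetry in that argument is just Bayes' rule: a trait can be rare within a group and still be strong evidence *for* that group if it's rarer everywhere else. A minimal sketch, where every rate except the 1/6 figure cited upthread is a made-up assumption for illustration:

```python
# Illustrative numbers only: turbans are a minority trait among Indians,
# but (assumed here) far rarer elsewhere.
p_turban_given_indian = 0.16   # ~1/6, the figure cited upthread
p_turban_given_other = 0.005   # assumed base rate outside India
p_indian = 0.18                # rough share of world population

# Bayes' rule: P(Indian | turban) = P(turban | Indian) * P(Indian) / P(turban)
p_turban = (p_turban_given_indian * p_indian
            + p_turban_given_other * (1 - p_indian))
p_indian_given_turban = p_turban_given_indian * p_indian / p_turban

print(f"P(turban | Indian) = {p_turban_given_indian:.0%}")   # 16%
print(f"P(Indian | turban) = {p_indian_given_turban:.0%}")   # 88%
```

With these assumed numbers, only 16% of Indians wear a turban, yet someone wearing one is ~88% likely to be Indian, which is the direction an image tagger (or a model trained on tagged images) ends up exploiting.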

[-] rimjob_rainer@discuss.tchncs.de 22 points 6 months ago

Imagine a German man from Bavaria... You just thought of a man wearing Lederhosen and holding a beer, didn't you? Would you be surprised if I told you that they usually don't look like that outside of a festival?

[-] mriormro@lemmy.world -2 points 6 months ago

I don't picture real life people as though they were caricatures.

[-] sugar_in_your_tea@sh.itjust.works 9 points 6 months ago* (last edited 6 months ago)

But AI does, because we feed it caricatures.

[-] catsarebadpeople@sh.itjust.works 6 points 6 months ago

Are you literally the second coming of Jesus? Hey everybody! I found a guy who doesn't see race! I can't believe it but he doesn't think anyone is changed in any way by the place that they grew up in or their culture! Everyone is a blank slate to this guy! It's amazing!

[-] mriormro@lemmy.world 0 points 6 months ago

No, I just don't lump groups of people together. It's not that hard to do; everyone's a different person.

[-] admin@lemmy.my-box.dev 3 points 6 months ago

He was imaginary though.

[-] HopeOfTheGunblade@kbin.social 3 points 6 months ago

You don't think nearly 1/6th is statistically significant? What's the lower bound on significance as you see things?

To be clear, it's obviously dumb for their generative system to overrepresent turbans like this, though that's likely a bias in the training data rather than something the system came up with on its own. I just think 5% is generally enough to be considered significant, so calling three times that insignificant confuses me.

[-] 0x0@programming.dev 9 points 6 months ago

You don’t think nearly 1/6th is statistically significant?

For statistics' sake? Yes.

For the LLM bias? No.

[-] FarceOfWill@infosec.pub 9 points 6 months ago

5/6 not wearing them seems more statistically significant

[-] tabular@lemmy.world 4 points 6 months ago* (last edited 6 months ago)

The fact that fewer people in that group wear it than don't matters when you want an average sample. But when categorizing a collection of images, the traditional garments of a group are naturally associated with that group more than with any other: 1/6 is bigger than for any other group.

[-] Womble@lemmy.world 1 points 6 months ago* (last edited 6 months ago)

So if there were a country where 1 in 6 people had blue skin, you'd consider that insignificant because 5 out of 6 didn't?

[-] sugar_in_your_tea@sh.itjust.works 0 points 6 months ago

For a caricature of the population? Yes, that's not what the algorithm should be optimising for.

[-] Duamerthrax@lemmy.world 3 points 6 months ago

What data is the model being fed? What percentage of images featuring Indian men are tagged as such? What percentage of images featuring men wearing turbans are tagged as Indian men? Are there any images featuring Pakistani men wearing turbans? Even if only a minority of Indian men wear turbans, if that's the only distinction between Indian and Pakistani men in the model's data, the model will favor turbans for Indian men. That's just a hypothetical explanation.
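That hypothetical can be made concrete with a toy tagged dataset (entirely made up, not real training data): if a tag appears only under one label, it has infinite "lift" for that label, even when most images under the label lack it.

```python
# Hypothetical tagged training set: the turban tag is the ONLY feature
# that separates the "indian" label from the "pakistani" label.
images = [
    {"label": "indian", "tags": {"man", "turban"}},
    {"label": "indian", "tags": {"man"}},
    {"label": "indian", "tags": {"man"}},
    {"label": "pakistani", "tags": {"man"}},
    {"label": "pakistani", "tags": {"man"}},
    {"label": "pakistani", "tags": {"man"}},
]

def tag_lift(tag, label):
    """How much more often `tag` co-occurs with `label` than with other labels."""
    with_label = [img for img in images if img["label"] == label]
    without_label = [img for img in images if img["label"] != label]
    p_in = sum(tag in img["tags"] for img in with_label) / len(with_label)
    p_out = sum(tag in img["tags"] for img in without_label) / len(without_label)
    return p_in / p_out if p_out else float("inf")

print(tag_lift("man", "indian"))     # 1.0 -- carries no signal
print(tag_lift("turban", "indian"))  # inf -- maximally distinctive here
```

Only 1 in 3 "indian" images has the turban tag, yet it's the single feature that distinguishes the labels, so a model optimizing for "recognizably Indian" leans on it.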

[-] ramble81@lemm.ee 1 points 6 months ago

Except if they trained it on something with a large proportion of turban wearers. It's only as good as the data it's fed, so if there's a bias in the data, it'll show the bias. Yet another reason this really isn't "AI".

this post was submitted on 07 May 2024
254 points (88.0% liked)