this post was submitted on 08 Apr 2025
629 points (91.2% liked)
Fuck AI
It's less a bias of the programmer and more a bias of the data, particularly when a factor like gender or ethnicity correlates with something without directly causing it. For example, crime rates can correlate with ethnicity largely because immigrants are poorer on average, and economic standing is a major driver of crime. If your dataset doesn't include income, any AI will just see "oh, people in group x are way more likely to commit crimes." This can be prevented, but it's usually a matter of overlooking a confounding variable rather than intentional data manipulation (not that that isn't possible).
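To make the confounder effect concrete, here's a minimal sketch with entirely made-up synthetic data (the groups, rates, and income split are all hypothetical): "crime" depends only on income, but group "x" is poorer on average. Anything looking only at group membership sees a spurious group/crime correlation; conditioning on income makes it vanish.

```python
import random

random.seed(0)

# Hypothetical synthetic population. Crime probability depends ONLY on
# income; group membership merely correlates with income (the confounder).
def make_person(group):
    low_income = random.random() < (0.7 if group == "x" else 0.3)
    crime = random.random() < (0.10 if low_income else 0.02)
    return group, low_income, crime

people = ([make_person("x") for _ in range(50_000)]
          + [make_person("y") for _ in range(50_000)])

def rate(rows):
    return sum(1 for _, _, crime in rows if crime) / len(rows)

# A model trained without the income column only sees this:
rate_x = rate([p for p in people if p[0] == "x"])
rate_y = rate([p for p in people if p[0] == "y"])

# Conditioning on the confounder: within an income bracket, groups match.
low_x = rate([p for p in people if p[0] == "x" and p[1]])
low_y = rate([p for p in people if p[0] == "y" and p[1]])

print(f"overall    -> x: {rate_x:.3f}  y: {rate_y:.3f}")  # x looks riskier
print(f"low-income -> x: {low_x:.3f}  y: {low_y:.3f}")    # nearly equal
```

The "fix" here is just including the confounder in the data; real mitigation is harder, but the failure mode is the same shape.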