[ comments | sourced from HackerNews ]

top 2 comments
[-] Endomlik@reddthat.com 2 points 1 year ago

Seems this will always be the case. Small objects are harder to detect than larger objects, and higher-contrast objects are easier to detect than lower-contrast ones. Even if detection gets 1000x better, these cases will still hold. Do you introduce artificial error to make things fair?

[-] autotldr@lemmings.world 1 point 1 year ago

This is the best summary I could come up with:


As the artificial intelligence revolution ramps up, one trend is clear: Bias in the training of AI systems is resulting in real-world discriminatory practices.

A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person's race, gender, and age.

Now they might face severe injury," Jie Zhang, a computer scientist at King's College London and a member of the research team, said in a statement.

This trend is a result of biases already present in the open-source AI systems that many companies use to build the detectors.

The research team called on lawmakers to regulate self-driving car software to prevent bias in their detection systems.

"It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately," the study reads.


The original article contains 376 words, the summary contains 140 words. Saved 63%. I'm a bot and I'm open source!
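The bot's "Saved 63%" figure follows directly from the word counts it reports; a one-line check (using only the numbers stated above) confirms the arithmetic:

```python
original_words = 376
summary_words = 140

# Percentage of the original article's length that the summary saves
saved_pct = round((1 - summary_words / original_words) * 100)
print(f"Saved {saved_pct}%")  # matches the bot's reported 63%
```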

this post was submitted on 28 Aug 2023
12 points (100.0% liked)

TechNews

4136 readers

Aggregated tech news.

founded 1 year ago