this post was submitted on 05 Feb 2026
149 points (100.0% liked)

[–] AmidFuror@fedia.io 8 points 2 weeks ago (2 children)

If you read the story, you'll see that it was face recognition by humans that was at fault, not automated face recognition. It would be like if the store had posted a picture in the staff room that said "Do not let this person shop here," and the staff mistakenly thought this shopper was the guy in the picture.

[–] Zamboni_Driver@lemmy.ca 11 points 2 weeks ago (1 children)

That's a bit of a stretch to say the system was not at fault. The system pops up an alert saying this brown guy should not be in your store and shows a picture of a brown guy; staff go out, find a different brown guy, and kick him out of the store. It's still the system that's the issue: it scanned faces and sent an alert, but it wasn't able to accurately communicate to the staff which specific person they should be worried about. The staff aren't facial recognition experts; the shitty system led to this issue occurring.

[–] prex@aussie.zone 2 points 2 weeks ago* (last edited 2 weeks ago)

Cory Doctorow calls the humans in this loop reverse centaurs.
Edit: not unicorns

[–] NABDad@lemmy.world 11 points 2 weeks ago

"Idiot Store Staff Mistake Someone For Someone Else" doesn't get the same clicks.