this post was submitted on 22 Feb 2026
692 points (99.4% liked)
Privacy
My assumption is that they are recorded locally, then hashed, then the hash is sent to Azure (Microsoft cloud) as Windows Hello leverages some cloud features. Some things in Azure have warnings about taking up to 24 hours to take effect.
Hashing locally and sending the hash to a server is the same way all passwords for online services and systems work, so nothing nefarious there.
They're probably using perceptual hashing so they can count, say, 95% similarity as a match without having to check against the source material every time.
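To illustrate the idea (this is a generic sketch of perceptual hashing, not anything Microsoft has documented): a simple "average hash" turns an image into a bit string, and two scans are compared by the fraction of matching bits rather than by exact equality, so near-identical inputs still register as a match.

```python
# Minimal sketch of perceptual (average) hashing, assuming the
# scans are already reduced to 8x8 grayscale thumbnails.

def average_hash(pixels):
    """1 bit per pixel: set if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def similarity(h1, h2):
    """Fraction of matching bits (1 - normalized Hamming distance)."""
    matches = sum(a == b for a, b in zip(h1, h2))
    return matches / len(h1)

# Two near-identical scans: the second differs in only 2 of 64 pixels.
scan_a = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
scan_b = [row[:] for row in scan_a]
scan_b[0][0] += 200
scan_b[7][7] -= 40

ha, hb = average_hash(scan_a), average_hash(scan_b)
print(similarity(ha, hb))  # -> 0.921875, high enough to call a match
```

A cryptographic hash like SHA-256 would flip roughly half its bits on any single-pixel change; the whole point of a perceptual hash is that small input changes produce small hash distances.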
I could accept that it has to do with Azure propagation delays, but the verbiage was explicit about our computers syncing to the tenant (versus data propagating across it).
I sort of reject the idea that there’s nothing nefarious going on. The misdirect is weird.
Unless they’re salting the hashed data with information they can’t access, they’re just creating a database of faces and fingerprints.
Sure, maybe the DB can't be reversed if their cryptography is good, but they can still use an unsalted database to answer match/no-match queries on scans of faces and fingerprints submitted to it.
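The distinction can be shown in a few lines (a generic sketch, assuming a stable biometric template as input; the template bytes and salts here are made up for illustration): an unsalted digest is the same for the same input everywhere, so the database doubles as a match oracle, while a per-enrollment secret salt breaks that.

```python
# Why an unsalted hash database still leaks match/no-match info,
# even if the digests themselves can't be reversed.
import hashlib
import os

def unsalted(template: bytes) -> str:
    return hashlib.sha256(template).hexdigest()

def salted(template: bytes, salt: bytes) -> str:
    return hashlib.sha256(salt + template).hexdigest()

template = b"stable-biometric-template"  # hypothetical enrolled scan
probe    = b"stable-biometric-template"  # same person, scanned again

# Unsalted: identical inputs always collide, so anyone who can query
# the DB can ask "is this the same face/fingerprint?".
print(unsalted(template) == unsalted(probe))  # True

# Salted with per-enrollment secrets: the same biometric yields
# different digests, so entries can't be cross-matched.
salt_a, salt_b = os.urandom(16), os.urandom(16)
print(salted(template, salt_a) == salted(probe, salt_b))  # False
```

This is the same reason unsalted password databases are considered broken: the hash itself becomes a stable identifier for the underlying secret.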
But also, I firmly don't trust Microsoft. They've violated our ELA several times, mostly by applying analytics tools to our data without consulting us first (like rolling out MS Viva without telling us).