this post was submitted on 27 Feb 2025
980 points (96.5% liked)
Technology
Apple had it report suspected matches, rather than warning locally
It got canceled because the fuzzy hashing algorithms turned out to be insecure in a way that's unfixable (it's easy to plant false positives).
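For context on why "fuzzy" matching is fragile: perceptual hashes map visually similar images to similar bit strings, so unlike a cryptographic hash they tolerate small changes by design. Here's a toy average-hash sketch in pure Python (this is not Apple's NeuralHash, just an illustration of the class of algorithm):

```python
# Toy "average hash": a minimal perceptual hash sketch.
# NOT Apple's NeuralHash -- just shows why visually similar images
# hash alike, and why collisions are engineerable on purpose.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255).
    Returns a bit string: 1 where a pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two hash strings."""
    return sum(x != y for x, y in zip(a, b))

# Two near-identical 4x4 images (one pixel slightly brightened)...
img_a = [[10, 10, 200, 200],
         [10, 10, 200, 200],
         [200, 200, 10, 10],
         [200, 200, 10, 10]]
img_b = [row[:] for row in img_a]
img_b[0][0] = 30  # small perturbation, stays below the mean

# ...produce the exact same hash: a "match" despite different bytes.
assert average_hash(img_a) == average_hash(img_b)

# An image with different pixel values still collides as long as each
# pixel sits on the same side of the mean -- this slack is exactly
# what attackers exploit to craft planted false positives.
img_c = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [255, 255, 0, 0],
         [255, 255, 0, 0]]
assert hamming(average_hash(img_a), average_hash(img_c)) == 0
```

Real perceptual hashes (DCT-based pHash, NeuralHash) are far more elaborate, but the same property holds: any hash designed to survive resizing and recompression necessarily admits adversarial collisions.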
They were not “suspected”; they had to be matches to actual CSAM.
And after that, a reduced-quality copy was shown to an actual human, not an AI as in Google's case.
So a false positive would slightly inconvenience a human checker for 15 seconds, not get you swatted or your account closed.
Yeah, so here's the next problem: downscaling attacks exist against those algorithms too.
https://scaling-attacks.net/
Also, even if those attacks were prevented, they're still going to look through basically your whole album if you trigger the alert.
And you’ll again inconvenience a human slightly as they look at a pixelated copy of a picture of a cat or some noise.
No cops are called, no accounts closed
The scaling attack specifically can make a photo sent to you look innocent to you and malicious to the reviewer, see the link above
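The core trick can be sketched in a few lines. This is a simplified illustration in the spirit of the work linked above, not a reproduction of it: it assumes nearest-neighbor resampling (which keeps only every k-th pixel), while the published attacks target the bilinear/bicubic kernels in common imaging libraries. An attacker plants a payload only at the pixel positions the downscaler will sample:

```python
# Sketch of an image-scaling attack: the full-size image looks one
# way, its downscaled copy looks completely different.
# Simplifying assumption: nearest-neighbor resampling, which samples
# every `factor`-th pixel. Real attacks target bilinear/bicubic
# kernels, but the principle is the same.

def nn_downscale(pixels, factor):
    """Nearest-neighbor downscale: keep every factor-th pixel per axis."""
    return [row[::factor] for row in pixels[::factor]]

SIZE, FACTOR = 8, 4
INNOCENT = 200   # bright pixel: what the recipient sees at full size
PAYLOAD = 0      # dark pixel: what the reviewer sees after downscaling

# Build an 8x8 image that is almost entirely "innocent" pixels,
# with the payload planted only where the downscaler samples.
img = [[PAYLOAD if (y % FACTOR == 0 and x % FACTOR == 0) else INNOCENT
        for x in range(SIZE)] for y in range(SIZE)]

# Full-size view: 60 of 64 pixels are innocent, so the image looks fine.
full_view = sum(p == INNOCENT for row in img for p in row)
assert full_view == 60

# Downscaled view: every sampled pixel is the payload.
small = nn_downscale(img, FACTOR)
assert all(p == PAYLOAD for row in small for p in row)
```

So the image you see at full resolution and the thumbnail a reviewer sees can be two genuinely different pictures, which breaks the "a human just glances at a pixelated copy" safeguard.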
The official reason they dropped it is that there were security concerns. The more likely reason was the massive outcry that occurs whenever Apple does these questionable things. Crickets when it's Google.
The feature was re-added as a child safety feature called "Communication Safety" that is optional on child accounts and automatically blocks nudity sent to children.