Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material (CSAM) scanning and user reporting. The company issued a rare, detailed response on Thursday.
I never supported it, since it was on-device, and given this is the US, hashes to spot "extremism" could be added. Apple doesn't know what the hashes actually represent.
They are not cryptographic hashes. They are "perceptual" hashes, or "fuzzy" hashes. They're basically just a low-resolution copy of the original image. It's trivial for an attacker to maliciously send innocent-seeming images that are a hash collision. This is, by the way, a feature, not a bug: perceptual hashes are deliberately not designed to require an exact match.
There are plenty of free whitepapers on how perceptual hashes work, and Facebook's implementation (PDQ) is even open source.
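For anyone who wants to see how simple the idea is, here's a minimal sketch of an "average hash" (aHash), the most basic perceptual hash. To be clear, this is not Apple's NeuralHash or Facebook's PDQ, which are far more sophisticated; it's just an illustration of the "low-resolution copy" point. It assumes Pillow is installed, and the filenames are placeholders.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail, then emit one bit
    per pixel: 1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint when size=8

def hamming(a: int, b: int) -> int:
    """Count the differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Two images "match" when their fingerprints are within a few bits of
# each other. That fuzziness is the whole point: resized or recompressed
# copies still match, but so does any unrelated image that happens to
# land near the same 64-bit bucket.
if hamming(average_hash("photo_a.jpg"), average_hash("photo_b.jpg")) <= 5:
    print("perceptual match")
```

Notice the hash carries only 64 bits of very lossy information about the image, which is why deliberately crafting an innocent image that collides with a target hash is so much easier than attacking a cryptographic hash.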
Apple said they tested 100 million perfectly legal images and three collided with a CSAM perceptual hash. When you consider how many photos Apple was proposing to scan (hundreds of trillions), that rate works out to millions of false positives, even if nobody maliciously abused the system.
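Rough math, treating "hundreds of trillions" as roughly 200 trillion photos (an assumed order of magnitude, not a figure Apple published):

```python
# Back-of-the-envelope estimate from the figures above.
false_positive_rate = 3 / 100_000_000  # 3 collisions in the 100M-image test
photos_scanned = 2 * 10**14            # ~200 trillion photos (assumption)
expected = false_positive_rate * photos_scanned
print(f"{expected:,.0f} expected accidental collisions")  # 6,000,000
```

And that's the accidental rate alone, before anyone crafts collisions on purpose.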
And because of all that, Apple was planning human review of every flagged photo. Reviewers would therefore have seen every match (and every false positive). It couldn't have been hidden from Apple.
The original proposal was that anything matching a fuzzy hash would be reviewed by a human at Apple. (The hashing has been shown to be massively flawed: people demonstrated that a picture of a dog matched an entry in the database.)
Image inference is expensive, and relies on a lot of human moderation to tweak the models and training data. I can't imagine there are a lot of people willing to do that work. We honestly need to get international cooperation going on this with law enforcement agencies if we want to make an actual impact.
The only reason Apple's original implementation was controversial is that they were going through people's private data when consumers assumed it couldn't be accessed at all. That's both a consumer fault, in not understanding who is handling your data, and Apple's fault, for blatantly lying about how secure its cloud storage is for consumers.
THIS. I don't mind companies like Apple or Google using this scanning technology for cloud-based storage services like iCloud or Google Drive or whatever. But I do not want this creepy surveillance shit on my personal device. Sure, today they say it's used for CSAM, but it won't be long before it gets expanded to other things: climate crisis protesters and activists, LGBTQ+ content (which, need I remind you, is illegal in some countries!), or whatever other subjects the government isn't a fan of.
The consumer is not at fault for believing that personal data on their own hard drive, in the phone they paid for, should not be seen by anyone but themselves unless they choose otherwise.
It's not the consumer's fault for believing this, given that it's how computer technology has always worked.
Their only fault is using Apple, when Apple has gone to extreme lengths to blur the line between what is yours and what is theirs, and effectively makes it impossible to keep things on your phone only on your phone unless you opt out of iCloud entirely. iCloud is so integrated that it's not clear to the user that everything on the phone is also in the cloud, and therefore not private.