mannerheim
I believe the hash comparisons are made on Apple's end. In that case, the only ways to get the hashes would be a data breach on Apple's end (unlikely but not impossible) or generating them from known CSAM material.

falcolas
That's not what Apple's plans state. The comparisons are done on the phone, and are only escalated to Apple if there are more than N hash matches, at which point they are supposedly reviewed by Apple employees/contractors.

Otherwise, they'd just keep doing it on the material that's actually uploaded.

mannerheim
Ah, never mind, you're right:

> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices.

https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

cyanite
He is not right, though. The system used will not reveal matches to the device, only to the server, and only if the threshold is reached.

cyanite
> That's not what Apple's plans state. The comparisons are done on phone

Yes, but as stated in the technical description, this match is made against a blinded table, so the device doesn't learn whether it's a match or not.
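For context on the threshold mechanism being discussed: Apple's technical summary describes "threshold secret sharing," where each potential match contributes a share of a secret, and the server can only recover it once enough shares accumulate. A toy sketch of that idea using Shamir secret sharing over a prime field (this is an illustration of the general technique, not Apple's actual construction, which also involves private set intersection and per-image encryption):

```python
import random

# A large prime defining the field; real systems choose parameters carefully.
PRIME = 2**127 - 1

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Each "match" event releases one share; below the threshold the server
# learns (essentially) nothing, at the threshold the secret is recovered.
secret = 123456789
shares = make_shares(secret, threshold=10, n=30)
assert reconstruct(shares[:10]) == secret       # threshold reached
assert reconstruct(shares[:9]) != secret        # below threshold: garbage
```

The point the thread is circling: the device emits a share (inside an encrypted voucher) without knowing whether the underlying image matched, and only the server, after the threshold is crossed, can decrypt anything.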
