A good point to keep in mind: there is no "scanning".
There is hashing, then a check of the hashes (not the pictures), and if a hash matches the on-device NCMEC hash list, a "safety voucher" is generated. That voucher is uploaded to iCloud Photos along with the picture it refers to, without anyone in the chain being able to know that it happened.
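A toy sketch of the flow as described in this comment, in Python. The real system uses a perceptual hash (NeuralHash) and private set intersection so that neither the device nor Apple learns locally whether a match occurred; here a plain cryptographic hash and a visible set-membership check stand in, and all names (`SafetyVoucher`, `process_upload`, etc.) are hypothetical, purely to illustrate the control flow of "hash, compare, voucher on match":

```python
import hashlib
from dataclasses import dataclass
from typing import Optional

@dataclass
class SafetyVoucher:
    # In the real system the payload is an encrypted derivative of the
    # image, readable only after a threshold of matches; this is a toy.
    image_id: str
    payload: bytes

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash; SHA-256 only
    # matches exact byte-identical files, unlike the real thing.
    return hashlib.sha256(image_bytes).hexdigest()

def process_upload(image_id: str, image_bytes: bytes,
                   known_hashes: set) -> Optional[SafetyVoucher]:
    """Hash the image and emit a voucher only if the hash is on the list."""
    if image_hash(image_bytes) in known_hashes:
        return SafetyVoucher(image_id=image_id, payload=image_bytes)
    return None

# The on-device list is opaque in practice; here it is just a set.
known = {image_hash(b"known-bad-example")}
assert process_upload("a", b"known-bad-example", known) is not None
assert process_upload("b", b"innocent-photo", known) is None
```

Note that in the actual design a voucher is produced for every uploaded photo, with the match result hidden inside it cryptographically; the simplified "voucher only on match" above follows the comment's description, not the spec.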
Do they do device-local scanning for this purpose? Apple’s original announcement said that it was to give law enforcement answers about photos synced to iCloud.
If so, then I’ll immediately change my mind and join the protest. Device-local scanning is equivalent to someone checking your personal possessions at your home.
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning
> images in the cloud, the system performs on-device matching using a database of known CSAM image
> hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database
> into an unreadable set of hashes, which is securely stored on users’ devices.
Yes, but my understanding was that they do this for iCloud-synced photos.
If iCloud is off, do they still do this? Your quote actually doesn’t contradict that, which is my main hangup. If you turn on iCloud, you forfeit certain expectations.
I’ll read through it carefully now.
EDIT: It was the very first sentence of the intro:
> CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts.
I don’t get it. It’s their platform. Other image platforms do this matching. Old film shops used to do this matching. Why is this evil?
The sticking point is that the matching happens on-device, not on their servers. Sure, it only happens for photos that will be synced, but it’s still your device doing the matching and snitching.
There’s also the fact that “only scans local photos marked for upload to iCloud” is a technically thin barrier: a switch that could very easily and quietly be flipped in the future.
If you scan in the cloud, photos not synced to the cloud are 100% not going to be scanned; they are just not there. If you scan on device, you are one "if(" condition away from "accidentally" scanning other photos. See the issue now?
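The "one if( condition away" point can be made concrete with a hypothetical sketch: the entire boundary between "scan only iCloud-bound photos" and "scan everything" is a single predicate. The function and field names below are invented for illustration and do not reflect any actual Apple code:

```python
def should_scan(photo: dict, icloud_sync_enabled: bool) -> bool:
    # Current policy: scan only photos queued for upload to iCloud.
    return icloud_sync_enabled and photo.get("queued_for_upload", False)
    # A quiet future change could be as small as replacing the line
    # above with:  return True

photos = [
    {"id": 1, "queued_for_upload": True},
    {"id": 2, "queued_for_upload": False},
]

# With sync enabled, only the upload-queued photo passes the gate.
scanned = [p["id"] for p in photos if should_scan(p, icloud_sync_enabled=True)]
assert scanned == [1]

# With sync disabled, nothing is scanned at all.
assert not any(should_scan(p, icloud_sync_enabled=False) for p in photos)
```

Cloud-side scanning has no such predicate to flip: photos that were never uploaded simply do not exist on the server, which is the structural distinction the comment above is drawing.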
Do you have any general objection to client software scanning content it uploads before it uploads it, rather than afterwards? Scanning it on your phone means that unencrypted versions of your data don't have to leave your phone, which IMHO gives more privacy than any other option.
Do you have an objection even though it means that the content (in a form readable by anyone else) never leaves your device? Would it be preferable if Apple scanned files that were synced to iCloud, but no longer encrypted them in transit/at rest, or encrypted them in a way that they are able to decrypt?
I'm not sure that's true. A lot of people seem to have less of a problem with Apple scanning for CSAM and more of a problem that they're doing it on the phone (and looking for specific images, I guess?).
It’s the fact that the phone is scanning your images and relaying the results to Apple that’s the issue. If this feature were just an extension of existing on-device scanning and stayed on the device, perhaps informing the owner “Hey, it appears a few of your images may be CSAM. Would you like to review them?” then I don’t think people would be very upset.