>As currently implemented, iOS will only scan photos to be uploaded to iCloud Photos. If iCloud Photos is not enabled, then Apple isn't scanning the phone.
Except for the "oops, due to an unexpected bug in our code, every image, document, and message on your device was being continuously scanned" mea culpa we will see a few months after this goes live.
After they introduced AirTags I went and disabled item detection on all my devices. Yesterday I went into Settings and, guess what, that feature is enabled again. On all my devices! I'm not sure when that happened, but my guess would be that the latest iOS update caused it…
My point is, things only keep working to the extent you have tests for them, and test coverage is likely less robust for less popular features, or wherever perception tells developers that "no one is going to turn that mega-feature off".
Only images that match the hashes of the database of CSAM held by the National Center for Missing and Exploited Children (NCMEC) that are uploaded to iCloud Photos are checked.
Based on their technical documents, it's not even possible for anything else to be checked; even if they could, they learn nothing from documents that aren't CSAM.
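The claim that only database matches can ever be flagged comes down to set membership: an image either hashes into the known set or it doesn't, and a non-match carries no information. A minimal sketch, assuming plain SHA-256 exact matching purely for illustration (the real system uses NeuralHash perceptual hashes plus a private set intersection protocol, so the device never even learns the result locally):

```python
# Illustrative only: exact-match hashing against a known-hash database.
# The production system uses perceptual hashes and cryptographic blinding,
# neither of which is modeled here.
import hashlib

# Hypothetical database of known hashes (stand-in for the NCMEC set).
known_hashes = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def matches_database(image_bytes: bytes) -> bool:
    """True only if this exact image's hash is already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

assert matches_database(b"known-bad-image-bytes")
assert not matches_database(b"family-vacation-photo")
```

The point of the sketch: a lookup like this reveals nothing about images whose hashes are absent from the set.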
>>> Only images that match the hashes of the database of CSAM held by the National Center for Missing and Exploited Children (NCMEC) that are uploaded to iCloud Photos are checked.
Incorrect. All files on the phone will be checked against a hash database before being uploaded to iCloud. Any time before the upload, which in practice means all the time, if you have iCloud enabled.
> All files on the phone will be checked against a hash database before being uploaded to iCloud.
A cryptographic safety voucher is created for each photo as it's uploaded. The iPhone doesn't know whether or not anything matched. Nothing happens unless the user reaches a threshold of 30 CSAM images that have been uploaded to iCloud Photos.
The phone also doesn't know whether something is about to be uploaded or not. Therefore, "all files on the phone will be checked" is the relevant portion of that sentence.
Nothing limits Apple to shipping just the hashes provided by NCMEC. And people who worked with NCMEC's database say it contains documents that aren't CSAM.
The potential for this is overblown, and I think a lot of people haven't taken the time to understand how the system is set up. iOS is not scanning for hash matches and phoning home every time it sees one; it attaches a cryptographic voucher to every photo uploaded to iCloud, and only if some number of photos match hashes (the fact of a match is not revealed until the threshold number is reached) can Apple decrypt the photos that matched. This is not something where "oops, a bug got non-iCloud photos too" or "a court told us to flip the switch and scan the other photos" would make any sense; some significant modification would be required. Which I agree is a risk, especially with governments like China and the EU that like to set conditions for market access, just not a very immediate one.
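The "nothing is revealed until the threshold is reached" property described above is the kind of guarantee threshold secret sharing provides: a decryption key is split so that any 30 shares reconstruct it, while 29 reveal essentially nothing. A rough sketch using Shamir's scheme (the field modulus, parameters, and share counts here are illustrative assumptions, not Apple's actual construction):

```python
# Rough sketch of threshold secret sharing (Shamir's scheme).
# Assumption: this is a generic illustration of the threshold idea,
# not the specific protocol used in the safety vouchers.
import random

PRIME = 2**127 - 1  # a Mersenne prime; the field modulus (illustrative)

def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=30, n=100)
assert reconstruct(shares[:30]) == key   # any 30 shares recover the key
assert reconstruct(shares[:29]) != key   # 29 shares interpolate garbage
```

With fewer than 30 shares, the interpolated polynomial is underdetermined, so the server holding the vouchers mathematically cannot decrypt anything below the threshold, regardless of intent.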
I hear you, 100%, but as a longtime Apple and iCloud user I have been through the absolute wringer when it comes to iCloud and have little faith in it operating correctly.
1. About 3 years ago, there were empty files taking up storage on my iCloud account, and they weren't visible on the website so I couldn't delete them. All it showed was that an excessive amount of storage was taken up. Apple advisors had no idea what was going on and my entire account had to be sent to iCloud engineers to resolve the issue. They never followed up, but it took months for these random ghost files to stop taking up my iCloud storage.
2. Sometimes flipping iCloud settings on and off a lot causes delays in the OS, and then you have to kill the Settings app and reopen it to see whether they're actually enabled.
Back when ATT was just being implemented, I noticed it was grayed out on my phone every time I was signed in to iCloud, but was completely operational when I was signed out. Many others had this issue, and there was no setting to change within iCloud; it was a bug that engineering acknowledged to me in an email (they confirmed it happened for some users and fixed it in a software update).
Screwups happen in an increasingly complex OS and I just feel that there will be a day when this type of bug surfaces in addition to everything that already happens to us end users.
> it’s attaching a cryptographic voucher to every photo uploaded to iCloud that, if some number of photos represent hash matches
OK, but what if Apple silently pushes out an update (or has existing code that gets activated) that "accidentally" sets that number to zero? Or otherwise targets a cryptographic weakness that they know about because they engineered it? That wouldn't require "significant modification".
Fundamentally, it's closed software and hardware. You can't and shouldn't trust it. Even if they did make some "significant modification", how are you going to notice or prove it?
> it’s attaching a cryptographic voucher to every photo uploaded to iCloud that, if some number of photos represent hash matches
I see this number very quickly getting set to '1', because the spin in the opposite direction is "What, so you're saying, people get X freebies of CSAM that they can store in iCloud that Apple will never tell anybody about?"