Neither 1 nor 2 is in fact true. Spotlight indexes all kinds of metadata, as does Photos search. Adding an agent to upload data from these is easier than extending the CSAM mechanism, and the CSAM mechanism as is is not all that plausible to abuse either technically or socially given how clear Apple’s promises are.
> the CSAM mechanism as is is not all that plausible to abuse either technically or socially given how clear Apple’s promises are.
That’s the problem: Apple’s promise means nothing, precisely because the mechanism is so easy to abuse.
Apple says they will refuse when asked to scan for anything that is not CSAM. That’s one of those nice ‘technically true’ statements lawyers like to include.
Apple will not have to refuse anything, because they won’t be asked.
Apple doesn’t get a huge pile of CSAM images (obviously); they get a list of hashes from a third party. They have no way of ensuring that these hashes are only for CSAM content. And when ‘manually’ checking, they aren’t actually looking at potential CSAM, they are checking the ‘visual derivative’ (whatever that means exactly), basically a manual check that the hashes match.
So yes, they would technically refuse if asked to scan for other material, but they will accept any list of hashes provided to them by NCMEC (an organization created by the US government and funded by the US DoJ), no questions asked. The US government could slip any hash it wished into this list without Apple ever noticing.
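To make that objection concrete, here is a minimal sketch of why a hash-list matcher is content-agnostic. This is purely hypothetical, not Apple’s actual design; SHA-256 stands in for a real perceptual hash (such as NeuralHash) so the example is runnable. The point is that nothing in the matching code knows, or can know, what the supplied hashes represent.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a real perceptual hash (e.g. NeuralHash);
    # SHA-256 keeps the sketch self-contained and runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def scan(images: list[bytes], blocklist: set[str]) -> list[int]:
    # Flag the indices of any images whose hash appears in the
    # supplied list. The scanner only sees opaque hashes, so
    # whoever supplies the list decides what gets flagged.
    return [i for i, img in enumerate(images) if image_hash(img) in blocklist]

photos = [b"cat photo", b"protest photo", b"dog photo"]
# The list supplier could target anything, not just CSAM:
blocklist = {image_hash(b"protest photo")}
print(scan(photos, blocklist))  # -> [1]
```

The scanning code is identical whether the list targets CSAM or protest photos; only the list differs, and the list is opaque to Apple and to the device.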
> And when ‘manually’ checking, they aren’t actually looking at potential CSAM, they are checking the ‘visual derivative’ (whatever that means exactly), basically a manual check that the hashes match.
The visual derivative is enough to tell the difference between a collection of CSAM and, say, documents or pictures of protests.
> The visual derivative is enough to tell the difference between a collection of CSAM and, say, documents or pictures of protests.
How would you know? They have never given details on what this 'visual derivative' actually is. If it's detailed enough to be recognised as CSAM, then Apple isn't allowed to look at it.
Apple very recently promised that “what happens on iPhone stays on iPhone”. At literally the same time, they were building a system that was designed to break that promise. Why should we believe their new promises when their last promise was disingenuous?
Why should Apple's promises be taken at face value? Doubly so in a world where they can be legally compelled to break those promises and say nothing (especially in the more totalitarian countries where they willingly operate and willingly subvert some of their other privacy guarantees)?
What stops Apple from altering the deal further?
And if you have an answer for that, what makes you believe that 10 years in the future, with someone else at the helm, that they won't?
Your device is now proactively acting as a stool pigeon for the government where it wasn't prior. This is a new capability.
And it's for CSAM. For now. The technological leap from locally scanning just the stuff you upload, to scanning everything, is a very simple change. The leap from scanning for CSAM to scanning for some other perceptual hashes the government wants matched is very simple.
What stops them from altering the deal is that they have said they won’t, and there really is no evidence to believe otherwise.
If they do, then they do. There are never any guarantees about anyone’s future behavior. However, asking ‘what stops you from becoming a thief?’ does not imply that you will become one.
There is no new capability other than what they have said. They already had far more general-purpose mechanisms in the operating system for analysing and searching your files.
The leap from doing spotlight indexing to searching for the keywords ‘falun gong’ or ‘proud boys’ in your files is also simple. So is the leap from searching your photos locally for ‘dogs’, to searching locally for ‘swastikas’ and reporting back when they are found.
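To illustrate how small that leap is, here is a toy sketch (purely hypothetical; this is not Spotlight's real API or any real indexer) showing that in any local search mechanism, the target is just a parameter. Going from a benign query to an abusive one is a one-word change to the input, not a new capability.

```python
def search_index(index: dict[str, str], query: str) -> list[str]:
    # Toy full-text search over a {path: text} index, standing in
    # for any local content indexer. The mechanism is identical
    # regardless of what is being searched for.
    return [path for path, text in index.items() if query in text.lower()]

index = {
    "/notes/pets.txt": "Walked the dogs today",
    "/notes/meeting.txt": "Falun Gong reading group at 7pm",
}
# The same mechanism serves benign and abusive queries alike:
print(search_index(index, "dogs"))        # -> ['/notes/pets.txt']
print(search_index(index, "falun gong"))  # -> ['/notes/meeting.txt']
```

Nothing about the code constrains what `query` can be, which is exactly the argument being made: any general-purpose local search is equally repurposable.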
If they decide to build some spyware, there is no need for it to be based on this. It’s a red herring.
1. Did not exist prior (scanning stuff on the endpoint)
2. Has a plausible abuse case (the same system, applied to stuff that isn't CSAM)
I find this very compelling, especially in the shittier regimes (China...) that Apple operates in.