I think it’s actually a good way to look at the problem from a different, broader perspective than the standpoint of the average HN user or privacy-minded individual. It also interprets Apple’s decisions within the wider framework of their B2C business. Apple’s privacy engineers don’t have the luxury of being as radical as their critics when making a decision like this.
Given this state of things, have they picked the lesser of two evils to solve the thorny problem of CSAM detection? I think it’s fair to say yes, they did, while we can still criticize them for it (which they were of course expecting anyway).
Nope. If the scanning were implemented off the device, in iCloud, the way everyone else does it, maybe. But this is an intrusion of privacy on a new "on-device surveillance" level, and Apple deserves the hostile reaction.
No amount of apologetics or "technical" explanation can remove this from reality now.
They are betting heavily on their "core" demographic trusting them automatically, without any form of critical thinking.
If this implementation has no effect on Apple's bottom line, things are over.
We will live in a badly implemented version of Minority Report.
If Apple wants the same detection ability as server-side scanning, they'll have no choice* but to expand and lock down the client side much more than they have publicized, at which point this method is not the lesser evil at all.
* Think about what happens to CSAM uploaded to iCloud before NCMEC tags it. This gap exists for every new piece of CSAM, since NCMEC can't tag what it hasn't seen yet.
Surely Apple and NCMEC want to be able to catch these perps (which they easily could with server-side scanning). Doing it client-side requires expanding the scanning to do much more.
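To make the asymmetry concrete, here's a rough illustrative sketch (hypothetical names, nothing to do with Apple's actual pipeline): a device can only match against the hash database it has at upload time, while a server that can read the stored photos can rescan the whole library whenever the database grows.

    # Illustrative sketch only; hypothetical names, not Apple's actual pipeline.
    known_hashes_v1 = {"hashA", "hashB"}            # database shipped with the OS today
    known_hashes_v2 = known_hashes_v1 | {"hashC"}   # database after NCMEC adds a new entry

    uploaded_library = []  # stand-in for what ends up stored in iCloud

    def client_side_upload(image_hash, db):
        """Match once, at upload time, against whatever database the device currently has."""
        flagged = image_hash in db
        uploaded_library.append(image_hash)
        return flagged

    def server_side_rescan(db):
        """A server that can read the stored photos can re-check the whole library later."""
        return [h for h in uploaded_library if h in db]

    # "hashC" is uploaded before NCMEC tags it, so the device's v1 database misses it...
    print(client_side_upload("hashC", known_hashes_v1))  # False
    # ...and only a later rescan against the updated database catches it -- which, client-side,
    # means expanding the scanning beyond "check once at upload time".
    print(server_side_rescan(known_hashes_v2))           # ['hashC']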
> Given this state of things, have they picked the lesser of two evils to solve the thorny problem of CSAM detection? I think it’s fair to say yes, they did,
The first option is not to encrypt the data at all (the current state; server-side scanning doesn't count), the second is to use end-to-end encryption with a hidden backdoor. They found a (third) way: lock themselves out of most of the data, so that, for example, the FBI can’t ask them to show some arbitrary images.
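For what that "locking themselves out" means in practice, Apple's published design describes private set intersection plus threshold secret sharing: each on-device match contributes one share of a per-account key, and the server can only open the matching vouchers once enough shares have arrived. Here is a toy sketch of just the threshold part (a plain Shamir scheme with made-up numbers, not Apple's actual protocol):

    # Toy sketch of the threshold idea only -- not Apple's real PSI / threshold-secret-sharing
    # protocol. Until enough matching photos have been uploaded, the server cannot reconstruct
    # the key, and therefore cannot decrypt any voucher, matching or not.
    import random

    PRIME = 2**61 - 1   # a Mersenne prime, used as a small field for the toy scheme
    THRESHOLD = 3       # Apple's announcement mentioned a much higher threshold; 3 for brevity

    def make_shares(secret, threshold, n):
        """Shamir-style shares: points on a random degree-(threshold-1) polynomial."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x=0; only correct with >= THRESHOLD shares."""
        secret = 0
        for xj, yj in shares:
            num = den = 1
            for xm, _ in shares:
                if xm != xj:
                    num = num * (-xm) % PRIME
                    den = den * (xj - xm) % PRIME
            secret = (secret + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    account_key = 123456789                     # toy stand-in for the per-account decryption key
    shares = make_shares(account_key, THRESHOLD, n=10)

    print(reconstruct(shares[:2]) == account_key)  # False (below threshold, key stays hidden)
    print(reconstruct(shares[:3]) == account_key)  # True  (threshold reached)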
>Apple’s privacy engineers don’t have the luxury of being radical
Not doing anything anti-consumer that the law doesn't force you to do is "radical"? I know you're not an astroturfer, but I had to double-check, because this is a textbook astroturfing tactic.
Apple simply does not have to do this; as far as I'm concerned, it's obvious they're either currying political favors or being incompetent. It's perfectly fine if they want to run the scanning on their own unencrypted servers; they absolutely don't have to overstep onto their users' devices.