> This project is probably going to cost millions, they won't catch any pedophiles, they will only scare them so they don't use Apple devices.
Mission accomplished? They can argue that encrypted iPhones aren’t being abused by pedophiles, and that alternative App Stores would be avenues for illegal material.
Unfortunately, with all the negative publicity this is causing, Apple now has a huge incentive to catch somebody, anybody, just to justify the project. The thing about accusations of pedophilia is that the accusation alone is enough: the public presumes guilt, the target's job and home and life are taken away, and Apple + NCMEC can say, "See? It works." Even if the target is later exonerated, the damage is done. The teams that vet the candidate images might even have quotas to fill. "How many did you catch last month?" Your innocent baby bath pictures might sweep you up in a net that destroys your life, just so some low-wage clown can claim they made quota.
The end game (being charitable to them here) is that they can now start encrypting iCloud photos, and iCloud backups for that matter, with a key that they have no way to access, while getting the FBI off their backs on this one hot-button issue.
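To make that end game concrete, here is a minimal sketch of client-side backup encryption where the key is derived from a secret only the user holds, so the provider stores nothing but ciphertext it cannot decrypt. This is my own illustration, not Apple's actual design, and it assumes Python's `cryptography` package.

```python
# Toy sketch of "end-to-end" backup encryption: the key is derived on the
# device from a secret only the user knows, so the cloud provider only ever
# sees ciphertext it has no way to decrypt.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a symmetric key on-device from the user's passphrase."""
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


# On the device: encrypt before upload. The passphrase (and thus the key)
# never leaves the device.
salt = os.urandom(16)
key = derive_key("user-only secret", salt)
ciphertext = Fernet(key).encrypt(b"photo library backup blob")

# What the provider stores: salt + ciphertext, neither of which reveals the key.
print(len(ciphertext), "bytes of opaque ciphertext uploaded")
```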
That by itself makes it look ok.
Until you consider… there are a couple of counterarguments.
One, Apple has not actually enabled such private encryption, with keys out of its own reach, for iCloud backups.
Two, child protection in other countries will sometimes be defined in such repugnant terms that scanning for the hashes provided by “child protection” organizations in those countries will fully compromise Apple.
“Child protection” is in quotes not because I think countries will get away with shoehorning, say, terrorist content hashes in as purported child pornography hashes. It’s in quotes because the concept of child protection can be so wildly corrupted in some countries for religious or ideological reasons. Who decides what is off limits for children, from having gay parents, to having friends of the opposite sex, to having an un-Islamic head covering? Well, each random government, of course, with Apple as the enabler, and individual and human rights be damned (see the sketch below).
So while the most charitably viewed end game may be good, they seem to be papering over the real impacts this could have.
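To make that second counterargument concrete, here is a toy sketch of why the matching step is content-blind. It is my own illustration using plain SHA-256; Apple's announced system used a perceptual hash (NeuralHash) and a private set intersection protocol rather than exact digests, but the point stands either way: the device just checks opaque hashes against a list someone else supplies and never learns what a match actually represents.

```python
# Toy sketch: on-device scanning against an externally supplied hash list.
# Nothing in this code can tell whether a hash came from CSAM, "terrorist
# content", or an image a government simply dislikes.
import hashlib

# Hash list supplied by whichever authority a government designates; to the
# device these are just opaque strings.
provided_hashes = {hashlib.sha256(b"some banned image").hexdigest()}


def scan_photo(photo_bytes: bytes) -> bool:
    """Flag the photo if its digest appears in the supplied list."""
    return hashlib.sha256(photo_bytes).hexdigest() in provided_hashes


# The matching logic is identical no matter what the hashes represent,
# which is the crux of counterargument two above.
print(scan_photo(b"some banned image"))  # True
print(scan_photo(b"family photo"))       # False
```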
Big tech companies are under a lot of US government pressure right now to crack down on CSAM, Apple most of all because of its very low number of reports compared to the other tech giants. I think Apple saw this as a way to ease some of that government pressure without jeopardizing its ability to use end-to-end encryption, which something like the EARN IT Act could effectively make illegal by requiring a government backdoor for every encrypted cloud service operating in the US.
Apple probably saw on-device CSAM scanning as a small, widely acceptable concession that could head off much bigger crackdowns, but maybe didn't anticipate the level of blowback from people who see the scanning itself as an unacceptable government backdoor on their own devices.
I, quite simply, don't trust them.