Lots of people have made policy arguments. No US law requires client-side scanning, and no US law forbids E2E encryption. US courts don't let law enforcement agencies simply demand whatever they want from companies; Apple successfully relied on that five years ago.[1] And capitulating preemptively is usually bad strategy.
What Neuenschwander said doesn't establish that this is anything more than an arbitrary limitation.
That’s different. The FBI can legally require Apple, or any other US company, to search for specific files it has access to on its own servers, because nothing currently shields backup providers. It could, and did, force Apple to aid in unlocking iPhones when Apple had that capacity. What it couldn’t do was issue orders that “would compel Apple to write new software that would let the government bypass these devices' security and unlock the phones.”
Forcing companies to create back doors in their own products is a legally very different situation. As for why iCloud is accessible by Apple: the point is to let someone restore a backup after losing their phone. Forcing people to keep some sort of key fob with a secure private key safe in order to actually have access to their backups simply isn’t tenable.
Apple is a trillion-dollar company with a lot of smart people. They could probably design an N-of-M scheme for recovery, or an Apple-branded key holder you could store in your bank vault or at a friend's house. If they wanted to, they'd do it.
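For what it's worth, the N-of-M recovery idea isn't exotic. Here's a toy Python sketch of the standard approach, Shamir secret sharing over a prime field: the key is split into M shares, any N of which reconstruct it, while fewer than N reveal nothing. This is an illustration of the general technique, not anything Apple has built; the parameters and field choice are my own.

```python
import secrets

# Toy N-of-M ("threshold") recovery via Shamir secret sharing.
# A degree-(n-1) polynomial with the secret as its constant term is
# evaluated at m points; any n points determine the polynomial.
P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte key

def split(secret: int, n: int, m: int) -> list[tuple[int, int]]:
    """Split `secret` into m shares; any n of them can recover it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(n - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, m + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = secrets.randbelow(P)
shares = split(key, n=3, m=5)
assert recover(shares[:3]) == key                       # any 3 of 5 suffice
assert recover([shares[0], shares[2], shares[4]]) == key
```

You could hand one share each to your bank vault, two friends, and Apple, and losing any two of the five still leaves the backup recoverable.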
That’s not what the article says; they simply don’t know.
“However, a former Apple employee said it was possible the encryption project was dropped for other reasons, such as concern that more customers would find themselves locked out of their data more often.”
The idea that Apple would fight this is a farce, as they regularly give up customers' data without a fight when the government requests it.
There are laws regarding this, so they don't have a choice. If they get an order from the FISA court, there's not much they can do, but that goes for every US-based company.
Whatever fighting is going on is behind the scenes, so we wouldn't know about it.
None of the laws do yet. My observation isn't about the laws as they necessarily exist now, just as the worry about how this could be abused isn't about Apple's policy as it exists now.
If we trust US courts to stop law enforcement agencies from demanding everything they want from companies, then they can stop law enforcement agencies from demanding Apple add non-CSAM data to the NeuralHash set. If we don't trust the courts to do that, then we're kind of back at square one, right?
I'm not American, but my understanding is that as soon as Government is forcing Apple to search our devices for something, 4th Amendment protections apply. (Unless they hold a search warrant for that specific person, of course.) Is this not correct?
If Apple was performing scans on their cloud servers, you'd be absolutely right. But if the scanning is being done on the individual's device, I'm not sure it's that straightforward. The third party doctrine surely cannot apply if the scanning is performed prior to the material being in third party hands.
Therefore if the Government forces Apple to change the search parameters contained within private devices, I cannot see how this would work around the 4th Amendment.
If this is correct, it might be possible to argue that Apple's approach has (for Americans) constitutional safeguards which do not exist for on-cloud scanning performed by Google or Microsoft.
I read the article; I don't think it highlighted this specific point, that on-device scanning has a potential, hypothetical constitutional advantage over Google, Microsoft, and Facebook, which scan exclusively in the cloud.
> The 4A protections don't apply to third parties.
The government can't pay someone to break into your house and steal evidence they want without a warrant. I mean, they can, but the evidence wouldn't be admissible in court.
They don't need a warrant. You gave data to someone else. That someone isn't bound to keep it secret. They can demand a warrant if they are motivated by ethical principles but that is optional and potentially overruled by other laws.
But if they're looking for incriminating evidence on your private property (i.e. on-device scanning) then they do need a warrant. It doesn't matter if a copy of it was also given to a third party (i.e. uploaded to iCloud) what matters is where the actual search takes place.
The authorities aren't doing the scanning. You will be made to agree in the fine print to let Apple do it when iCloud sync is enabled. If they run across evidence of a crime then c'est la vie.
These sorts of questions about 4A protections haven't really been tested. Third party doctrine might not apply in this circumstance, and the court is slowly evolving with the times.
In Carpenter v. United States (2018), the Supreme Court ruled warrants are needed for gathering cell phone tracking information, remarking that cell phones are almost a “feature of human anatomy”, “when the Government tracks the location of a cell phone it achieves near perfect surveillance, as if it had attached an ankle monitor to the phone’s user”.
...[cell-site location information] provides officers with “an all-encompassing record of the holder’s whereabouts” and “provides an intimate window into a person’s life, revealing not only [an individual’s] particular movements, but through them [their] familial, political, professional, religious, and sexual associations.”
[1] https://en.wikipedia.org/wiki/FBI–Apple_encryption_dispute
Where the hashes get checked is relevant to the policy problem of what