
In a sense, all evidence is the result of the criminal's own stupidity and all crimes are solved based on law enforcement getting lucky. If the criminals were smart, they wouldn't leave any evidence behind.

The issue from law enforcement's perspective is that there is now a class of evidence that was previously accessible to them but no longer is - not because the criminal got smarter, but because Apple said the police shouldn't have access to it.

A warrant is already required to search a cell phone. Is there any reason why a cop shouldn't be allowed to search a phone when they went to court, presented their probable cause to a judge, and were subsequently issued a warrant?




Detectives lawfully obtaining a warrant and performing their search within the limitations of that warrant is not a problem. That's what law enforcement is supposed to do.

The problem is that a warrant _isn't_ required to search a cellphone, laptop or any other personal device.

Those devices can be seized and searched at the border, and any data the device sends at any other time can be intercepted by Stingray-style devices or as part of a mass data collection program. And law enforcement agencies are pushing for more access, more of the time. I don't think they should have any access, unless they have a valid warrant for a specific crime.


> Detectives lawfully obtaining a warrant and performing their search within the limitations of that warrant is not a problem.

An encryption system installed by the manufacturer that the manufacturer itself cannot decrypt makes it a problem. Even if a cop showed all of the probable cause necessary and obtained a warrant from a judge, he would still be unable to search the device - not because the suspect took steps to protect the evidence, but because the device manufacturer did.

> The problem is that a warrant _isn't_ required to search a cellphone, laptop or any other personal device.

A warrant is required to search cell phones. See Riley v. California [1], which went all the way to the Supreme Court. The ruling will likely apply to laptops, etc., as well - the opinion went so far as to refer to cell phones as "minicomputers".

> Those devices can be seized and searched at the border [...]

A split key solution would solve that problem - border patrol/the police couldn't search the device without obtaining a warrant and getting the device manufacturer to decrypt it.

[1] https://epic.org/amicus/cell-phone/riley/

[2] https://supreme.justia.com/cases/federal/us/573/13-132/opini...


> Even if a cop showed all of the probable cause necessary and obtained a warrant from a judge, he would still be unable to search the device

How is that any different to information I store in my head? I can't be compelled to reveal incriminating information that I hold in my head, and I don't see why I should be compelled to reveal the same information if I chose to store it in an encrypted device.

If you decide it's ok for the law to inspect the contents of my encrypted devices, what happens when they get the ability to inspect the contents of my brain? That will happen sooner or later, and if encrypted personal data isn't considered private then I'm confident internal personal data won't be either.

> A warrant is required to search cell phones

Not if I'm crossing the border or near any Stingray-style device.

> A split key solution would solve that problem

I don't think it would. If I have to give physical access to my device, it's as good as compromised.


> How is that any different to information I store in my head?

It's the difference between a 4th Amendment issue and a 5th Amendment issue. Whether or not the police could search your phone used to fall squarely within the bounds of the 4th Amendment. If you encrypted it yourself, it would then be a 5th Amendment issue - you have a right to not self-incriminate.

With the new iPhones, someone else (Apple) decided to encrypt your phone for you in such a way as to prevent any searches, regardless of whether or not there's a warrant involved. In doing so, Apple created a class of evidence that cannot be searched. People here tend to frame that in terms of my phone or my data - why should the police be searching me? Most of us will never have a search warrant issued on us - they exist to collect evidence of crimes and we're generally not criminals.

If you step back and look at it from a law enforcement perspective, do you really want companies that manufacture popular devices suddenly deciding that data on their products cannot be used as evidence in a crime? I'll provide my own reductio ad absurdum in response to your brain-scanning argument and ask how you would feel if the cops told you "Sorry, there's nothing we can do. It looks like your spouse was shot with an iGun."

A split key solution would definitely stop a border guard - the data is still encrypted and cannot be decrypted without cooperation from all n parties that hold the pieces of the key. No, it won't technically stop them from installing a backdoor on your laptop, but I think you're kind of shifting goalposts with that argument. We're talking about warrants to decrypt data here.


> In doing so, Apple created a class of evidence that cannot be searched.

This seems to be the fatal flaw in your argument, because you aren't recognizing the duality inherent in it.

Apple 1) created a class of evidence 2) that cannot be searched. That class of evidence didn't exist in 1776 or 1976 or 1996. The police can solve crimes without it as they've been doing for hundreds of years.

> "Sorry, there's nothing we can do. It looks like your spouse was shot with an iGun."

Sorry, there's nothing we can do... except interview witnesses and suspects, check alibis, investigate the crime scene, autopsy the body, look for motive, review surveillance footage, etc. etc.


I don't think the "police have been solving crimes for centuries" argument is very persuasive. Back in 1996, we didn't have people walking around conducting half of all of their communication through a little box that they always carry with them. What used to entail walking across town and physically talking to someone is often now just a Facebook update or text message. Searching a cell phone now is probably about comparable to searching a home in 1996 in terms of how invasive it is, but we weren't making the argument two decades ago that the police shouldn't be able to get a search warrant for your home because it's too invasive.

To continue following that logic out, neither phones in general nor surveillance existed in 1776. Does that mean the police shouldn't be able to get warrants to read someone's phone records or see surveillance footage because they could still gather evidence just fine before those existed?


> Back in 1996, we didn't have people walking around conducting half of all of their communication through a little box that they always carry with them. What used to entail walking across town and physically talking to someone is often now just a Facebook update or text message. Searching a cell phone now is probably about comparable to searching a home in 1996 in terms of how invasive it is, but we weren't making the argument two decades ago that the police shouldn't be able to get a search warrant for your home because it's too invasive.

Searching a cell phone is much more than what they would get from searching a home. Twenty years ago if a suspect walked across town two weeks before the crime and had a conversation with someone, there would be no automatic record of it even happening, much less the content of the conversation being recorded indefinitely. Even for written correspondence, people rarely keep every letter they've ever received and even less often keep a copy of every letter they've ever sent.

People have no obligation to carry around a tracking device that records everywhere they go and everything they say in a format understandable by the government.

> To continue following that logic out, neither phones in general nor surveillance existed in 1776. Does that mean the police shouldn't be able to get warrants to read someone's phone records or see surveillance footage because they could still gather evidence just fine before those existed?

You keep conflating the question of whether they can get a warrant with the utility of doing so. Encryption has existed longer than the United States. A warrant grants them the ability to look at your stuff, it doesn't imply that they'll be able to understand it, or even that you'll have kept any stuff worth looking at.

For example, shouldn't you have the same objection to Snapchat as you have to encryption? The government's warrant gives them even less if the content no longer exists than if it exists encrypted. But the idea that people should be prohibited from automatically deleting old information is pretty clearly ridiculous.


> Searching a cell phone is much more than what they would get from searching a home.

What I was trying to get at was that we interact with other people in a very different manner than we did two decades ago. I don't see any reason that the means through which police gather evidence shouldn't reflect such a change. Is it your opinion that searching a cell phone is so invasive that we shouldn't allow it with a warrant?

> Encryption has existed longer than the United States. A warrant grants them the ability to look at your stuff, it doesn't imply that they'll be able to understand it, or even that you'll have kept any stuff worth looking at.

Back then, if I wanted to keep my correspondence secure I'd pull out my disappearing ink and Vigenère ciphers and actively go about protecting what I wrote. This isn't a case of people taking steps to protect their data, it's someone else (Apple) stepping in to encrypt their data, and changing the way that they were encrypting it so as to actively prevent cooperation with law enforcement. They were cooperating before with cases involving encrypted cell phones; now they are not. This isn't a decision that a criminal suspect made to protect their data - this is a decision that a tech company made on their behalf. I don't have a problem with people actively encrypting their own data, I have a problem with a tech company making a purely political decision to change their encryption algorithm which has the potential to impact any criminal investigation that involves a new iPhone.

> For example, shouldn't you have the same objection to Snapchat as you have to encryption?

Was Snapchat assisting law enforcement with criminal investigations before?


> What I was trying to get at was that we interact with other people in a very different manner than we did two decades ago. I don't see any reason that the means through which police gather evidence shouldn't reflect such a change. Is it your opinion that searching a cell phone is so invasive that we shouldn't allow it with a warrant?

You're conflating whether they can get a warrant with whether the warrant produces anything again.

> This isn't a decision that a criminal suspect made to protect their data - this is a decision that a tech company made on their behalf.

What significance are you attributing to that distinction? Encryption on devices with the relevant hardware instructions is basically free; the user receives no practical benefit from not using it. What purpose is there in presenting the user with a choice that every rational user will make the same way?

> I don't have a problem a problem with people actively encrypting their own data, I have a problem with a tech company making a purely political decision to change their encryption algorithm which has the potential to impact any criminal investigation that involves a new iPhone.

What does it matter what they were doing before? You seem to be saying that whether Snapchat should be allowed should be based on whether or not the same company had a previous product that stored messages indefinitely. That makes no sense.


I feel like we're just going back and forth with the exact same points - when a warrant used to produce something but no longer does, it poses an issue for law enforcement. In this case, it's not the suspect that made that choice to prevent law enforcement from gaining access.

> Encryption on devices with the relevant hardware instructions is basically free; the user receives no practical benefit from not using it. What purpose is there in presenting the user with a choice that every rational user will make the same way?

It's not free - encrypting data always comes with the risk that you're going to lose the means to decrypt it. I can think of two people I know: one put full disk encryption on their cell phone, the other on their laptop. After rebooting a week or two later, both had forgotten the passwords they used and lost all of the data. Neither of them encrypts their devices anymore.

Encryption can make data recovery a pain in the butt. I've discovered encrypted e-mails in my mail archives that I no longer have the private key for - it got replaced a year prior, the new key got migrated to a new computer without the old one, and the old computer eventually got wiped and repurposed.

Yeah, it was my mistake for not backing up my old GPG key. I'd put a hefty share of the blame on my two friends for their encryption woes as well. But that's the point - we make mistakes, and strong encryption isn't very forgiving. It's a trade-off that not everyone is going to make, especially for data they don't consider sensitive; I've lived with people who have to go through password recovery every two weeks or so. Personally, I wouldn't encrypt something that I'm only going to access once every 5 years or so; I'd use some other solution like making an unencrypted backup on some removable media and storing it in a secure place.


> I feel like we're just going back and forth with the exact same points - when a warrant used to produce something but no longer does, it poses an issue for law enforcement. In this case, it's not the suspect that made that choice to prevent law enforcement from gaining access.

You keep saying these things but I don't see how they're relevant. Are you saying that if the original iPhone had the same encryption as the current one then there would be no issue for law enforcement, but because it didn't now there is? Why is that?

If your logic is to stand, doesn't that mean that all new device manufacturers need to make everything difficult for law enforcement by default or else lose their ability to decide otherwise later? Is that really the incentive you want?


> doesn't that mean that all new device manufacturers need to make everything difficult for law enforcement by default or else lose their ability to decide otherwise later?

I would ask why they're specifically designing their devices to make it impossible for law enforcement to collect evidence. In general, I want to be secure from people who would do me harm, and at the same time I want law enforcement to be able to bring anyone who does do me harm to justice. Preventing the police from lawfully collecting evidence shifts the burden entirely on me to defend myself and allows criminals to act without repercussion. I like to read cyberpunk, but I don't want to live it.

It's my firm opinion that it's possible to design systems that adequately secure a user's data from unauthorized access (to include both criminals and law enforcement acting without warrants) but at the same time allow access by law enforcement under lawful conditions (i.e. a warrant). When a major device manufacturer like Apple claims that the inability of law enforcement to access your data is a feature, I think that sets a very dangerous precedent. I want to make sure my data is protected from criminals; I don't want criminal evidence protected from the cops.

I'll turn the question around - is it right for a major consumer device manufacturer like Apple to decide that the police should not be able to collect evidence from their customers?


What Apple did was remove itself from the equation. Long before Apple, anyone could encrypt data they felt like encrypting, and trust that short of divulging the key, that data could not be decrypted. The only thing that has changed is the ease of use around applying the encryption. I don't agree that making encryption "too easy to use" should be illegal.

Of course with iPhones we are still encrypting the data ourselves. You choose to apply a PIN lock (or not). If you choose to allow a fingerprint to unlock the phone from a cold boot, then the government can collect your fingerprint and decrypt your files (fingerprints are not testimony). If you choose a weak PIN, the government can guess it and decrypt your files.
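
To put rough numbers on "weak PIN": a back-of-the-envelope estimate, assuming roughly 80 ms per guess for the hardware-bound key derivation (the figure Apple has cited for its tuning) and ignoring escalating lockout delays:

    # Worst-case time to exhaust a numeric PIN keyspace, assuming ~80 ms
    # per hardware-bound key-derivation attempt and no lockout delays.
    SECONDS_PER_GUESS = 0.080  # assumed per-attempt cost

    for digits in (4, 6, 10):
        worst_case_s = 10**digits * SECONDS_PER_GUESS
        print(f"{digits}-digit PIN: {worst_case_s / 86400:,.2f} days worst case")

    # 4-digit PIN: ~13 minutes; 6-digit PIN: ~0.9 days;
    # 10-digit PIN: ~9,259 days (~25 years).

A 4-digit PIN falls in minutes without lockouts, which is why the enforced delays and erase-after-N-attempts options matter at least as much as PIN length.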

All Apple has done is choose to design a secure encryption library, one where there is no obvious backdoor, and one where they cannot be co-opted into secretly disclosing your personal data to the government through a 3rd-party warrant. The fact is, Apple is not in possession of your data, and they don't want to be in possession of your data.

If somehow Congress manages to pass CALEA-type laws requiring Apple to maintain a backdoor into our data, we'll just bypass Apple and keep the data safe ourselves. It might take a few more years for the technology to become equally usable, but the 1st amendment guarantees our right to develop and publish and freely license the software necessary to achieve the end goal, namely, that people have the ability to control access to personal data that they themselves collect and maintain.

Thankfully Tim Cook has the experience and unique perspective on these matters to truly understand the value and necessity of being able to keep personal data private. I'm sure the path that brought him to these strongly-held personal beliefs was not easy, but I believe the world actually is a better place because of it. I am also very thankful to live in a country where Tim can help craft a device which upholds his beliefs, and it would be a sad day indeed to see that freedom stifled.

Your argument about "an encryption system installed by the manufacturer that the manufacturer itself cannot decrypt" does not make any sense. It sounds like an argument against functional encryption, which is an argument against functional computers. Please try following your thought to its logical conclusion, and consider whether it's really a country you would want to live in.


> No, it won't technically stop them from installing a backdoor on your laptop, but I think you're kind of shifting goalposts with that argument. We're talking about warrants to decrypt data here.

That backdoor can then be used to acquire my encryption key and decrypt my data, so I think it's still entirely relevant. Less relevant, but still entirely reasonable, is my objection to the idea that compromising hardware during a border crossing is even remotely acceptable!

> It's the difference between a 4th Amendment issue and a 5th Amendment issue. Whether or not the police could search your phone used to fall squarely within the bounds of the 4th Amendment. If you encrypted it yourself, it would then be a 5th Amendment issue - you have a right to not self-incriminate.

I don't think the main change between old phones which could be searched under the 4th amendment and new phones which cannot is encryption.

The main change is that an old phone used to be a relatively impersonal tool akin to a car or gun, while a new phone holds an incredible amount of intensely personal information about a person.

Seizing an old phone was no big deal; it would be as if the police wanted to seize the spade in my garage. Go for it. Use it to eliminate me from your investigations. Seizing a new phone, however, gives many of my intimate secrets to LEOs who are probably taking a hostile, confrontational stance towards me. It also gives those people an enormous amount of power over me. They now have access to my email accounts, forum accounts, personal contacts etc and could easily impersonate me or blackmail me. In theory they shouldn't take advantage of that power but I have no doubt that they would do so anyway, in some cases at least.

The reason I mentioned mind-reading (how do I say that without sounding like a kook?) is that it illustrates that point with a bit more impact. At some point it will become possible, and probably even desirable under the right circumstances, but the potential for abuse is enormous and it will need to be governed under far stronger laws than anybody is currently protected by. The 5th amendment might be adequate if it can be used to simply outlaw the practice, but I doubt that will happen. So how do you control what personal data LEOs have access to? And this exact problem exists right now with your smart phone, albeit to a lesser extent.

Another way of putting it would be: LEOs can currently request a warrant for particular searches - phone tap, call records, physical property at a specific address etc. But if they get one that covers "smartphone" (or, later, memories), that basically gives them access to everything, almost all of which will be quite personal and entirely unrelated to the case. So how do you control what they get access to?

Right now, the most secure option is to never record anything personal on any device, but that's easier said than done (and nobody on HN has achieved it!). Another option is to encrypt anything you consider private, and hope that your hardware or encryption scheme hasn't already been broken.


> That backdoor can then be used to acquire my encryption key and decrypt my data, so I think it's still entirely relevant.

Putting a backdoor on a device is an utterly ineffective way of gathering evidence. If I were taken into custody and knew that damning evidence was on my phone, I would want to make sure that it never fell into police hands. The cop's backdoor would see me walk up to my car, place my phone carefully in front of the tire, and subsequently drive over it. $400 for a new phone is worth the price to stay out of jail. That said, this is all hypothetical - unless you have some evidence to show that the cops and border patrol are routinely putting malware on phones in order to decrypt the contents?

> Seizing a new phone, however, gives many of my intimate secrets to LEOs

Which is why they should need a warrant to do so. I'm all in favor of technical solutions to prevent them from decrypting your cell phone without one.

> LEOs can currently request a warrant for particular searches - phone tap, call records, physical property at a specific address etc. But if they get one that covers "smartphone" (or, later, memories), that basically gives them access to everything, almost all of which will be quite personal and entirely unrelated to the case.

Look at your examples again - in all of them (not just the cell phone), there is the potential for the police to collect personal information that is completely unrelated to the case. That's why the police need to go before a judge and show probable cause in order to get a warrant. I don't understand why the folks on HN think it's okay for the cops to go into your home with a warrant and search through your personal effects (including your computer), but a cell phone is something completely different.


I think the answer people would be inclined to give to the last question depends a lot on how you formulate it. How about "Is the possibility that a court might decide that a cop should be allowed to see your data a reason to not allow you to hide it properly"? Should people also be legally compelled to take daily paper notes of all their potentially criminal thoughts because if they are later needed as evidence, it might be impossible to get them out of their heads even with a warrant?


In this case, though, it's not the people who are hiding it - the device manufacturer is. You've always been free to encrypt your data if you so choose. Nobody is buying a new iPhone because of the encryption; they're buying it because it's a newer, better version of an existing product - a product that police could search for evidence before if they had a warrant.

If you want to keep your data away from the police, there's never been anything stopping you. If I'm the victim of a crime, I don't want some third party deciding whether or not the police can collect the evidence.


Yeah, there are good reasons why we don't let crime victims organise the investigation or decide the perpetrator's punishment. Crime and law enforcement are not the only considerations in this case; if I'm a journalist or activist, I don't want those who I antagonise as part of my work to be able to access my data by leveraging their power and connections. More generally, if I'm a citizen, I don't want those with more guns, more money and better connections than me to be able to intimidate into silence the journalists and activists I rely on to provide a counterbalance.

Unfortunately, it seems to be much easier for a lot of people to imagine themselves as a victim of some high-profile crime (of the kind that perhaps happens to .01% of the US population every year) yearning for justice than to visualise the full extent of the small and large influences on their everyday civic life. Speaking with some degree of frustration, this reminds me of a cliché of Tea Party followers arguing along the lines of "well, if I became rich, I wouldn't want to pay taxes either" and generally being more eager to defend the interests of an imaginary future version of themselves than their own.


You're off by several orders of magnitude on your crime statistics. In 2013, 2.3% of households were victims of violent crime, while 13.1% were victims of property-related crime[1]. Do you have evidence to suggest that courts are handing out search warrants like candy to intimidate journalists? Crime is an everyday problem; harassing journalists and activists is not. A solution that prevents police from investigating a crime just because a cell phone is involved, in order to further protect rights that largely weren't being violated to begin with, isn't a particularly good solution.

[1] http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5111


Sorry, I didn't mean to imply that the probability that you will become the victim of /any/ crime is that low - rather, I argue that the sort of emotion-arousing crime that almost invariably gets thrown around as an example to argue against ubiquitous encryption (like an abductee being driven around multiple states and raped for weeks) is actually rare.

I would imagine that your typical story of being mugged or having your picket fence demolished (what does it take for a mugging to count as violent crime?) would be far less likely to arouse feelings of "this is so terrible, how can we possibly allow encryption on iPhones if people doing this will get to walk free because of it". If you think that helping some small additional fraction of those 2.3% or 13.1% of households secure a conviction of the perpetrators is more valuable than strong encryption, then it would be more intellectually honest to evoke a typical case in your argument than an extreme one.


I'm not advocating for no encryption on phones. There are schemes that would allow for your data to be encrypted and secure from even the cops except in cases where they have acquired a warrant. The argument that there is no way to do so is more political than technical. This is a solved problem, cryptographically speaking. I think it's intellectually dishonest to wave away legitimate criminal investigations but prop up harassment of activists. I think it's a dangerous precedent when, just months after the Supreme Court strikes a major win for privacy advocates by saying that all cell phone searches require a warrant, Apple turns around and essentially says that isn't good enough - now people who don't even know what encryption is will be immune from any search, with or without a warrant. When a popular tech company can make a Supreme Court ruling moot, I think there needs to be a bit more discussion on the matter.


> There are schemes that would allow for your data to be encrypted and secure from even the cops except in cases where they have acquired a warrant. The argument that there is no way to do so is more political than technical. This is a solved problem, cryptographically speaking.

It very much is not. All of the schemes that purport to do so involve a systemic risk that the master key is lost to a hostile foreign government or criminal organization, and they inherently prohibit forward secrecy.


> All of the schemes that purport to do so involve a systemic risk that the master key is lost

Look up 'secret sharing schemes' and 'threshold cryptosystems'. The idea that any scheme allowing law enforcement to decrypt a cell phone must inevitably involve a single master key is a strawman argument.
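
For readers unfamiliar with the terms, here's a minimal, toy-parameter sketch of Shamir secret sharing in Python - illustrative only; a real deployment would use a vetted library:

    # Shamir secret sharing: split a key into n shares so that any k of
    # them reconstruct it, while k-1 shares reveal nothing.
    import random

    PRIME = 2**127 - 1  # field large enough for a 128-bit key

    def split(secret, k, n):
        # Random polynomial of degree k-1; the secret is the constant term.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        def eval_poly(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, eval_poly(x)) for x in range(1, n + 1)]

    def combine(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = random.randrange(PRIME)
    shares = split(key, k=3, n=5)      # e.g. court, FBI, manufacturer, ...
    assert combine(shares[:3]) == key  # any 3 of the 5 shares suffice

No single party - or single break-in - is enough, though as the sibling comment argues, any k shares together still amount to a master key.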

> and they inherently prohibit forward secrecy.

This is a non-issue for encrypted disks, which is what law enforcement has an issue with.


> Look up 'secret sharing schemes' and 'threshold cryptosystems'. The idea that any scheme allowing law enforcement to decrypt a cell phone must inevitably involve a single master key is a strawman argument.

I'm aware of these things. But splitting a master key into five parts doesn't make it any less of a master key. The vulnerability is not in how many keys you need to open a lock, the vulnerability is in requiring the same keys to open all locks.

> This is a non-issue for encrypted disks, which is what law enforcement has an issue with.

Forward secrecy for encrypted disks is implemented by regularly changing your encryption key and destroying all copies of the old key. An attacker who can copy the encrypted contents of your disk and later compromises your key then won't be able to decrypt the copied data with it, because the current key won't decrypt the old ciphertext.
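
As a concrete sketch of that rotation, using the Python cryptography package's Fernet recipe (the key names and data are illustrative):

    # Forward secrecy for data at rest via key rotation: re-encrypt under
    # a fresh key, then destroy every copy of the old key.
    from cryptography.fernet import Fernet

    old_key = Fernet.generate_key()
    ciphertext = Fernet(old_key).encrypt(b"disk contents")

    new_key = Fernet.generate_key()
    ciphertext = Fernet(new_key).encrypt(Fernet(old_key).decrypt(ciphertext))
    del old_key  # in practice: securely erase all copies, everywhere

    # An attacker who copied the old ciphertext and later steals new_key
    # gets nothing - but an escrowed key that still decrypts the old
    # ciphertext defeats the rotation entirely.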

This inherently doesn't work if the government keeps a key that will decrypt the old ciphertext because the attacker with the old ciphertext can still compromise the government's key(s) to decrypt it.


> The vulnerability is not in how many keys you need to open a lock, the vulnerability is in requiring the same keys to open all locks.

And why would you use the same keys to open all locks? Here's a quick off-the-top-of-my-head solution:

The device manufacturer creates a public/private key pair - maybe they make a new one for each device, or maybe for efficiency they make a new pair for each batch or once a month or whatever they deem acceptable. The point is to change it regularly. The court system creates its own public/private keys, changed every two months or so. The FBI creates their own as well, let's say changed every three months.

When the device is manufactured, the current public keys for the manufacturer, court and FBI all go on the device. When the disk is first encrypted by the user, a key is generated, encrypted under the FBI's, manufacturer's and court's public keys in that order, then stored in a separate location on the disk. Later on, when the FBI gets its hands on the phone and wants to decrypt it, they send the encrypted key to the court along with the warrant application; if the court approves, they decrypt their layer and it gets sent on to the device manufacturer. They look up the serial number of the device and decrypt with the appropriate key, then send it on to the FBI. The FBI finally decrypts using their private key and can subsequently get the initial key used to encrypt the hard drive.
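
A rough sketch of that layered wrap, using PyNaCl sealed boxes (a hybrid X25519 construction, so each layer can wrap the previous layer's ciphertext regardless of size); the party names and key handling are illustrative, not a hardened design:

    # Layered key escrow: the disk key is wrapped FBI-innermost,
    # court-outermost, so unwrapping requires court -> manufacturer -> FBI.
    import os
    from nacl.public import PrivateKey, SealedBox

    # Each party's key pair for the current rotation window.
    fbi = PrivateKey.generate()
    mfr = PrivateKey.generate()
    court = PrivateKey.generate()

    disk_key = os.urandom(32)
    escrow_blob = SealedBox(court.public_key).encrypt(
        SealedBox(mfr.public_key).encrypt(
            SealedBox(fbi.public_key).encrypt(disk_key)))

    # Warrant service: each party peels only its own layer, in order.
    step1 = SealedBox(court).decrypt(escrow_blob)  # court approves warrant
    step2 = SealedBox(mfr).decrypt(step1)          # manufacturer, by serial no.
    assert SealedBox(fbi).decrypt(step2) == disk_key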

In order to decrypt a device without going through this process, you would have to get physical access to the phone and also compromise all three private keys. If you did somehow manage to get all three of the keys, you'd only be able to decrypt devices manufactured within at most a two-month time frame. If that's still not an acceptable level of risk, it can be further limited by increasing the frequency at which keys are replaced, creating multiple keys for each window, adding additional agencies with their own keys into the process, etc.

> An attacker who can copy the encrypted contents of your disk and later compromises your key then won't be able to decrypt the copied data with it

I'm trying to imagine a situation where this is actually an issue... the closest I can come up with is: a cop arrests me, fails to get a warrant, illegally copies the encrypted data off my cell phone and retains the encrypted data anyways. I'm let go and my phone is returned, and I subsequently delete all of the incriminating evidence from my phone (overlooking the fact that destroying evidence is a crime). The police later arrest me for something else, confiscate my phone and this time they do get a warrant to search it. Now they decrypt their old data and discover the files I deleted, none of which is admissible in court because it was illegally acquired.

I personally think it's a little far fetched for your average criminal suspect, but I'll play along and say that it's maybe within the realm of possibility for someone high-value enough. I suppose the simple solution would be to do something like use a file system that keeps some sort of hash of the file structure and last modification time, so that you could prove that the file in question didn't come from the data they were authorized to collect. I think that's probably going too deep into performance trade-offs for something that's unlikely to occur.

If you're that worried about incriminating evidence that was left undiscovered on your phone, the simpler solution is to just get a new phone. I wouldn't trust any device that an adversary had physical control over then handed back to me. In this case, why would someone risk eventual conviction to save $400 for a new phone?


> And why would you use the same keys to open all locks?

Because it's an inherent characteristic of the outcome you're looking for. In order for the government to be able to decrypt any encrypted disk, there would have to be some process the result of which is the ability to decrypt any encrypted disk. Fiddling with the internals doesn't change the nature of it.

> I'm trying to imagine a situation where this is actually an issue... the closest I can come up with is: a cop arrests me, fails to get a warrant, illegally copies the encrypted data off my cell phone and retains the encrypted data anyways.

You're assuming the attacker is US law enforcement. Try this one: A foreign government (e.g. China) takes your encrypted device at a border crossing for long enough to have copied it. You stop using that key forever to make sure they never have a chance to steal it and use it to decrypt their copy of your secrets.

But under your system the foreign government can keep a copy of everyone's device until they sufficiently infiltrate the US government and then decrypt all the years of data they've been collecting, at which point China gets everyone's trade secrets, the list of democracy advocates in their country, etc.


> there would have to be some process the result of which is the ability to decrypt any encrypted disk. Fiddling with the internals doesn't change the nature of it.

Yes, that process is called allowing authorized users to access the data. The owner is an authorized user, and there's a process for him/her to access the data on the phone. A cop with a valid warrant is just as legally authorized to access that data as the user is.

> But under your system the foreign government can keep a copy of everyone's device until they sufficiently infiltrate the US government and then decrypt all the years of data they've been collecting

Then make it more time-consuming and cumbersome to break the key escrow than it is to just break the user's password. There are all sorts of things you can do: airgap all of the keys; move them to a different airgapped system after a couple of years; archive them to an encrypted tape after a few more years; generate multiple keys for each time period, allow the device manufacturer to choose one at random, then store the private keys at different locations. Imagine how happy China will be if they spend 15 years trying to break into the systems at FBI Washington only to discover that the key they were after is stored on an encrypted backup tape at FBI Boston.

At this point I have to stop and ask how much protection we're devoting to the task. Technically, no security system stops a determined adversary - it just slows them down. The idea is to either make the enemy have to expend more time and resources to get their data than they are actually willing to expend, or slow them down long enough to be caught. I used to work for the military in a secure facility a little over a decade ago. We kept our sensitive documents locked in safes. Safes are rated based on how long it would take a trained safecracker to break into them. The safes we used were rated at around 45 minutes each - and they were behind a thick vault door rated at about an hour. Why was this acceptable? Because our sensitive documents were all behind that door and split between multiple safes, security walked by the facility every half hour and it took less than two minutes for a large group of armed guards to get there.

While we want to slow down our adversary, at the same time an authorized user needs to be able to get to the data when they need it. The use case for the device owner is different from that of a cop with a warrant. It's acceptable to have a lengthy, somewhat cumbersome process for the police with more security in place, since a cop doesn't need to gain access to a user's phone multiple times a day - they'll generally never have to access the contents, and if they do it's probably only going to be once. An iPhone user isn't going to tolerate getting cryptographically secure sign-offs stating that they are authorized to access the device from multiple secured facilities every time they turn on the device. A cop looking into a serious criminal investigation will.

Now I'd ask who it is that we're designing the system to protect against? How determined and well funded are they? Am I designing a system to protect the user from someone willing to devote a nation state's resources to breaking it, or am I designing a system that will protect a user from data theft by criminals? Apple's existing encryption system won't stop them from getting your data if they really want it - especially when you are physically in China. Walking around with a cell phone in your pocket is itself a huge security vulnerability. If we redesign the iPhone to be a computing device that's indefinitely secure against Chinese intelligence services, it ceases to be a cell phone and instead becomes a standalone computer sitting under armed guard in the basement of the Pentagon.

If you're travelling through China with sensitive data that the government wants, perhaps you should reevaluate storing it on your cell phone.


> In this case, though, it's not the people who are hiding it - the device manufacturer is.

That is not correct. What Apple has done is give control over whether the information is hidden or not to the person with the password for the device.

> Nobody is buying a new iPhone because of the encryption; they're buying it because it's a newer, better version of an existing product - a product that police could search for evidence before if they had a warrant.

You can't win this by making a relativistic argument. The world was not in a state of anarchy before everyone started carrying around iPhones.



