I've always found examples like the one with the trucker, which keep getting brought up in this context, extremely disingenuous. The same sort of argument could easily be used for any tactic that ever resulted in anyone being convicted of something they most likely did - in a hypothetical alternative scenario, they might just as well have been extolling the benefits of torture by describing how, after having a few of his fingernails pulled out, the scumbag trucker was more than happy to confess to his crimes.
Nobody is doubting that giving the prosecutors more powers to produce incriminating material will produce more convictions. Emotionally charged anecdotes like this just serve to distract from the discussion about proportionality and possible abuses that we should be having, while providing no relevant information.
I think it's also important to point out that this sort of evidence didn't exist until very recently. Before smartphones with video capability existed, the FBI wasn't insisting that technology companies develop them and that consumers adopt them. Now that we have them they're insisting that we give them unencrypted access to incriminate ourselves.
Nonetheless, the American Founding Fathers recognized the core problem and addressed it in a clear, comprehensive manner: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." In no way could that be construed as permitting a federal requirement that personal communications & records be arbitrarily accessible by law enforcement agents on a "fishing expedition".
No, but it could be construed as permitting a federal requirement that personal communications & records be accessible by law enforcement agents who have been issued a warrant based upon probable cause and describing the place to be searched and the persons or things to be seized.
Disagree. They understood encryption, and enumerated rights to free speech and not testifying against oneself. Warrants empower government agents to acquire information, but not compel the accused to explain it.
Another disingenuous thing about that example is that law enforcement really didn't do anything special. They got lucky through the criminal's own stupidity. In other words, law enforcement really didn't do much in terms of law enforcing by utilizing the powers that were available to them and will now be stripped away in an all-encrypted world.
In the new world, the criminal could just as easily do something stupid that leaves the data unencrypted and lets them find the damning evidence.
In a sense, all evidence is the result of the criminal's own stupidity and all crimes are solved based on law enforcement getting lucky. If the criminals were smart, they wouldn't leave any evidence behind.
The issue from law enforcement's perspective is that there is now a class of evidence that was previously accessible to them, but is no longer accessible not because the criminal got smarter, but because Apple said the police shouldn't have access to it.
A warrant is already required to search a cell phone. Is there any reason why a cop shouldn't be allowed to search a phone when they went to court, presented their probable cause to a judge and were subsequently issued a warrant?
Detectives lawfully obtaining a warrant and performing their search within the limitations of that warrant is not a problem. That's what law enforcement is supposed to do.
The problem is that a warrant _isn't_ required to search a cellphone, laptop or any other personal device.
Those devices can be seized and searched at the border, and any data the device sends at any other time can be intercepted by Stingray-style devices or as part of a mass data collection program. And law enforcement agencies are pushing for more access, more of the time. I don't think they should have any access unless they have a valid warrant for a specific crime.
> Detectives lawfully obtaining a warrant and performing their search within the limitations of that warrant is not a problem.
An encryption system installed by the manufacturer that the manufacturer itself cannot decrypt makes it a problem. Even if a cop showed all of the probable cause necessary and obtained a warrant from a judge, he would still be unable to search the device - not because the suspect took steps to protect the evidence, but because the device manufacturer did.
> The problem is that a warrant _isn't_ required to search a cellphone, laptop or any other personal device.
A warrant is required to search cell phones. See Riley v. California [1], which went all the way to the Supreme Court. The ruling will likely apply to laptops, etc., as well - the opinion went so far as to refer to cell phones as "minicomputers".
> Those devices can be seized and searched at the border [...]
A split key solution would solve that problem - border patrol/the police couldn't search the device without obtaining a warrant and getting the device manufacturer to decrypt it.
> Even if a cop showed all of the probable cause necessary and obtained a warrant from a judge, he would still be unable to search the device
How is that any different to information I store in my head? I can't be compelled to reveal incriminating information that I hold in my head, and I don't see why I should be compelled to reveal the same information if I chose to store it in an encrypted device.
If you decide it's ok for the law to inspect the contents of my encrypted devices, what happens when they get the ability to inspect the contents of my brain? That will happen sooner or later, and if encrypted personal data isn't considered private then I'm confident internal personal data won't be either.
> A warrant is required to search cell phones
Not if I'm crossing the border or near any Stingray style device
> A split key solution would solve that problem
I don't think it would. If I have to give physical access to my device, it's as good as compromised
> How is that any different to information I store in my head?
It's the difference between a 4th Amendment issue and a 5th Amendment issue. Whether or not the police could search your phone used to fall squarely within the bounds of the 4th Amendment. If you encrypted it yourself, it would then be a 5th Amendment issue - you have a right to not self-incriminate.
With the new iPhones, someone else (Apple) decided to encrypt your phone for you in such a way as to prevent any searches, regardless of whether or not there's a warrant involved. In doing so, Apple created a class of evidence that cannot be searched. People here tend to frame that in terms of my phone or my data - why should the police be searching me? Most of us will never have a search warrant issued on us - they exist to collect evidence of crimes and we're generally not criminals. If you step back and look at it from a law enforcement perspective, do you really want companies that manufacture popular devices suddenly deciding that data on their products cannot be used as evidence in a crime? I'll provide my own reductio ad absurdum in response to your brain-scanning argument and ask how you would feel if the cops told you "Sorry, there's nothing we can do. It looks like your spouse was shot with an iGun."
A split key solution would definitely stop a border guard - the data is still encrypted and cannot be decrypted without cooperation from all n parties that hold the pieces of the key. No, it won't technically stop them from installing a backdoor on your laptop, but I think you're kind of shifting goalposts with that argument. We're talking about warrants to decrypt data here.
> In doing so, Apple created a class of evidence that cannot be searched.
This seems to be the fatal flaw in your argument, because you aren't recognizing the duality inherent in it.
Apple 1) created a class of evidence 2) that cannot be searched. That class of evidence didn't exist in 1776 or 1976 or 1996. The police can solve crimes without it as they've been doing for hundreds of years.
> "Sorry, there's nothing we can do. It looks like your spouse was shot with an iGun."
Sorry, there's nothing we can do... except interview witnesses and suspects, check alibis, investigate the crime scene, autopsy the body, look for motive, review surveillance footage, etc. etc.
I don't think the "police have been solving crimes for centuries" argument is very persuasive. Back in 1996, we didn't have people walking around conducting half of all of their communication through a little box that they always carry with them. What used to entail walking across town and physically talking to someone is often now just a Facebook update or text message. Searching a cell phone now is probably about comparable to searching a home in 1996 in terms of how invasive it is, but we weren't making the argument two decades ago that the police shouldn't be able to get a search warrant for your home because it's too invasive.
To continue following that logic out, neither phones in general nor surveillance existed in 1776. Does that mean the police shouldn't be able to get warrants to read someone's phone records or see surveillance footage because they could still gather evidence just fine before those existed?
> Back in 1996, we didn't have people walking around conducting half of all of their communication through a little box that they always carry with them. What used to entail walking across town and physically talking to someone is often now just a Facebook update or text message. Searching a cell phone now is probably about comparable to searching a home in 1996 in terms of how invasive it is, but we weren't making the argument two decades ago that the police shouldn't be able to get a search warrant for your home because it's too invasive.
Searching a cell phone is much more than what they would get from searching a home. Twenty years ago if a suspect walked across town two weeks before the crime and had a conversation with someone, there would be no automatic record of it even happening, much less the content of the conversation being recorded indefinitely. Even for written correspondence, people rarely keep every letter they've ever received and even less often keep a copy of every letter they've ever sent.
People have no obligation to carry around a tracking device that records everywhere they go and everything they say in a format understandable by the government.
> To continue following that logic out, neither phones in general nor surveillance existed in 1776. Does that mean the police shouldn't be able to get warrants to read someone's phone records or see surveillance footage because they could still gather evidence just fine before those existed?
You keep conflating the question of whether they can get a warrant with the utility of doing so. Encryption has existed longer than the United States. A warrant grants them the ability to look at your stuff, it doesn't imply that they'll be able to understand it, or even that you'll have kept any stuff worth looking at.
For example, shouldn't you have the same objection to Snapchat as you have to encryption? The government's warrant gives them even less if the content no longer exists than if it exists encrypted. But the idea that people should be prohibited from automatically deleting old information is pretty clearly ridiculous.
> Searching a cell phone is much more than what they would get from searching a home.
What I was trying to get at was that we interact with other people in a very different manner than we did two decades ago. I don't see any reason that the means through which police gather evidence shouldn't reflect such a change. Is it your opinion that searching a cell phone is so invasive that we shouldn't allow it with a warrant?
> Encryption has existed longer than the United States. A warrant grants them the ability to look at your stuff, it doesn't imply that they'll be able to understand it, or even that you'll have kept any stuff worth looking at.
Back then, if I wanted to keep my correspondence secure I'd pull out my disappearing ink and Vigenère ciphers and actively go about protecting what I wrote. This isn't a case of people taking steps to protect their data, it's someone else (Apple) stepping in to encrypt their data, and changing the way that they were encrypting it so as to actively prevent cooperation with law enforcement. They were cooperating before with cases involving encrypted cell phones, now they are not. This isn't a decision that a criminal suspect made to protect their data - this is a decision that a tech company made on their behalf. I don't have a problem with people actively encrypting their own data, I have a problem with a tech company making a purely political decision to change their encryption algorithm which has the potential to impact any criminal investigation that involves a new iPhone.
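As a historical footnote, that era's "strong" crypto really was that simple - here's a minimal, illustrative Vigenère cipher in Python (trivially breakable by modern standards, of course):

```python
# Minimal Vigenère cipher, for illustration only - broken by modern standards.
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    key = key.upper()
    out, i = [], 0
    for ch in text.upper():
        if ch.isalpha():
            shift = ord(key[i % len(key)]) - ord("A")
            if decrypt:
                shift = -shift
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
            i += 1
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

ct = vigenere("meet me at dawn", "LEMON")
assert vigenere(ct, "LEMON", decrypt=True) == "MEET ME AT DAWN"
```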
> For example, shouldn't you have the same objection to Snapchat as you have to encryption?
Was Snapchat assisting law enforcement with criminal investigations before?
> What I was trying to get at was that we interact with other people in a very different manner than we did two decades ago. I don't see any reason that the means through which police gather evidence shouldn't reflect such a change. Is it your opinion that searching a cell phone is so invasive that we shouldn't allow it with a warrant?
You're conflating whether they can get a warrant with whether the warrant produces anything again.
> This isn't a decision that a criminal suspect made to protect their data - this is a decision that a tech company made on their behalf.
What significance are you attributing to that distinction? Encryption on devices with the relevant hardware instructions is basically free; the user receives no practical benefit from not using it. What purpose is there in presenting the user with a choice that every rational user will make the same way?
> I don't have a problem with people actively encrypting their own data, I have a problem with a tech company making a purely political decision to change their encryption algorithm which has the potential to impact any criminal investigation that involves a new iPhone.
What does it matter what they were doing before? You seem to be saying that whether Snapchat should be allowed should be based on whether or not the same company had a previous product that stored messages indefinitely. That makes no sense.
I feel like we're just going back and forth with the exact same points - when a warrant that used to produce something no longer does, it poses an issue for law enforcement. In this case, it's not the suspect who made the choice that prevents law enforcement from gaining access.
> Encryption on devices with the relevant hardware instructions is basically free; the user receives no practical benefit from not using it. What purpose is there in presenting the user with a choice that every rational user will make the same way?
It's not free - encrypting data always comes with the risk that you're going to lose the means to decrypt it. I can think of two people I know who in one case put full disk encryption on their cell phone and in the other case put full disk encryption on their laptop. After rebooting a week or two later, both had forgotten the passwords they used and lost all of the data on them. Neither of them encrypts their devices anymore.
Encryption can make data recovery a pain in the butt. I've had encrypted e-mails in my mail archives that I've discovered that I no longer have the private key for - it got replaced a year prior, the new key got migrated to a new computer without the old one, and the old computer eventually got wiped and repurposed.
Yeah, it was my mistake for not backing up my old GPG key. I'd put a hefty share of the blame on my two friends for their encryption woes as well. But that's the point - we make mistakes, and strong encryption isn't very forgiving. It's a trade-off that not everyone is going to make, especially for data they don't consider sensitive; I've lived with people who have to go through password recovery every two weeks or so. Personally, I wouldn't encrypt something that I'm only going to access once every 5 years or so; I'd use some other solution like making an unencrypted backup on some removable media and storing it in a secure place.
> I feel like we're just going back and forth with the exact same points - when a warrant that used to produce something no longer does, it poses an issue for law enforcement. In this case, it's not the suspect who made the choice that prevents law enforcement from gaining access.
You keep saying these things but I don't see how they're relevant. Are you saying that if the original iPhone had the same encryption as the current one then there would be no issue for law enforcement, but because it didn't now there is? Why is that?
If your logic is to stand, doesn't that mean that all new device manufacturers need to make everything difficult for law enforcement by default or else lose their ability to decide otherwise later? Is that really the incentive you want?
> doesn't that mean that all new device manufacturers need to make everything difficult for law enforcement by default or else lose their ability to decide otherwise later?
I would ask why they're specifically designing their devices to make it impossible for law enforcement to collect evidence. In general, I want to be secure from people who would do me harm, and at the same time I want law enforcement to be able to bring anyone who does do me harm to justice. Preventing the police from lawfully collecting evidence shifts the burden of defending myself entirely onto me and allows criminals to act without repercussion. I like to read cyberpunk, but I don't want to live it.
It's my firm opinion that it's possible to design systems that adequately secure a user's data from unauthorized access (to include both criminals and law enforcement acting without warrants) but at the same time allow access by law enforcement under lawful conditions (i.e. a warrant). When a major device manufacturer like Apple claims that the inability of law enforcement to access your data is a feature, I think that sets a very dangerous precedent. I want to make sure my data is protected from criminals; I don't want criminal evidence protected from the cops.
I'll turn the question around - is it right for a major consumer device manufacturer like Apple to decide that the police should not be able to collect evidence from their customers?
What Apple did was remove itself from the equation. Long before Apple, anyone could encrypt data they felt like encrypting, and trust that short of divulging the key, that data could not be decrypted. The only thing that has changed is the ease of use around applying the encryption. I don't agree that making encryption "too easy to use" should be illegal.
Of course with iPhones we are still encrypting the data ourselves. You choose to apply a PIN lock (or not). If you choose to allow a fingerprint to unlock the phone from a cold boot, then the government can collect your fingerprint and decrypt your files (fingerprints are not testimony). If you choose a weak PIN, the government can guess it and decrypt your files.
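To make the weak-PIN point concrete, here's a toy sketch of why a 4-digit PIN only protects you while guesses are rate-limited in hardware - the offline attack below (a hypothetical, unthrottled scenario; exactly what secure-enclave-style hardware is designed to prevent) exhausts the whole keyspace:

```python
import hashlib

# Toy model: an encryption key derived from a 4-digit PIN. Assumes an
# attacker who can test keys offline with no rate limiting.
SALT = b"per-device-salt"
ITERATIONS = 10_000  # kept low so the demo runs quickly

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

target = derive_key("4071")  # stands in for the user's unknown PIN

# All 10,000 possibilities are trivial to exhaust:
for i in range(10_000):
    candidate = f"{i:04d}"
    if derive_key(candidate) == target:
        print("recovered PIN:", candidate)
        break
```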
All Apple has done is choose to design a secure encryption library, one where there is no obvious backdoor, and one where they cannot be co-opted into secretly disclosing your personal data to the government through a 3rd-party warrant. The fact is, Apple is not in possession of your data, and they don't want to be in possession of your data.
If somehow Congress manages to pass CALEA-type laws requiring Apple to maintain a backdoor into our data, we'll just bypass Apple and keep the data safe ourselves. It might take a few more years for the technology to become equally usable, but the 1st amendment guarantees our right to develop and publish and freely license the software necessary to achieve the end goal, namely, that people have the ability to control access to personal data that they themselves collect and maintain.
Thankfully Tim Cook has the experience and unique perspective on these matters to truly understand the value and necessity of being able to keep personal data private. I'm sure the path that brought him to these strongly-held personal beliefs was not easy, but I believe the world actually is a better place because of it. I am also very thankful to live in a country where Tim can help craft a device which upholds his beliefs, and it would be a sad day indeed to see that freedom stifled.
Your argument about "an encryption system installed by the manufacturer that the manufacturer itself cannot decrypt" does not make any sense. It sounds like an argument against functional encryption, which is an argument against functional computers. Please try following your thought to its logical conclusion, and consider whether the result is really a country you would want to live in.
> No, it won't technically stop them from installing a backdoor on your laptop, but I think you're kind of shifting goalposts with that argument. We're talking about warrants to decrypt data here.
That backdoor can then be used to acquire my encryption key and decrypt my data, so I think it's still entirely relevant. Less relevant but still entirely reasonable is my concern that compromising hardware during a border crossing is considered even remotely acceptable!
> It's the difference between a 4th Amendment issue and a 5th Amendment issue. Whether or not the police could search your phone used to fall squarely within the bounds of the 4th Amendment. If you encrypted it yourself, it would then be a 5th Amendment issue - you have a right to not self-incriminate.
I don't think the main change between old phones which could be searched under the 4th amendment and new phones which cannot is encryption.
The main change is that an old phone used to be a relatively impersonal tool akin to a car or gun, while a new phone holds an incredible amount of intensely personal information about a person.
Seizing an old phone was no big deal; it would be as if the police wanted to seize the spade in my garage. Go for it. Use it to eliminate me from your investigations. Seizing a new phone, however, gives many of my intimate secrets to LEOs who are probably taking a hostile, confrontational stance towards me. It also gives those people an enormous amount of power over me. They now have access to my email accounts, forum accounts, personal contacts etc and could easily impersonate me or blackmail me. In theory they shouldn't take advantage of that power but I have no doubt that they would do so anyway, in some cases at least.
The reason I mentioned mind-reading (how do I say that without sounding like a kook?) is that it illustrates that point with a bit more impact. At some point it will become possible, and probably even desirable under the right circumstances, but the potential for abuse is enormous and it will need to be governed under far stronger laws than anybody is currently protected by. The 5th amendment might be adequate if it can be used to simply outlaw the practice, but I doubt that will happen. So how do you control what personal data LEOs have access to? And this exact problem exists right now with your smart phone, albeit to a lesser extent.
Another way of putting it would be: LEOs can currently request a warrant for particular searches - phone tap, call records, physical property at a specific address etc. But if they get one that covers "smartphone" (or, later, memories), that basically gives them access to everything, almost all of which will be quite personal and entirely unrelated to the case. So how do you control what they get access to?
Right now, the most secure option is to never record anything personal on any device, but that's easier said than done (and nobody on HN has achieved it!). Another option is to encrypt anything you consider private, and hope that your hardware or encryption scheme hasn't already been broken.
> That backdoor can then be used to acquire my encryption key and decrypt my data, so I think it's still entirely relevant.
Putting a backdoor on a device is an utterly ineffective way of gathering evidence. If I were taken into custody and knew that damning evidence was on my phone, I would want to make sure that it never fell into police hands. The cop's backdoor would see me walk up to my car, place my phone carefully in front of the tire, and subsequently drive over it. $400 for a new phone is worth the price to stay out of jail. That said, this is all hypothetical - unless you have some evidence to show that the cops and border patrol are routinely putting malware on phones in order to decrypt the contents?
> Seizing a new phone, however, gives many of my intimate secrets to LEOs
Which is why they should need a warrant to do so. I'm all in favor of technical solutions to prevent them from decrypting your cell phone without one.
> LEOs can currently request a warrant for particular searches - phone tap, call records, physical property at a specific address etc. But if they get one that covers "smartphone" (or, later, memories), that basically gives them access to everything, almost all of which will be quite personal and entirely unrelated to the case.
Look at your examples again - in all of them (not just the cell phone), there is the potential for the police to collect personal information that is completely unrelated to the case. That's why the police need to go before a judge and show probable cause in order to get a warrant. I don't understand why the folks on HN think it's okay for the cops to go into your home with a warrant and search through your personal effects (including your computer), but a cell phone is something completely different.
I think the answer people would be inclined to give to the last question depends a lot on how you formulate it. How about "Is the possibility that a court might decide that a cop should be allowed to see your data a reason to not allow you to hide it properly"? Should people also be legally compelled to take daily paper notes of all their potentially criminal thoughts because if they are later needed as evidence, it might be impossible to get them out of their heads even with a warrant?
In this case, though, it's not the people who are hiding it - the device manufacturer is. You've always been free to encrypt your data if you so choose. Nobody is buying a new iPhone because of the encryption; they're buying it because it's a newer, better version of an existing product - a product that police could search for evidence before if they had a warrant.
If you want to keep your data away from the police, there's never been anything stopping you. If I'm the victim of a crime, I don't want some third party deciding whether or not the police can collect the evidence.
Yeah, there are good reasons why we don't let crime victims organise the investigation or decide the perpetrator's punishment. Crime and law enforcement are not the only consideration in this case; if I'm a journalist or activist, I don't want those who I antagonise as part of my work to be able to access my data by leveraging their power and connections. More generally, if I'm a citizen, I don't want those with more guns, more money and better connections than me to be able to intimidate into silence the journalists and activists I rely on to provide a counterbalance.
Unfortunately, it seems to be much easier for a lot of people to imagine themselves as a victim of some high-profile crime (of the kind that perhaps happens to .01% of the US population every year) yearning for justice than to visualise the full extent of the small and large influence on their everyday civic life.
Speaking with some degree of frustration, this reminds me of a cliché of Tea Party followers arguing along the lines of "well, if I became rich, I wouldn't want to pay taxes either" and generally being more eager to defend the interests of an imaginary future version of themselves than their own ones.
You're off by several orders of magnitude on your crime statistics. In 2013, 2.3% of households were victims of violent crime, while 13.1% were victims of property-related crime[1]. Do you have evidence to suggest that courts are handing out search warrants like candy to intimidate journalists? Crime is an everyday problem; harassing journalists and activists is not. A solution that prevents police from investigating a crime just because a cell phone is involved in order to further protect rights that largely weren't being violated to begin with isn't a particularly good solution.
Sorry, I didn't mean to imply that the probability that you will become the victim of /any/ crime is that low - rather, I argue that the sort of emotion-arousing crime that almost invariably gets thrown around as an example to argue against ubiquitous encryption (like an abductee being driven around multiple states and raped for weeks) is actually rare.
I would imagine that your typical story of being mugged or having your picket fence demolished (what does it take for a mugging to count as violent crime?) would be far less likely to arouse feelings of "this is so terrible, how can we possibly allow encryption on iPhones if people doing this will get to walk free because of it". If you think that helping some small additional fraction of those 2.3% or 13.1% of households secure a conviction of the perpetrators is more valuable than strong encryption, then it would be more intellectually honest to evoke a typical case in your argument than an extreme one.
I'm not advocating for no encryption on phones. There are schemes that would allow for your data to be encrypted and secure from even the cops except in cases where they have acquired a warrant. The argument that there is no way to do so is more political than technical. This is a solved problem, cryptographically speaking. I think it's intellectually dishonest to wave away legitimate criminal investigations but prop up harassment of activists. I think it's a dangerous precedent when just months after the Supreme Court strikes a major win for privacy advocates by saying that all cell phone searches require a warrant, Apple turns around and essentially says that isn't good enough - now people who don't even know what encryption is will be immune from any search, with or without a warrant. When a popular tech company can make a Supreme Court ruling moot, I think there needs to be a bit more discussion on the matter.
> There are schemes that would allow for your data to be encrypted and secure from even the cops except in cases where they have acquired a warrant. The argument that there is no way to do so is more political than technical. This is a solved problem, cryptographically speaking.
It very much is not. All of the schemes that purport to do so involve a systemic risk that the master key is lost to a hostile foreign government or criminal organization, and they inherently prohibit forward secrecy.
> All of the schemes that purport to do so involve a systemic risk that the master key is lost
Look up 'secret sharing schemes' and 'threshold cryptosystems'. The idea that any scheme allowing law enforcement to decrypt a cell phone must inevitably involve a single master key is a strawman argument.
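For anyone who hasn't run into these before, here's a minimal, illustrative sketch of Shamir's secret sharing - the textbook threshold scheme - in which any k of n shares reconstruct the secret and fewer than k reveal nothing (toy Python 3.8+ code, not production crypto):

```python
import random

# Shamir's secret sharing over GF(p), illustration only.
P = 2**127 - 1  # a Mersenne prime larger than any secret we split here

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, n=5, k=3)
assert reconstruct(shares[:3]) == 123456789   # any three shares suffice
assert reconstruct(shares[2:5]) == 123456789
```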
> and they inherently prohibit forward secrecy.
This is a non-issue for encrypted disks, which is what law enforcement has an issue with.
> Look up 'secret sharing schemes' and 'threshold cryptosystems'. The idea that any scheme allowing law enforcement to decrypt a cell phone must inevitably involve a single master key is a strawman argument.
I'm aware of these things. But splitting a master key into five parts doesn't make it any less of a master key. The vulnerability is not in how many keys you need to open a lock, the vulnerability is in requiring the same keys to open all locks.
> This is a non-issue for encrypted disks, which is what law enforcement has an issue with.
Forward secrecy for encrypted disks is implemented by regularly changing your encryption key and destroying all copies of the old key. An attacker who can copy the encrypted contents of your disk and later compromises your key then won't be able to decrypt the copied data with it, because the current key won't decrypt the old ciphertext.
This inherently doesn't work if the government keeps a key that will decrypt the old ciphertext because the attacker with the old ciphertext can still compromise the government's key(s) to decrypt it.
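To sketch that rotation in code (a toy illustration; it assumes the third-party `cryptography` package and hand-waves secure key destruction):

```python
from cryptography.fernet import Fernet, InvalidToken

# An attacker copies the encrypted disk today...
old_key = Fernet.generate_key()
snapshot = Fernet(old_key).encrypt(b"disk contents")

# ...but we rotate: re-encrypt under a fresh key and destroy the old one.
new_key = Fernet.generate_key()
disk = Fernet(new_key).encrypt(Fernet(old_key).decrypt(snapshot))
del old_key  # real systems must scrub every stored copy, not just this variable

assert Fernet(new_key).decrypt(disk) == b"disk contents"  # owner keeps access

# Later the attacker compromises the *current* key. The old snapshot
# stays sealed - that's forward secrecy.
try:
    Fernet(new_key).decrypt(snapshot)
except InvalidToken:
    print("old snapshot is still unreadable")

# A government escrow key that can decrypt the old ciphertext defeats
# this property: the attacker just targets the escrow instead.
```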
> The vulnerability is not in how many keys you need to open a lock, the vulnerability is in requiring the same keys to open all locks.
And why would you use the same keys to open all locks? Here's a quick off-the-top-of-my-head solution:
The device manufacturer creates a public/private key pair - maybe they make a new one for each device, or maybe for efficiency they make a new pair for each batch or once a month or whatever they deem acceptable. The point is to change it regularly. The court system creates its own public/private keys, changed every two months or so. The FBI creates their own as well, let's say changed every three months.
When the device is manufactured, the current public keys for the manufacturer, court and FBI all go on the device. When the disk is first encrypted by the user, a key is generated, encrypted with the FBI's, the manufacturer's and the court's public keys in that order, then stored in a separate location on the disk. Later on when the FBI gets its hands on the phone and wants to decrypt it, they send the encrypted key to the court along with the warrant application; if the court approves, they decrypt it and it gets sent on to the device manufacturer. They look up the serial number of the device and decrypt with the appropriate key, then send it on to the FBI. The FBI finally decrypts using their private key and can subsequently get the initial key used to encrypt the hard drive.
In order to decrypt a device without going through this process, you would have to get physical access to phone and also compromise all three private keys. If you did somehow manage to get all three of the keys, you'd only be able to decrypt devices manufactured within at most a two-month time frame. If that's still not an acceptable level of risk, it can be further limited by increasing the frequency at which keys are replaced, creating multiple keys for each window, adding additional agencies with their own keys into the process, etc.
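Here's a rough sketch of that layered wrapping using PyNaCl sealed boxes - purely illustrative, with hypothetical party names, and ignoring all the key distribution and rotation machinery described above:

```python
from nacl.public import PrivateKey, SealedBox
from nacl.utils import random as random_bytes

# Each party holds its own key pair (rotated on its own schedule).
fbi, vendor, court = (PrivateKey.generate() for _ in range(3))

# At first encryption, the device wraps its disk key under the FBI's,
# the manufacturer's, then the court's public key, innermost first.
disk_key = random_bytes(32)
escrow = SealedBox(fbi.public_key).encrypt(disk_key)
escrow = SealedBox(vendor.public_key).encrypt(escrow)
escrow = SealedBox(court.public_key).encrypt(escrow)

# Warrant time: each party peels exactly one layer, in reverse order.
step1 = SealedBox(court).decrypt(escrow)   # court signs off on the warrant
step2 = SealedBox(vendor).decrypt(step1)   # manufacturer matches the device
recovered = SealedBox(fbi).decrypt(step2)  # FBI finally holds the disk key
assert recovered == disk_key

# No single party - and no pair of parties - can decrypt alone.
```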
> An attacker who can copy the encrypted contents of your disk and later compromises your key then won't be able to decrypt the copied data with it
I'm trying to imagine a situation where this is actually an issue... the closest I can come up with is: a cop arrests me, fails to get a warrant, illegally copies the encrypted data off my cell phone and retains the encrypted data anyways. I'm let go and my phone is returned, and I subsequently delete all of the incriminating evidence from my phone (overlooking the fact that destroying evidence is a crime). The police later arrest me for something else, confiscate my phone and this time they do get a warrant to search it. Now they decrypt their old data and discover the files I deleted, none of which is admissible in court because it was illegally acquired.
I personally think it's a little far fetched for your average criminal suspect, but I'll play along and say that it's maybe within the realm of possibility for someone high-value enough. I suppose the simple solution would be to do something like use a file system that keeps some sort of hash of the file structure and last modification time, so that you could prove that the file in question didn't come from the data they were authorized to collect. I think that's probably going too deep into performance trade-offs for something that's unlikely to occur.
If you're that worried about incriminating evidence that was left undiscovered on your phone, the simpler solution is to just get a new phone. I wouldn't trust any device that an adversary had physical control over then handed back to me. In this case, why would someone risk eventual conviction to save $400 for a new phone?
> And why would you use the same keys to open all locks?
Because it's an inherent characteristic of the outcome you're looking for. In order for the government to be able to decrypt any encrypted disk, there would have to be some process the result of which is the ability to decrypt any encrypted disk. Fiddling with the internals doesn't change the nature of it.
> I'm trying to imagine a situation where this is actually an issue... the closest I can come up with is: a cop arrests me, fails to get a warrant, illegally copies the encrypted data off my cell phone and retains the encrypted data anyways.
You're assuming the attacker is US law enforcement. Try this one: A foreign government (e.g. China) takes your encrypted device at a border crossing for long enough to have copied it. You stop using that key forever to make sure they never have a chance to steal it and use it to decrypt their copy of your secrets.
But under your system the foreign government can keep a copy of everyone's device until they sufficiently infiltrate the US government and then decrypt all the years of data they've been collecting, at which point China gets everyone's trade secrets, the list of democracy advocates in their country, etc.
> there would have to be some process the result of which is the ability to decrypt any encrypted disk. Fiddling with the internals doesn't change the nature of it.
Yes, that process is called allowing authorized users to access the data. The owner is an authorized user, and there's a process for him/her to access the data on the phone. A cop with a valid warrant is just as legally authorized to access that data as the user is.
> But under your system the foreign government can keep a copy of everyone's device until they sufficiently infiltrate the US government and then decrypt all the years of data they've been collecting
Then make it more time consuming and cumbersome to break the key escrow than it is to just break the user's password. There are all sorts of things you can do: airgap all of the keys; move them to a different airgapped system after a couple of years; archive them to an encrypted tape after a few more years; generate multiple keys for each time period, allow the device manufacturer to choose one at random, then store the private keys at different locations. Imagine how happy China will be if they spent 15 years trying to break into the systems at FBI Washington only to discover that the key they were after is stored on an encrypted backup tape at FBI Boston.
At this point I have to stop and ask how much protection we are devoting to the task. Technically, no security system stops a determined adversary - it just slows them down. The idea is to either make the enemy expend more time and resources to get their data than they are actually willing to expend, or slow them down long enough to be caught. I used to work for the military in a secure facility a little over a decade ago. We kept our sensitive documents locked in safes. Safes are rated based on how long it would take a trained safecracker to break into them. The safes we used were rated at around 45 minutes each - and they were behind a thick vault door rated at about an hour. Why was this acceptable? Because our sensitive documents were all behind that door and split between multiple safes, security walked by the facility every half hour and it took less than two minutes for a large group of armed guards to get there.
While we want to slow down our adversary, at the same time an authorized user needs to be able to get to the data when they need it. The use case for the device owner is different from that of a cop with a warrant. It's acceptable to have a lengthy, somewhat cumbersome process for the police with more security in place, since a cop doesn't need to gain access to a user's phone multiple times a day - they'll generally never have to access the contents, and if they do it's probably only going to be once. An iPhone user isn't going to tolerate getting cryptographically secure sign-offs from multiple secured facilities stating that they are authorized to access the device every time they turn it on. A cop looking into a serious criminal investigation will.
Now I'd ask who it is that we're designing the system to protect against? How determined and well funded are they? Am I designing a system to protect the user from someone willing to devote a nation state's resources to breaking it, or am I designing a system that will protect a user from data theft by criminals? Apple's existing encryption system won't stop them from getting your data if they really want it - especially when you are physically in China. Walking around with a cell phone in your pocket is itself a huge security vulnerability. If we redesign the iPhone to be a computing device that's indefinitely secure against Chinese intelligence services, it ceases to be a cell phone and instead becomes a standalone computer sitting under armed guard in the basement of the Pentagon.
If you're travelling through China with sensitive data that the government wants, perhaps you should reevaluate storing it on your cell phone.
> In this case, though, it's not the people who are hiding it - the device manufacturer is.
That is not correct. What Apple has done is give control over whether the information is hidden or not to the person with the password for the device.
> Nobody is buying a new iPhone because of the encryption; they're buying it because it's a newer, better version of an existing product - a product that police could search for evidence before if they had a warrant.
You can't win this by making a relativistic argument. The world was not in a state of anarchy before everyone started carrying around iPhones.
I don't think it's an unfair argument. The assault by the trucker happened and evidence of it existed on the phone. In the post-iOS 8 world that evidence would be unavailable. It's the difference between him getting off or not (or likely getting off depending on the circumstantial evidence).
Sure, but this is not being disputed. It's just such an obvious proposition that I'd argue making it adds nothing to the discussion, while taking attention away from the counterpoints that would (and slowly conditioning the reader to think that "who does it allow us to convict that we would like to see convicted?" is the most important benchmark to apply to legal principles).
I don't understand why it would be reasonable to require that the phone have a backdoor which allows them to access the data if they have a warrant for that specifically, but not reasonable to require someone to give up the password under contempt of court if they have a warrant for that specifically.
Not that I necessarily think either is reasonable, but I don't understand in what way the second would be worse?
I suppose that if a person who turned out not to have committed a crime were served with a warrant requiring them to give up their password, they might genuinely fail to remember it?
Is that risk worse or lesser than the risk of authorities being able to e.g. get data without a warrant via the backdoor?
Because if there's something really bad on my device, I'd be better off telling the cops "I forgot the password" and risking possible contempt of court rather than them seeing the actual evidence and getting a far worse sentence.
As a European, I'm totally in favor of a mandatory backdoor for the US government. It would do wonders for the development of the European IT industry, because many governments and corporations could not continue using US-based IT products. The amount of hubris required to force a backdoor on a global industry and market is really breathtaking, and I think the US would shoot themselves in the foot if they tried doing that. It's not like there is a law of nature that makes software development outside of the US impossible.
Would companies based in the US not ship a different version of their products that are exported? What would it mean for companies outside of the US? A large number of the products and services my company uses are from European countries.
> It's not like there is a law of nature that makes software development outside of the US impossible.
Of course, see what I said above. I think what makes the US attractive for software development is the VC industry and culture there; a quick browse of founders in SF will reveal a ton of immigrants attracted to that culture.
> Would companies based in the US not ship a different version of their products that are exported?
They can of course try that, but their clients have little reason to trust them. Apart from that, it might be very costly to develop, test, and distribute two versions of each product and to maintain compatibility, all of which puts US companies at a disadvantage.
There is already plenty of evidence for the idea that the privacy crisis helps the European industry. Many new businesses in several European countries are successfully selling privacy-preserving software and services, and their future crucially depends on the trust of their customers, so they have to take privacy seriously in ways that US companies don't have to, at least for now.
Not destructive at all - this will rather bring us much-needed diversity to replace a US-centric monoculture. It will also pave the way for the triumph of free software, which will prove to be much more resilient against such efforts than proprietary software.
They are designed in the US, but manufactured and assembled in China. The backdoor can be implemented in hardware, firmware or software - anyone who has physical or logical access to any one of a system's components at any point in time can backdoor it. And they probably do.
I second what @tmalsburg2 said... please, make a backdoor mandatory. 20 years after that moment no one but Americans will care, because people will have moved on to more egalitarian products.
When the backdoor is covert and not overtly declared, the choice of using the questionable product becomes more economic. The choice gets a lot easier when the government or company clearly states they have a backdoor. For many countries it's not just a red flag against using the IT in question, it's a reason to fund industries that compete with a better model.
Europeans voraciously consume American software with backdoors already (Android, iOS, Windows Phone, Windows), even to the point of destroying their own native systems like Symbian and Nokia.
And they do that, with the full knowledge of the NSA spying programs.
Unfortunately for your theory, most Europeans clearly do not care enough about their privacy. I don't see any vast European protests about the NSA these post-Snowden days, most people have gone back to sleep. European leaders have largely been 'reassured' enough to remain docile about it, after a brief pretend flash of anger and push-back.
Point taken. Most do not care about privacy. I would like to see what happens however when backdoors are transparently and honestly declared or legalized. I do think competitors will form in such a market. Will they succeed is a different question.
If the iPhone would turn out to be a major vector for Chinese spyware that would indeed be funny. But you know, if the Chinese use these backdoors to prevent terrorist attacks and to catch pedophiles, we should really be thankful to them.
> “I don’t want a back door,” Rogers, the director of the nation’s top electronic spy agency, said during a speech at Princeton University, using a tech industry term for covert measures to bypass device security. “I want a front door. And I want the front door to have multiple locks. Big locks.”
This kind of Orwellian doublespeak coming from officials in power should give all sane people pause.
But, even if our right to privacy was not enshrined in the Fourth Amendment, it would not be possible for any government to have their intrusion into our private lives enforced in the long term. Computers have the means to afford people true privacy and they will have it, without respect to those who would deny it.
Long term? Smaller and cheaper devices means people are even more easily spied upon. Privacy must be handled as a sacred law, because technology will blow past it. We simply leak too much information as a matter of physics.
Fully agree with this. Technology is merely an amplifying factor here. If a society is healthy, then applications of technologies of the day are non-issues.
Just keep in mind that 1776 would not have been a meaningful year if the founders could not even "assemble" and discuss legitimate grievances without fear of being subject to misuse of state power. This is why we have a 4th amendment.
Consider the software on "your" machine as a mobile agent running on a potentially hostile platform. Now please tell me again how "Computers have the means to afford people true privacy and they will have it, without respect to those who would deny it."
The use of Stingrays (IMSI catchers) and the DEA's collection of international phone records for almost 20 years are more examples of how any power given to law enforcement or the Federal government will be abused, and that abuse will be actively hidden from view.
You can't "find a balance" when the scales are hidden.
And the problem with data is that they store it. You don't just have to trust the current government, you have to trust all the governments for the next 50 years.
At least in Europe, history tells us that a democracy can rapidly turn into a violent dictatorship.
Do you really think that this can only happen in Europe? (I know that you meant history, but I am wondering if humans are so much more democratic in the US. For my part, I am sure that in my country there could be a violent (or covert) dictatorship again very soon.)
What is interesting in the US is that the composition of the population evolves a lot over time - in Europe too, to a lesser degree. One could argue that the US is the oldest (and therefore most stable and rooted) democracy. But as the proportion of the Hispanic population grows, from a cultural point of view Mexico becomes sort of part of the history of the US - and there is not a long history of democracy there.
As Jacob Appelbaum observes in the documentary Citizenfour, "What we used to call liberty and freedom we now call privacy. And now people are saying privacy is dead."
I think they have already quite well demonstrated their total disregard for the public's privacy. Now that the horse has bolted, they suddenly realise that they have backed themselves into a dark, nasty corner, where everything is going to be encrypted within a few years.
Sorry, but you have to earn trust and they are unlikely to get it back. Just look at the way geeks look at Microsoft.
Yes. There isn't some big debate going on. The feds did a bad thing by looking in people's bedroom windows. Now people are shutting their blinds. It's not a debate. It's a response. The feds can mandate that people open the windows again but then there will be no more undressing in front of them. The peep show is over.
Ugh, that just makes me think that when the NSA starts using "modern graphics" in its slides (since everyone seems to be mocking it over that now) and maybe open-sources some non-critical spying tools, many will start calling it the "new" (and better) NSA and even start cheering for it.
The NSA already has backdoors[1][2] in your Desktop OS (Windows, OSX), in your mobile OS (Android, iOS), in your smartphone hardware (Qualcomm Baseband), in your ISPs network hardware (Cisco, Juniper), in your server hardware (Dell), in your network hardware (most router firmwares), and there's rumors they have backdoors in Intel and AMD CPUs, too.
Why would they be concerned about encryption that runs on top of hardware and software that they can control at will?
It would certainly be the most expensive and hamfisted possible way for NSA to get access to a target computer system.
The more you know about how encryption software is actually built and deployed in large-scale applications, the sillier this "split key escrow" trial balloon sounds. It is facially ridiculous and will never happen. Good for pageviews, though!
What many here are asserting is basically a form of individual sovereignty, that a person should have the ability to create and share information that the government cannot read. This will never happen. The government is always going to assert that your rights are contingent on the government's ability to violate those rights if it thinks it has sufficient cause according to some standard. Many of you are arguing that the government should completely give up that ability for regular crime that probably happens thousands of times across the country every day.
You are now thinking, what about encryption, PGP, passphrases, I have a legal right to not have to give those up, don't I? Yes, we are very lucky that in the United States the Supreme Court has decided that you don't have to give up a password. But this was because it used to be that a password didn't actually protect very much and the legal system hasn't caught up yet. Lock combinations are meaningless when the government can just crack the safe, and passwords to email are meaningless when they can just compel the email service to cough up the mail. The court's aware of PGP, but the times it actually hasn't been able to get the data through other means is currently small so it's still not worth the political battle. When the device that most people can carry around in their pockets is sufficiently secure that nobody can get the contents without compelling the owner to give up the passphrase, legislation will be introduced for backdoor access or the Supreme Court will rule that you have to give up the passphrase. That is why device LE access is, believe it or not, a compromise. They are willing to overlook nerds using PGP because that's still niche, but encrypted unbreakable devices are an existential threat to the government's ability to control everyone's life for good or for ill.
I still think it needs to be fought as long as possible, but I am under no illusions that we will ultimately lose.
Let's suppose for a minute that there is some legislation put forth forcing encryption to have back doors inserted, or some other sort of key escrow. Why would anyone with the know-how and a privacy concern not just write an implementation of a known crypto algorithm?
Essentially, how would this legislation do anything but compromise the security of law-abiding people while not at all reducing the capabilities of those the US government wishes to stop?
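To illustrate how low that bar is, here's essentially the entire "uncontrollable" capability, sketched with the open-source `cryptography` package (and the underlying math - AES, HMAC - is published and reimplementable anywhere):

```python
from cryptography.fernet import Fernet

# Strong authenticated encryption with a key only the user holds;
# there is no escrow copy for anyone to subpoena.
key = Fernet.generate_key()
token = Fernet(key).encrypt(b"attack at dawn")
assert Fernet(key).decrypt(token) == b"attack at dawn"
```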
Seriously, it is way too late for this. You can't change the math, which is what crypto is. Now that people know it, you can't somehow remove that information from the world.
I don't get it (aka hate it): any sophisticated government adversary (terror groups, organized crime, etc.) will simply use the existing strong security measures that the government cannot crack (since they hold no escrowed key).
To me that means this measure isn't about the bogeyman of terrorism, etc. - rather it's about everyday criminal investigations. I think people are (overly) willing to give up their liberties when faced with the major bogeyman threats, but I think convincing the public to give up liberty to catch the pot dealer down the street or the dude embezzling from mega-corp is going to be a much harder sell politically.
We need to start having key signing parties. And we need better software to manage and run them.
No key escrow proposal (which the multi-party proposal in this article is) will ever be safe. If the key can be assembled, it can be cached; and as we've seen in the past decade it's very hard for any agency to deny itself surveillance powers.
If the FBI wanted car manufacturers to design seat belts and airbags with murderers and rapists in mind, it wouldn't have taken months of debate to reject the idea. But because we've entered the domains of math and CS, we get months of dithering, contemplating whether or not to force domestic companies to install complex new self-destruct functions into their products.
Al-Qaeda have been using Mujahedeen Secrets — a Windows-based encryption app — since 2007. Legislators should test whether potential new laws can have any impact on such uncontroversially abhorrent software. If, like split keys, they fail the Mujahedeen Secrets test, they can only damage our collective security.
For the sake of argument, if we assume that the US government has the right to protect its citizens from a monstrous, conspiring world, why is it exporting these backdoor technologies to secret services all over the world? Even the countries with the worst records on human rights, poverty, and oppression have these backdoor technologies to use against their own citizens. US citizens at least enjoy the freedom to bark; citizens of those other countries don't...
> Hailed as a victory for consumer privacy and security, the development dismayed law enforcement officials, who said it threatens what they describe as a centuries-old social compact in which the government, with a warrant based on probable cause, may seize evidence relevant to criminal investigations.
This is an interesting perspective. I tend to view this the opposite way. Until spring 2013, we tended to believe that the state doesn't rummage around in the private lives of individual citizens, except for warranted investigations of a few tax cheats, drug dealers, racketeers, and the like. Snowden's disclosures were a massive trust-loss event, and both companies and individuals will be loath to offer any concession to law enforcement without a clear demonstration of good faith and limited access to private data.
I'm also glad to see this perspective from the top cybersecurity advisor at NIST:
> “The basic question is, is it possible to design a completely secure system” to hold a master key available to the U.S. government but not adversaries, said Donna Dodson, chief cybersecurity adviser at the Commerce Department’s National Institute of Standards and Technology. “There’s no way to do this where you don’t have unintentional vulnerabilities.”
Personally, I'm hoping the United States eventually realizes Dodson is right and that preserving both individual security and law-enforcement access is futile. Hopefully the United States will give up on these key escrow / split key schemes and let meatspace force break encryption on a warranted, rare basis as investigations require.
> “What we’re concerned about is the technology risks” bringing the country to a point where the smartphone owner alone, who may be a criminal or terrorist, has control of the data [...]
Wow, it's almost like criminals and terrorists would be secure in their persons and effects, as would the rest of us!
Then law enforcement and the TLAs might actually have to do real, actual investigative work. Oh, darn.
It is easy to fall into a nihilistic, hopeless attitude when faced with problems of this size. While this problem can affect anybody, I suspect these feelings about the feasibility of change affect the people who create modern technologies in a specific way: they distract from and hide just how much power we retain, power that cannot be captured by the people trying to place themselves at the top of rapidly centralizing power structures.
The power I'm referring to is the power retained by whoever is actually implementing the world we live in. A tyrant can issue all the orders and threats they want; unless someone carries out those orders, the tyrant is just producing a lot of hot air. Implementation is what matters in the end, and I think the readers of HN have a pretty good idea who it is that will end up implementing our future.
So the next time someone asks you to implement some bit of the surveillance state, use your power and let that bit of the future go unimplemented. It may be a trivially small thing that doesn't change much on its own, but those little things add up, and the message they send is important.
I realize that it is hard to take a stand. There is always the possibility of being fired, or worse. Just remember that engineers working in technology have an advantage over the typical "protest" crowds of the past: the people who want to use the surveillance state need engineers to make their tools. So take advantage of any little bit of power you have and hold the line in some small way against this creeping tyranny. Above all, remember that tools like this very forum are available.
I'm not saying there is no risk, and I'm not saying there won't be consequences. What I am suggesting is that unless we start making these principled stands right now, the situation is only going to get worse, and the sacrifices required to fix it will only rise with time.
Please. This is our future, and as cliché as it sounds, there is a lot of truth to the idea that real power comes from strength in numbers.
==
/* speaking of the power held by engineers... I suspect "work to rule" ( http://en.wikipedia.org/wiki/Work-to-rule ) could be a particularly effective tool in the hands of the people who keep important parts of the world running */
You only need a few sympathetic hardware engineers to put in hardware back doors, however, and it's practically impossible to verify that the hardware you have is trustworthy by the time it's in your hands. A million engineers all making sure they don't implement surveillance functionality can't undo the work done by the handful of bad guys unless they're somehow able to inspect every piece of hardware down to the lowest level and verify manually that it's safe.
The end user then has to repeat the same verification when the hardware arrives, for the obvious reason that it can be intercepted and modified in transit.
And software is no different. There are lots of FOSS devs who would refuse to help implement surveillance systems, yet there's no practical way to prevent one small team of bad actors from injecting code somewhere between a piece of software's conception and its delivery to the end user.
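The best a recipient can usually do is check that what arrived matches what was published, along the lines of the following sketch using Python's standard hashlib (file name and digest are hypothetical). Note what it does not do: it proves the bytes match the publisher's digest, not that the publisher's build was clean in the first place.

    # Verify a downloaded artifact against a published SHA-256 digest.
    # Catches tampering after publication; says nothing about code injected
    # before the digest was computed -- which is exactly the gap above.
    import hashlib

    def matches_published_digest(path: str, published_sha256: str) -> bool:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest() == published_sha256

    # Usage (hypothetical values):
    # matches_published_digest("firmware.bin", "ab12...")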
The only possible way to be confident about your technology would be to build it yourself from the ground up. You would need to design and fabricate the hardware yourself, from passive components up, and re-implement the entire software stack by hand.
Even if you did all that, if you're a desirable target then you still face a very high risk that one of the people working somewhere in your project will be employed by your opponent and will compromise your system.
The conclusion I have come to is that if you have anything you don't want intercepted, you simply can't use any modern device.
It seems like encryption methods that rely on the government, like some sort of government-run key escrow, are the inevitable solution. Corporations are relatively secure from malicious threats and the government gets the access it wants.
Bad actors won't respect a ban on effective encryption tools. So giving the government keys expends massive resources without actually solving the problem it purports to solve.
Giving law enforcement tools that help them do their job is a good thing. But if those tools are so easily replaced by current or future market alternatives, then pushing for those tools just kills your industry.
Among my concerns is what this means for those working for legitimate political change. I expect that's because I have very vivid memories of Watergate, as well as some memories of the Vietnam War protests.
There's no need to go back even that far: the treatment of David Miranda, Manning's imprisonment, Kiriakou's imprisonment, Snowden's assured imprisonment should he ever leave the safety of his current umbrella, and the continued harassment and infiltration of anarchist and leftist orgs in California, Oregon, and elsewhere by all levels of law enforcement.
The FBI continues to expend disproportionate resources on environmental groups, despite no case of green direct action ever causing loss of life, while itself admitting that right-wing militias present a greater domestic threat than even the dreaded foreign terrorism. That would be the terrorism it is so happy to prove it is protecting us from by spending years deliberately radicalizing and then entrapping harmless fools.
Law enforcement motivations can never be trusted when it comes to your privacy.
Law enforcement in its current form can't be trusted, period. It's too biased by its own political and financial agendas, and it does very little actual law enforcement and too much concocting of cases.
If mental healthcare got even a sliver of the attention and budget that defense and law enforcement get, I personally think we'd all be better off.
Is there really any "security" lost? I mean, last I checked, the NSA had little to show for all its snooping around (other than possible economic benefits).