
> Apple making it abundantly clear, that if they comply (or are forced to comply) with the All Writs Act of 1789 to create this particular back door, then that opens the floodgate moving forward for all sorts of requests to add backdoors/decrease security.

I read it differently. Apple is saying that if they make this particular backdoor, then this very backdoor can also be used in other scenarios, to crack other phones (i.e. the backdoor would apply to all iPhone 5Cs, not just to this one).




I interpret it as the OP does. The court document asks Apple to lock the particular image to a particular serial number, so if all goes according to plan, the same image could not unlock other iPhones. Obviously, writing security code on a short deadline does not make for the best security, so that's one worry.

But Apple's letter uses the expression "technique", which I think means they're worried the government will get another court to make them change the serial number and sign a new image "next time". Before you know it, Apple will have to have an entire department to make these one-off images. Someone will say, "you know, you could save yourself a lot of time if you just made it work on any phone." Then that image will be leaked, and their security guarantees will be dead. (One might also worry about the DRM implications.)


You and OP are both wrong:

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Apple's argument isn't about a deluge of one-off court orders creating a slippery slope to reducing security. Apple is claiming that complying with just this one request would make Apple's other iPhone users significantly less secure. There would be a piece of software, signed by Apple, that could potentially be used to unlock any iPhone you have in your physical possession.


Here's the exact text of the court order:

"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE."

How am I wrong?


You said:

"But Apple's letter uses the expression "technique", which I think means they're worried the government will get another court to make them change the serial number and sign a new image "next time""

Apple's letter directly claims that the particular piece of software created to comply with this request will reduce the security of its users. Obviously this means that Apple does not think that the SIF being hardcoded with the unique identifier of the phone (sufficiently) mitigates the risk.

"make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

Having re-read the OP more carefully, I think ghshephard is making a different claim than you. He is pointing out Apple's argument about the 'unprecedented use of the All Writs Act of 1789'. If Apple can be forced to compromise their security via a court order like this, the FBI gains the power to force Apple and any other US company to insert backdoors / decrease security.

"If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge."


> would only load and execute on the SUBJECT DEVICE

You're wrong because any image that can be installed on the SUBJECT DEVICE can be modified to be installed on OTHER DEVICES.
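
To make the thing being argued over concrete, here is a minimal sketch (in Python, with made-up names) of what the per-device binding described in the court order might amount to; this is an assumption for illustration, not Apple's actual code.

    # Hypothetical illustration of a per-device "lock" inside a recovery image.
    # BAKED_IN_DEVICE_ID and should_run() are invented for this sketch; they
    # are not real Apple identifiers or APIs.

    BAKED_IN_DEVICE_ID = "SUBJECT-DEVICE-UNIQUE-ID"  # fixed when the SIF is built and signed

    def should_run(device_id: str) -> bool:
        """Refuse to execute unless the booting device matches the baked-in ID."""
        return device_id == BAKED_IN_DEVICE_ID

If the binding really is just a constant baked into the image, then changing which device the image accepts is trivial in itself; what would stop a modified copy from running elsewhere is that the altered image would need Apple's signature again. Whether that re-signing step is a durable safeguard, or just a process Apple could be compelled to repeat, is essentially the disagreement in this subthread.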


The iPhone 5C is an older phone that Apple doesn't even manufacture anymore, and, for the longest time, Apple had capabilities that allowed them to brute-force iPhones under direction of a court order. Apple clearly has some concern about its customers' security, but it doesn't care as much about this particular iPhone model's security as it does about the general principle: without explicit legislation such as CALEA (47 USC 1001-1010) [1], technology companies such as Apple come under the whim and direction of all sorts of requests. There is no explicit restraint in the All Writs Act of 1789; they can now be told by police agencies to do effectively anything necessary to the pursuit of an investigation. And you know, with 100% certainty, that once the "good cases" are used as an excuse for this, the crappy follow-on scenarios will also make use of the All Writs Act of 1789.

It's massive, massive overreach - and if Apple doesn't draw the line here, it will quickly spin out of control.

[1] https://en.wikipedia.org/wiki/Communications_Assistance_for_...


I don't understand: if the iPhone 5C is so simple to brute-force, why isn't Apple simply doing this for the FBI in this particular case? Why request an entire iOS modification when Apple could do what it has done for previous court orders and just brute-force the phone?


My understanding is that the iPhone 5C is not _simple_ to brute-force as sold from the store. The FBI is asking Apple to create custom software for this particular phone that weakens it enough to be brute-forced, allowing them to try millions of passcode combinations rapidly.
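
For a sense of scale, here's a minimal sketch (in Python) of the kind of brute-force loop that becomes practical once the retry delays and the ten-failed-attempt wipe are out of the way; try_passcode is a made-up stand-in for whatever interface would submit guesses to the device, not a real API.

    # Hypothetical sketch: exhaust every 4-digit passcode, assuming nothing
    # throttles or wipes the device after failed attempts. try_passcode() is
    # an invented stand-in for submitting a guess to the device.
    import itertools

    def brute_force_4_digit(try_passcode):
        for digits in itertools.product("0123456789", repeat=4):
            guess = "".join(digits)
            if try_passcode(guess):  # True means the guess unlocked the device
                return guess
        return None

With no delays and no wipe, all 10,000 four-digit combinations fall in minutes even at a few guesses per second; longer alphanumeric passcodes are where the "millions of combinations" figure comes from.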

In the past, law enforcement could use their own tools, and Apple didn't have any legal way to say "it is beyond our ability to break it, so we can't help you" anyway. After their name showed up on that slide in the PRISM leak without their cooperation (meaning they had been stepped around by the FBI; some of the earlier companies had willingly volunteered data), they stepped up their game and deployed end-to-end encryption and the Secure Enclave so that they would have the ability to say "we can't help" when forced to.

This technique wouldn't be possible on the iPhone 6, due to the encryption keys being in the hardware Secure Enclave, but Apple is putting its foot down now so that a legal precedent isn't established forcing them to weaken other models too. That's my understanding of it right now.


> After their name showed up on that slide in the PRISM leak without their cooperation (meaning they had been stepped around by the FBI; some of the earlier companies had willingly volunteered data)

I know it's a bit off topic but I'm really curious about this - do you have a source that shows which companies willingly volunteered data and which were stepped around? I wasn't aware that anything like that had come out.


> I wasn't aware that anything like that had come out.

You're not aware of it because nothing like that has ever come out. The guy just made it up (or it's just his wishful thinking). We still don't know which companies cooperated with the NSA.

We do know that Google didn't intend for its traffic between data centers to be scooped up - and that has been fixed in the meantime - but that doesn't prove that Google didn't cooperate in other matters.

Same with Apple. For all we know, they are all gagged due to NSLs.


Because it sets an awful future legal precedent. If Apple did it once and didn't challenge it, you can be sure that all future government cases involving decryption will cite this case as the new standard of law.


You're both accurate here:

> The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Once there's a backdoor, the legal precedent and technical capability will exist to use it on any device. The precedent would also exist to request support in backdooring other parts of the OS.

It's FBI Director Comey's explicit goal[0] to destroy the notion of strongly secured encryption for civilians. From an address to Congress in July 2015:

> Thank you for the opportunity to testify today about the growing challenges to public safety and national security that have eroded our ability to obtain electronic information and evidence pursuant to a court order or warrant. We in law enforcement often refer to this problem as “Going Dark.”

[...]

> We would like to emphasize that the Going Dark problem is, at base, one of technological choices and capability. We are not asking to expand the government’s surveillance authority, but rather we are asking to ensure that we can continue to obtain electronic information and evidence pursuant to the legal authority that Congress has provided to us to keep America safe.

In other words, encryption makes it harder for the FBI to collect people's information. They therefore want to make sure encryption as implemented can't block the FBI.

Further on:

> The debate so far has been a challenging and highly charged discussion, but one that we believe is essential to have. This includes a productive and meaningful dialogue on how encryption as currently implemented poses real barriers to law enforcement’s ability to seek information in specific cases of possible national security threat.

[...]

> We should also continue to invest in developing tools, techniques, and capabilities designed to mitigate the increasing technical challenges associated with the Going Dark problem. In limited circumstances, this investment may help mitigate the risks posed in high priority national security or criminal cases, although it will most likely be unable to provide a timely or scalable solution in terms of addressing the full spectrum of public safety needs.

When encryption is implemented in a way that legitimately secures a person's data from unauthorized access, the FBI can't just get in and take the data. Comey would like Congress to support policy and tools that can get around that, because terrorism.

The Apple situation feels very foot-in-the-door to me.

0: https://www.fbi.gov/news/testimony/going-dark-encryption-tec...



