> The FBI is the opposite in every way, mostly because of budget constraints and the subsequent lack of training. I hope that this is a good learning opportunity for them and a chance for them to increase their training budget in this area.
Besides the legal precedents and other associated drama, I think this is one of Apple's major concerns, and one of the reasons they implemented the "we don't have the keys" approach to their encryption. If the FBI can always just call on Apple (or Google) to fix whatever mistakes they made, there is little motivation for training / getting better on this front, effectively making Apple the computer forensics arm of the government.
The request they submitted to Apple was clearly written by competent people. They knew exactly what they wanted to do and why, how Apple could help, and why only Apple could help.
I think part of that is Apple is/was actively working with the FBI to find alternative solutions. I would bet that the engineers described what would need to happen, i.e. the new OS. As is often the case, the Apple engineers probably documented alternative solutions. The FBI took that "solution" and ran with what they described. It's the "well Apple told us this is the only way to do this, but they won't do it for us" scenario.
> I would bet that the engineers described what would need to happen. ... The FBI took that "solution" and ran with what they described.
I think you absolutely nailed it!
For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group. The developers are smart guys trying to be helpful. They are not thinking about Apple policy, or constitutional law, or the big picture of world liberty and privacy. They are tasked with finding the solution to a technical problem: How to get access to protected data.
What likely happened--exactly as you already suggested--is that the FBI asked the developers to explain how the security system could have been designed so as to permit easy government access in cases like this. The FBI was asking "hypothetically" of course. The developers happily gave a blueprint of how the system could have been designed.
The FBI now demands that blueprint be implemented.
Apple should have talked to the FBI through lawyers only.
> For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group.
> Apple should have talked to the FBI through lawyers only.
You went from "would have" to "should have", turning your hypothesis into a certainty...
Why wouldn't the developers in the security group think about constitutional law? Have you ever seen an internet forum that talked about computer security regularly, yet did not talk about constitutional law regularly? If not, how would those developers have possibly avoided regular reminders about the 4th amendment?
They didn't have to avoid any reminders. They were most likely just asked "how" it could be done, not to do it. The law comes into effect now, where the FBI is trying to get the courts to order them to comply. Simply telling someone how to potentially do something illegal is not illegal itself, and really doesn't cross any boundaries in my opinion. A white hat hacker uses many of the same techniques that a black hat hacker uses, but in one instance it is legal and in the other it is illegal.
Well, that's the overhead of selling closed-source devices.
If you think about it, consulting vendors is probably a better use of taxpayer money than RE-ing every stupid crypto system on the market.
They contacted Apple, did their homework and came up with specific and generally sane demands. They even went as far as suggesting that the hacking be performed at an Apple site to ensure that the insecure firmware doesn't leak outside.
BTW, this last part looks very much like a response to concerns voiced by Apple, which means that the official statements from both sides are just the tip of the iceberg.
Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete. And if each user could sign their own firmware updates with a key based on their password or provide their own key then it's game over.
They've put themselves in a weird legal situation because they've made it so that they are the only ones who can actually write and sign the firmware the FBI is demanding. A judge would laugh the FBI out of the courtroom if it were technically capable of writing the firmware itself and demanded Apple's help just because it was too hard.
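To make the "sign their own firmware updates with a key based on their password" idea concrete, here's a minimal sketch (Python; every name in it is hypothetical, and a real device would mix hardware secrets into the derivation) of a device that derives an update-authorization key from the owner's passcode and refuses any image that key didn't approve:

```python
import hashlib
import hmac

def derive_user_update_key(passcode: str, device_salt: bytes) -> bytes:
    # Hypothetical: stretch the passcode into an update-authorization key.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 200_000)

def authorize_firmware(firmware: bytes, user_key: bytes) -> bytes:
    # The owner "signs off" on an update by MAC-ing the exact image.
    return hmac.new(user_key, firmware, hashlib.sha256).digest()

def device_accepts_update(firmware: bytes, tag: bytes, user_key: bytes) -> bool:
    # The device installs nothing the owner's key didn't authorize.
    expected = hmac.new(user_key, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

salt = b"per-device-salt"
key = derive_user_update_key("1234", salt)
image = b"example firmware image"
tag = authorize_firmware(image, key)

assert device_accepts_update(image, tag, key)
assert not device_accepts_update(b"image the owner never approved", tag, key)
```

Under a model like that, firmware pushed without the passcode holder's cooperation simply doesn't install, which is the "game over" part.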
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete.
This is an example of a non-free software feature. Why are the keys baked in, with no way to disable them? And "write your own firmware" doesn't solve this problem -- they could just pay a developer $X an hour to do it. A better security model should've been used -- where updates have to be confirmed (read: signed) by the user before they are applied.
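A rough sketch of that "confirmed (read: signed) by the user" policy, assuming a hypothetical dual-authorization check (Python with the third-party cryptography package; the key names are stand-ins, not Apple's actual scheme) where an update installs only if both the vendor and the device owner signed the exact image:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def update_is_acceptable(firmware: bytes,
                         vendor_sig: bytes, vendor_pub: Ed25519PublicKey,
                         user_sig: bytes, user_pub: Ed25519PublicKey) -> bool:
    # Hypothetical policy: install only if BOTH parties signed this exact image.
    try:
        vendor_pub.verify(vendor_sig, firmware)  # vendor vouches for the code
        user_pub.verify(user_sig, firmware)      # owner consents to installing it
    except InvalidSignature:
        return False
    return True

# Stand-ins for the vendor's signing key and a user-enrolled key.
vendor_key = Ed25519PrivateKey.generate()
user_key = Ed25519PrivateKey.generate()
image = b"example firmware image"

both_signed = update_is_acceptable(
    image,
    vendor_key.sign(image), vendor_key.public_key(),
    user_key.sign(image), user_key.public_key(),
)
vendor_only = update_is_acceptable(
    image,
    vendor_key.sign(image), vendor_key.public_key(),
    user_key.sign(b"something the owner never approved"), user_key.public_key(),
)
assert both_signed and not vendor_only
```

The vendor-only case is essentially today's model: whoever holds Apple's key decides what the device will accept.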
> Spivak 1 hour ago
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware.
No, not based on the interpretation of the All Writs Act that the FBI is attempting to use. As far as the FBI is concerned, they could force my Grandma to write a backdoor if they deemed her the best person to do so. Given that she can't answer the phone most days it'd be a long wait, but I wouldn't put it past them.
Poor choice of words; I meant the general "closedness" of the platform - from undocumented design, through lack of source code, up to centralized code signing.
The only reasonable way for law enforcement to deal with even a single one of those factors is to request help from the device vendor.
There is one thing the FBI is very good at, and that's writing a compelling narrative. It's possible that there are highly competent people who know everything, but it's also possible there are moderately but not dazzlingly competent people who are really good at writing a story that feels complete and keeps one from asking questions outside the narrative.
Though, on second thought, I have to add that we don't know how many back-and-forth mail exchanges happened before they were able to come up with the officially published request.
Maybe they were just competent at working around excuses from Apple.
Exactly. Yet another reason to fight the court order. We should expect the FBI to be competent, to be embarrassed if they aren't, and to fix the problem. It's not a good state of affairs when a private company is more trusted to do forensics than the FBI itself.
Software update signing keys, which can't be disabled by the end user. This is what most people would consider "a flawed security model". Even UEFI lets you change the trusted booting keys.
Please enlighten me. Is this not exactly what the FBI is asking for? For Apple to flash a custom version of iOS that doesn't have the software rate-limiting and auto-wipe, which only someone with Apple's private key can do. A four-digit PIN is only secure in combination with those features. Having Apple's code-signing key is in fact "having the keys", except in the most pedantic literal sense.
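Some quick arithmetic shows why. Apple's iOS security guide describes the on-device passcode key derivation as being calibrated to roughly 80 ms per attempt; with the retry delays and auto-wipe stripped out, that delay is all that's left standing between a four-digit PIN and a brute force:

```python
# Rough numbers: ~80 ms per passcode attempt is the figure Apple gives for
# the on-device key derivation; the escalating delays and 10-try wipe are
# exactly what the requested firmware would remove.
SECONDS_PER_ATTEMPT = 0.08
PIN_SPACE = 10 ** 4            # 0000 through 9999

worst_case = PIN_SPACE * SECONDS_PER_ATTEMPT
print(f"worst case:   {worst_case / 60:.1f} minutes")    # ~13.3 minutes
print(f"average case: {worst_case / 120:.1f} minutes")   # ~6.7 minutes
```

Which is why the rate-limiting and wipe-after-ten-failures behavior, not the cipher, is what actually protects a short PIN, and why the FBI's request targets exactly those features.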