A Message to Our Customers (apple.com)
5771 points by epaga on Feb 17, 2016 | 967 comments



Huge props to Apple - here's hoping against hope that Google, Facebook, and Amazon get behind this.

One thing I was wondering is how Apple is even able to create a backdoor. It is explained toward the end:

"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer."

This is actually quite reassuring - what this means is that even Apple can't break into an iPhone with a secure passphrase (10+ characters) and disabled Touch ID - which is hackable with a bit of effort to get your fingerprint.


".. what this means is that even Apple can't break into an iPhone with a secure passphrase (10+ characters) and disabled Touch ID - which is hackable with a bit of effort to get your fingerprint."

That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.

From what I understand, what Tim is doing, and what I greatly admire, is trying to avoid a judicial requirement that they be able to do this on demand: the so-called "back door" requirement. He knows, as others do, that such a feature would be used by more than the intended audience, and for more than the intended uses, to the detriment of Apple's users.

What I really find amazing is that I was at a talk hosted by the East-West Institute where the Air Force General of the new cyber command (whose name escapes me) complained that "we" (Silicon Valley) had let the government down by not writing strong enough crypto to keep our adversaries out. I remarked that it was the ITAR regulations and the Commerce Department, at the behest of the NSA, that had tied our hands in that regard, and that with free rein we would have done, and could do, much better. Whit Diffie was there and also made the same point with them. And now, here we are 10 years later, we've "fixed" it, and now it's our fault that they can't break into this stuff? Guess what? Our adversaries can't either!

The right to privacy, and the right of companies to secure that right with technology for their customers, is a very important topic and deserves the attention. I am really glad that one of the most valuable companies in the world is drawing a bright line in the sand. So I really support Tim's position on this one.


> That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.

No, they can't. A quick update to recent hardware practices: modern SoCs like Apple's have something called "Secure Enclave Processor" that's on-die. This is the first thing to start when the chip is powered up, the thing that loads a cryptographically-signed bootloader, and the thing that gates a lot of IO with the outside world (like the NAND).

Hardware encryption on consumer hardware has existed for over a decade (look up Intel's TPM), and while it hasn't taken hold in the more open Intel world, locked-down hardware platforms like Apple's top-to-bottom design have had much more liberty in implementing secure computing.

Furthermore, all debug/probing features can be disabled by fuses at the factory. The manufacturer can test the chip with those features on, and once verified, blow those fuses. No-one's JTAG-debugging that chip, not even Apple.

That said, Apple's focus on security and privacy has ramped up in recent years. You want more secure, get more recent hardware. The downside, of course, is that if even Apple can't hack the software... neither can you.


"Computationally-signed bootloader" - but does Apple have the private key? If so, why can't they create another bootloader?


As I understand it, if the code running in the "secure enclave" (containing the private keys) is ever upgraded, the hardware side intentionally deletes the private keys as part of the upgrade, whether the upgraded code would want it to or not.



The reasoning there is far from conclusive. The argument is that the secure enclave has been updated in the past (to lengthen enforced delays) without wiping user keys.

However, without more information, this does not tell us whether it is possible in this case. The obvious implementation for a secure enclave resisting this sort of attack is to only allow key-preserving updates when already in an unlocked state (which would be the case for any normal user update). All other cases should destroy the user key material, even if the update is validly signed by Apple. This would be done by the hardware and/or the previous firmware before it loads the new firmware, so you can't create an update that bypasses this step.

If this isn't how the secure enclave works now, I'll bet it will be in the next version (or update if possible).
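For illustration, a minimal sketch of that update policy in Python (the names and the global-state modeling are hypothetical; this is not Apple's firmware interface):

    # Hypothetical sketch: a signed update only preserves key material if the
    # enclave is already unlocked; any other accepted update wipes the keys first.
    key_material = {"class_keys": b"...secret..."}

    def apply_update(signature_valid, unlocked):
        global key_material
        if not signature_valid:
            return False               # reject anything not signed by Apple
        if not unlocked:
            key_material = None        # destroy user keys before loading new code
        return True                    # proceed to load the verified image

    # A validly signed update pushed to a locked phone still loses the keys:
    apply_update(signature_valid=True, unlocked=False)
    assert key_material is None

The point being that even a legitimate Apple signature wouldn't be enough to get at existing keys.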


You can't update the firmware on the SE without unlocking the device first.


We assume.

I'm also confused by a lot of this. Don't you need the password anyway to upgrade?


> We assume.

I bet if Apple is forced to comply with this order they will find a way to design the iPhone such that they physically can't comply with similar requests in the future.


> No, they can't. A quick update to recent hardware practices: modern SoCs like Apple's have something called "Secure Enclave Processor" that's on-die.

Yes, they can. The particular phone in question is from before the Secure Enclave Processor existed.


To play devil's advocate: if they can be compelled to produce de-novo firmware for the purpose of data extraction, they could also be compelled to design the means necessary to extract the data from the secure enclave, e.g. by prying the chips open and putting them under a scanning tunneling microscope.


The packages on those chips are usually designed to be melded to the underlying structures enough that opening them destroys the keys they're attempting to hide.


I think you're missing my point here.

Some people trot out the argument that it's OK for the government to compel Apple to deliver the backdoored firmware because the measures it would circumvent are not of a cryptographic/information-theoretical nature.

Then one could expand that argument by saying that compelling physical reverse-engineering is also OK because the devices are not built to be physically impossible (read: laws of nature) to pry open.


The devices ARE built to be physically impossible to open without destroying their contents.


In security, impossible usually means "there is no documented method yet".


> They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.

When a passcode is entered, the SoC queries the Secure Enclave with the passcode. If the passcode is correct, the Secure Enclave responds with the decryption key for the flash storage.

The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys. However, some people suspect the SE erases its secrets on firmware update, although this behavior isn't documented in Apple's security reports.


> The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys.

Dumping the Secure Enclave would not result in the keys necessary to read the files on the filesystem. Each file has a unique key, which is wrapped by a class key, and for some classes, the class key is wrapped by a key derived from the passcode. If you don't have the passcode, you can't unwrap any of the keys (Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
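For intuition, a toy version of that wrapping chain in Python (stdlib only; the XOR "wrap", salt, and iteration count are placeholders, not the real iOS construction, which uses AES key wrapping entangled with the hardware UID):

    # Toy illustration of the key hierarchy: per-file key wrapped by a class key,
    # class key wrapped by a passcode-derived key.
    import os, hashlib

    def wrap(key, kek):                 # placeholder for a real AES key-wrap
        return bytes(a ^ b for a, b in zip(key, kek))

    unwrap = wrap                       # XOR is its own inverse

    passcode_key = hashlib.pbkdf2_hmac("sha256", b"1234", b"device-uid", 50_000)
    class_key    = os.urandom(32)
    file_key     = os.urandom(32)

    wrapped_class_key = wrap(class_key, passcode_key)   # needs the passcode to unwrap
    wrapped_file_key  = wrap(file_key, class_key)       # needs the class key to unwrap

    # Without the passcode there is no path down to file_key:
    assert unwrap(wrapped_class_key, passcode_key) == class_key
    assert unwrap(wrapped_file_key, class_key) == file_key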


You could bruteforce the passcode.


Yes, if you could recover the SE key (or factory-burned SoC key for older phones), you could crack the passcode on your own cluster rather than being limited to what the phone hardware/software can do.


> for older phones

To my knowledge, all keys are still wrapped with the UID, and the UID is still a factory-burned SoC key (not accessible to any firmware). Possible to extract, but not easy to do at scale.


The phone involved in the San Bernardino case is a 5C, which does not have a Secure Enclave.


Where have you heard about the erase-on-update feature?


The SE doesn't erase-on-update, but you can't update it without unlocking the phone first.


>>> I am really glad that one of the most valuable companies in the world is drawing a bright line in the sand. So I really support Tim's position on this one.

Tim's position today might not be apple's position tomorrow. Apple is a large publicly traded company. They owe a duty only to shareholders. Fighting this fight will probably impact the bottom line. Tim's continuation may turn on the outcome.

Cooperation may see Apple hurt. The perception of cooperation was part of RIM's fall from grace. Non-cooperation may also cause issues. Through its various agencies, the US government is Apple's largest customer, as it is Microsoft's. Large contracts might be on the line should Apple not play ball. Either way, this order has probably wounded Apple.


> the US government is Apple's largest customer

I'm having trouble finding numbers, but I seriously doubt this. The reason that's true (or more likely true) for Microsoft, is Windows. The US gov't has massive site licenses for Windows and most of MS's software portfolio. Apple is used where in the US government? Some cell phones? A few public affairs offices that convinced their purchasing officer to buy a Mac Pro for video editing? Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?

Per: http://investor.apple.com/secfiling.cfm?filingid=1193125-14-...

The bulk of Apple's revenue comes from outside the US. Perhaps the US government is their largest single customer (I still hold this is a dubious claim), but it is not essential to their continued existence. They would do just fine without those sales.


I agree with your general point (which I take to be, Apple is not seriously threatened by loss of sales to the US Government).

But remember, iPhones and MacBooks are quite popular everywhere, including US government procurements (e.g., https://37prime.wordpress.com/2012/08/05/nasa-mars-science-l...).


My experience is primarily DoD, which is the bulk of the government's spending on this sort of thing. For enterprise spending, Apple doesn't get much love, outside a general trend towards iPhones for "company" phones, but even that's rare, with people using their own phones instead of being issued one a lot of the time (they can get their plans paid for or subsidized if the phone is a requirement of the job). Labs often end up outside a lot of the enterprise type procurements and so have a bit more leeway in that regard, but while they spend a lot of money, it's still a drop in the bucket for Apple.


I do agree with your general point.

Your characterization ("Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?") is a little off -- where I work, MBP's for laptop replenishments are treated exactly the same way as Windows systems, you just tick a different button on the order form.


Fighting will probably impact the bottom line.

As in, improve it.

Tim Cook is probably more popular than Obama (and surely is WRT this issue.) Apple is about a thousand times more popular than the NSA and blessed with almost infinitely deep pockets and a very, very good marketing team.

Not to mention the fact that most of the people who use computers and phones don't even live in the USA.


Unfortunately, the NSA/FBI likely have far more money than Apple, and if Apple spends too much, their shareholders can demand that leadership backs down.

Tim Cook is responsible to his BoD and shareholders.


The NSA has an annual budget of ~10B, the FBI even less.

In comparison, Apple had a net income of ~50B in 2015.

Let that sink in for a moment.


> Large contracts might be on the line should Apple not play ball.

That almost certainly isn't the case. It is doubtful whether any other government organization cares about how they handle this case. Heck the FBI likely doesn't care as long as Apple doesn't do anything illegal.


Lots of places OUTSIDE the US will care, though. This is exactly the sort of $hite that is causing European companies and governments to avoid dependencies on US providers: there's no way to guarantee freedom from US government surveillance.


They owe a duty only to shareholders.

Because Freedom Markets(tm), booyah!

Society can bumble along just fine without corporations. Corporations serve society.

Take away society, with its culture, laws, rules, regulations, courts, people, economy, markets, capital, etc, there can be no corporations.

The Shareholder Fallacy http://www.salon.com/2012/04/04/the_shareholder_fallacy/

Historically, corporations were understood to be responsible to a complex web of constituencies, including employees, communities, society at large, suppliers and shareholders. But in the era of deregulation, the interests of shareholders began to trump all the others. How can we get corporations to recognize their responsibilities beyond this narrow focus? It begins in remembering that the philosophy of putting shareholder profits over all else is a matter of ideology which is not grounded in American law or tradition. In fact, it is no more than a dangerous fad.

The Myth of Profit Maximizing

“It is literally – literally – malfeasance for a corporation not to do everything it legally can to maximize its profits. That’s a corporation’s duty to its shareholders.”

Since this sentiment is so familiar, it may come as a surprise that it is factually incorrect: In reality, there is nothing in any U.S. statute, federal or state, that requires corporations to maximize their profits. More surprising still is that, in this instance, the untruth was not uttered as propaganda by a corporate lobbyist but presented as a fact of life by one of the leading lights of the Democratic Party’s progressive wing, Sen. Al Franken. Considering its source, Franken’s statement says less about the nature of a U.S. business corporation’s legal obligations – about which it simply misses the boat – than it does about the point to which laissez-faire ideology has wormed its way into the American mind.


>>In reality, there is nothing in any U.S. statute, federal or state, that requires corporations to maximize their profits.

Laws and statutes don't enforce contracts. But courts do. You are trumpeting a theory I've heard many times before. Its creators lack a basic understanding of contract law or corporate organization. Look up "shareholder derivative actions".


In a lot of consumer devices, JTAG is at least partially disabled (sometimes they can throw a fuse that only lets you do boundary scan for manufacturing).

I would not be surprised at all if Apple's internal 'backdoor' (if you can call it that) is just resetting the secure enclave, essentially erasing everything on the NAND. That'd be fine for refurb/manufacturing, desirable even, as it guarantees that full system wipes happen before a refurb goes to a new customer.


There are no more actual fuses, but JTAG access can be limited or completely disabled on most (or possibly all) modern ARM chips by setting a value in the non-volatile (flash or EEPROM) memory. Even the IC manufacturer (Freescale, ST, etc.) can be locked out completely.

Most ICs can be completely erased to remove the limitations on access, but this usually requires a 'mass erase', where the entire non-volatile memory is erased (taking any codes, passwords, and encryption keys with it).

source: I am an embedded software engineer who works with these settings in bootloaders and application software.


> That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.

Is this true? That would have to mean that either the passphrase is stored on the device or that the data is not encrypted at rest. Neither of these sounds likely, frankly.


It doesn't have to mean either of those. As long as the 10-mistake limit is enforced in software, or in a separate chip that can be replaced without replacing the actual encryption key, it can be bypassed. Then it is simply a matter of brute-forcing the PIN code. Since these are usually 4 digits, there are only 10,000 possibilities, which is laughable to a brute-force attacker.
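A sketch of how little work that is once the limit is gone (try_pin here is a hypothetical stand-in for whatever oracle ends up checking the code):

    # Enumerate the entire 4-digit PIN space; 10,000 guesses is nothing.
    def brute_force(try_pin):
        for i in range(10_000):
            pin = f"{i:04d}"
            if try_pin(pin):
                return pin
        return None

    # Toy demo with a stand-in oracle:
    secret = "4831"
    assert brute_force(lambda pin: pin == secret) == secret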


They don't even have to do that. They wrote the OS, they have the signing key for OS updates. All they need to do is push an update to the device with a backdoor that allows reading off the unencrypted contents post-boot (possibly with the addition of a judicially compelled fingerprint scan or PIN brute force to get the encryption key out of whatever on-device escrow it's stored in).

The only way to secure the device against that would be to have the users manually memorize and key in a complete 128 bit encryption key, which they could then refuse to provide.

I think we tech folks were fooling ourselves with the idea that Apple had somehow delivered a "snoop-proof" device. They really didn't (no one can!) as long as they're subject to government or judicial control.


This is not really true. The secure enclave is a separate computer. It doesn't get software updates.

> possibly with the addition of a judicially compelled fingerprint scan or PIN brute force to get the encryption key out of whatever on-device escrow it's stored in

This is the whole problem. The keys are in the SE. You can't brute force the PIN because the SE rate-limits attempts (and that rate limiting cannot be overridden by an OS update because the SE is not run by the OS).

If you can get a fingerprint scan then all bets are obviously off, but then you don't need Apple at all.


The device in question does not have a secure enclave. It's a 5c.


But it's more interesting to think about the case where the phone does have a secure enclave.


and yet it would be less interesting to consider if the password was a "six-character alphanumeric passcode with lowercase letters and numbers" because even if the software rate-limiting was disabled with a rogue firmware update, the PBKDF2 or similar iteration count makes brute-forcing impractical.

> A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers

(Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
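The arithmetic in that quote checks out; a quick back-of-the-envelope in Python:

    # 36^6 six-character lowercase+digit passcodes at ~80 ms per attempt:
    combos  = 36 ** 6                       # 2,176,782,336 combinations
    seconds = combos * 0.080                # ~174 million seconds
    print(seconds / (365.25 * 24 * 3600))   # ~5.5 years to try them all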


In that case, they could just bring the phone down to the morgue and unlock it with touch id.


Touch ID does not work after a reboot, too many attempts, or too long a delay.

Additionally, you don't have to use your thumb, so if you don't know what body part was used you're out of luck.


If the phone is rebooted or if 48 hours have passed since the last passcode was entered, the touch ID can't be used to unlock the phone.


Touch ID requires the passcode after reboot or timeout. They'd need to do it very quickly.


Fingerprint scan really isn't enough though since after a restart, iPhones require the password to be entered.


Because fingerprints are not sufficient to generate / recover KEKs from.


Is that really true? The enclave's firmware is in ROM and non-upgradeable? I'd always assumed it got a signed blob like everything else does. Obviously it's possible to design a system like that, I just haven't seen it reported anywhere that it actually works like that.

Edit just to be clear: the requirement really is that the firmware be stored in a ROM somewhere, probably on the SoC. That's a lot of die space (code for a full crypto engine isn't small) to dedicate to this feature. Almost certainly what they did is put a bootstrap security engine in place that can validate external code loaded from storage. And if they did, that code can be swapped by the owner of the validation keys at will, breaking the security metaphor in question.


According to a former apple engineer that worked on this stuff, the enclave's firmware is indeed a signed blob:

https://twitter.com/JohnHedge/status/699882614212075520

The key thing would be for it to lose all stored keys on update when the current passphrase has not been provided, and it sounds like that may not currently be the case.

Maybe in this case, Apple could comply, but a simple tweak would make it impossible in the future?


But "on update" isn't really the issue. If the code can be swapped underneath it, how does it know an "update" took place? Again you're in the situation where all of that process would have to be managed by hardware, when what is really happening is that the enclave is a computer running signed software that can be replaced.


Sure, but the signed firmware could be written to delete any stored keys when it accepts an update and the phone's passphrase has not been provided. That's assuming that it manages its own update process, has the firmware securely stored within its own die, etc. It's entirely possible ... but only Apple really knows.

I would not actually be shocked if they originally did wipe out stored info on firmware update, but had some issues with people updating their phone and losing everything, so they ifdef'd that particular bit out in the name of usability.


But... the firmware is stored in external storage. How does it even detect that an update was performed? You're saying that if the hardware had affirmative control over all the secure storage (e.g. the RPMB partitions on eMMC, etc...), then it could have been written to do this blanking.

Well, yeah. But then you'd have a system that couldn't be updated in the field without destroying the customer data (i.e. you'd have a secure boot implementation that couldn't receive bug fixes at all).

It's a chicken and egg problem. You're handling the problem of the "iPhone" not being a 100% snoop-and-tamper-proof device by positing the existence of an "interior" snoop-and-tamper-proof device. But that doesn't work, because it's turtles all the way down.

Ultimately you have to get to a situation where there is a piece of hardware (hardware on a single chip, even) making this determination in a way that has to be 100% right from the instant the devices go out the door. And that's not impossible, but it's asking too much, sorry. We're never going to get that.


It's certainly possible to design such a system with external firmware and still allow for secure updates.

The enclave would store (in secured storage) a hash of the last used firmware. Hardware would have a hash update capability, but this destroys all other stored information (i.e., keys) if used when the enclave is not currently in an unlocked state.

On boot, hardware verifies firmware signature as usual but also compares the firmware hash (already calculated for the signature check) to the stored value. If there is a mismatch, update the stored hash. Since the enclave is currently locked, the hardware clears the keys.

Since it's in hardware, you're correct that it would have to be 100% right, but that's quite feasible for a simple update mechanism (indeed, the most complicated bits are reused pieces from the signature check which already has this requirement).
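A rough sketch of that boot-time check (hypothetical names, not Apple's implementation; assume the signature check against Apple's CA has already passed):

    import hashlib

    def boot(firmware, stored_hash, unlocked, keys):
        h = hashlib.sha256(firmware).digest()
        if h != stored_hash:      # firmware image changed since the last boot
            if not unlocked:
                keys.clear()      # locked device + new firmware: destroy key material
            stored_hash = h       # accept (remember) the new image either way
        return stored_hash, keys

    # A signed-but-unauthorized swap on a locked phone still boots, but the keys are gone:
    keys = {"class_key": b"secret"}
    _, keys = boot(b"new image", hashlib.sha256(b"old image").digest(), False, keys)
    assert keys == {}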


Have it store the firmware itself encrypted with the UID. It never leaves the secure enclave so only the secure enclave itself could "sign" updates. You could still allow for recovery by providing a method to reset the UID.


Having the hardware require both the signature to pass as well as a valid PIN/fingerprint can't be that difficult.

Die space is cheap nowadays, especially stuff that doesn't need to be on all the time because of the death of Dennard scaling.


I'm not claiming it is in ROM or that it is not upgradeable if you are Apple and have physical access to the device. I'm not sure on that point. What I think must be the case is that Apple can't remotely upgrade the SE firmware as part of its iOS update mechanism. Although, to be perfectly honest, I have not seen this explicitly documented.


So... it sounds like you more or less agree with me. Apple can comply with this court order and open your secure device. We just differ as to whether they can do it over the air.

(FWIW: OTA firmware updates are routine in the industry. I've worked on such systems professionally, though not for Apple.)


The "bootstrap security engine" could store a hash of the blob in it's secure flash storage, and only update the hash if the user enters their pin, then reload the firmware blob. If the stored hash and blob hash ever don't match on boot, wipe the keys.


How can they simply "push an update"? I've never seen iOS auto-update without first prompting the user, which I'm assuming is a very intentional limitation.


Probably by using DFU, and uploading the image via the lightning cable.


Except that wipes the device.


The Air Force and the FBI are different organizations with different missions. I suppose they are both under the executive branch.


"From what I understand Tim is doing, and I greatly admire, is trying to avoid a judicial requirement that they be able to do this on demand. The so called "back door" requirement, because he knows, as others do, that such a feature would be used by more than the intended audience, and for more than the intended uses, to the detriment of Apple's users."

To be fair - the only reason he's doing it is because it would cause a significant drop in sales for Apple devices. People overseas would immediately stop buying because "The American government is listening", and that's assuming countries like China and Russia wouldn't ban them outright.

This is the big thing American politicians are missing or glossing over in their campaigns to get re-elected: Forcing American companies to compromise their products will result in a significant loss of revenue overseas. Microsoft, Google, et al, have already reported it and foreign governments have already started banning goods/services (due to the Snowden revelations).


"To be fair - the only reason he's doing it is because it would cause a significant drop in sales for Apple devices."

That's not being fair at all. To say the only reason he is doing it is to protect iPhone sales doesn't speak to Tim's character. Of course he cares about sales, but he also cares about privacy.


It's definitely one of those rare times doing the right thing is also the most profitable thing.


Doing the right thing is very often (if not almost always) the most profitable thing. It's very difficult to make a business out of serving your customers poorly; if you disagree, give it a shot, and let me know how it turns out for you.


Serving your customers well is good for profit, but it is not the same thing as doing the right thing. Ad-based companies have customers and users. By serving their customers too well they easily screw over the users, with obnoxious and even unethical ads.


I've got a set of special hex screws and soldered-on RAM to talk to you about...


That seems like the classic mistake of believing that the larger market assigns importance to repairability. I haven't seen much evidence this is true.


It's not necessarily the most profitable thing. Apple is picking a fight with a very big adversary. This takes some serious backbone.


I promise you, even a protracted legal battle is far cheaper than significant global sales losses.


Normally people on HN are much more skeptical about airy promises and assertions from corporate executives. I don't see what behavior on Tim Cook's part has indicated he's more to be trusted than anyone else.


Tim Cook was asked at a shareholder meeting to only do things that were profitable. He passionately rejected the idea, and named accessibility for the blind, environmental issues/climate change, and worker safety as areas where Apple invests because it's right, without considering the "bloody ROI". [1]

Compare to GE, which rolled over [2].

So I believe Mr Cook when he says his opposition to the FBI's request is rooted in a desire to do the right thing, and not the bottom line.

[1] http://www.macobserver.com/tmo/article/tim-cook-soundly-reje...

[2] http://www.nationalcenter.org/PR-GE_Climate_Change_022114.ht...


Just to sharpen your comment, here's the summary of that interaction with a shareholder:

[Cook] didn't stop there, however, as he looked directly at the NCPPR representative and said, "If you want me to do things only for ROI reasons, you should get out of this stock."

That's a very blunt statement that the immediate stock valuation is not Cook's only consideration.

Some people seem to have a hard time taking Cook at his word, but he's been quite consistent. This massive skepticism feels more like nostalgie de la boue than anything based in facts.


I think the only problem here is the strongness with which it is worded. It would be fair to say one of the reasons he is doing it is because of stock price, but to assert the only reason he is doing it is to cast him as completely uncaring about security and privacy. Without evidence to back this up (in this case, evidence that he doesn't care about security and privacy), it's an attack on his character. It's entirely possible his reasons include both, and that the moral sentiment behind the message is entirely truthful.

We should be careful about plainly stating what someone else's motivations are when it contradicts their own story.

Edit: s/it's/his reasons include/ for clarification.


He's an older gay man; I would be shocked if this didn't greatly influence his opinion in this realm. He's lived through some times that weren't too friendly to those of "his kind" who were open about it.


He's pretty much said so:

We still live in a world where all people are not treated equally. Too many people do not feel free to practice their religion or express their opinion or love who they choose. A world in which that information can make a difference between life and death. If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy, we risk something far more valuable than money. We risk our way of life.

See pages such as http://qz.com/344661/apple-ceo-tim-cook-says-privacy-is-a-ma...


Growing up a target sure changes the way you look at everything.


Not more to be trusted than anyone else, perhaps. But the claim was, "the only reason he's doing it is because it would cause a significant drop in sales for Apple devices" (emphasis added). I can trust him no more than I trust anyone else, and still not be totally cynical as to his motives. They're mixed, they're not completely altruistic, but they're (probably) not completely mercenary, either.


"more to be trusted than anyone else"

Cook's responsibility is first and foremost to the stockholders, and secondarily to the customers. Decrypting the iPhone would seriously compromise the security of Apple's products, gravely damage the company's credibility, hurt sales, and drive the stock price down.

No CEO is going to take such a drastic step unless they are a craven, cowardly type who meekly obeys ask-for-the-sky demands from overbearing federal law enforcement types, and Cook surely did not rise to his current position by being a pushover.

That's not to say there won't be some kind of secret deal made behind closed doors, but secrets tend to get out. Apple would not be so foolish, I think. Yahoo? Microsoft? They just handed over the keys to their email to anyone who demanded it -- the Chinese government, the NSA -- but Apple has no history of this type of behavior. Surely Snowden would have revealed it if they had.


Why would it cause a significant drop? Where would those people go?


Not saying I necessarily agree (or disagree), but the premise is that right now iOS phones are the only ones that do security right, out of the box. They're worth the premium price for that feature. As soon as they no longer have that edge, there's no reason to choose them over any commodity Android device, so sales would drop as they lose their differentiating feature.


Honestly, I doubt that a significant number of people is buying the iPhone for its security features.


Yeah, I don't think so either. Just explaining the parent's line of reasoning.


Just like people buying Beats headphones for sound quality (they're not).

People that really care about security don't use smartphones.


Not that they're more intuitive to use, have a higher quality app ecosystem, and "just work" for most people.


They'd go to Android. Apple only has a significant share of mobile users in the USA. Most other countries they are losing (globally they have something like 8% vs Android's 85% market share).


>Most other countries they are losing (globally they have something like 8% vs Android's 85% market share).

I'm not sure this is true once you factor in price range. The general knowledge is a lot of those Android devices are sub-$250.


> Most other countries they are losing

Maybe by share of units shipped. By revenue share they dominate, and their margins are estimated to be very good.


You're probably right, but it's great to explain this decision in economic terms. It'll sink through to people who think privacy is only for "good" actors, and those who don't like the government hurting businesses.


Tim answers to shareholders. Shareholders look at the bottom line and not his character.


This is a good way to think about things skeptically, but to say it as though it is fact is misleading. Is it that unlikely that a business professional can frame his beliefs and knowledge in this domain in such a way to justify himself to the shareholders? Or are we all just too skeptical that people in power act against morality whenever easy or possible?


Shareholders, which includes Tim Cook, all get to choose what they care about. Humans have complex motivations.


Let's suppose you can see in to Tim Cook's heart and this is true.

So?

I understand wanting to know people's motivations, from both the perspective of predicting future action and just because we're nosy monkeys. But frankly, what's in Cook's heart doesn't matter. Actions do. And to date, in my view, he's done pretty much exactly the right thing on this issue all along.

Maybe he's defending customer privacy because he believes the Lizard People have religious objections to invading until all humans have Freedom of Math. It simply doesn't matter.


Being the vocal advocate to stand up to the gov't IS risky. The wording is carefully crafted to prevent spin damage (pro-terrorist, anti-law enforcement).


> the only reason he's doing it is because it would cause a significant drop in sales for Apple devices. People overseas would immediately stop buying because "The American government is listening", and that's assuming countries like China and Russia wouldn't ban them outright.

That hasn't happened with other devices or earlier iPhones that aren't as secure.


I don't see how this is "reassuring"; to me it's rather confusing (as mentioned in many other comments).

If Apple could in fact write a software backdoor, doesn't it mean that the backdoor exists, at least potentially?

And how can one be sure that Apple is the only company able to build that door? At the very least, couldn't the right Apple engineer be either bribed or forced (by terrorists or the government) to build it?

"Impossible" should mean "impossible", not "not yet done, but possible".


What it means is that the best the FBI can come up with is "Make a way for us to brute force attack the passphrase." And brute force attack is worthless for a strong enough passphrase. That's what is reassuring.

Not to mention that this is for the iPhone 5c. As other comments have mentioned, newer iPhones have the hardware-based Secure Enclave which add to the difficulty of breaking into the phone. https://www.apple.com/business/docs/iOS_Security_Guide.pdf


My reading of that PDF is that Secure Enclave software can be updated too; it simply does an independent verification of Apple's digital signature.

So while the Secure Enclave enforces the delay between brute force attempts, Apple could still release an update that removes that delay.


It could also be that the FBI has already obtained the info, or can obtain the info through illegal means, and simply wants to use the issue to force legislative, policy, or public opinion changes in their favor, and/or just wants to legitimize their having the information.


In the world of cryptography, it is always possible, because you can always be lucky and guess the right "unlock" code. In fact, social engineering is normally used to find the right "unlock" code[0].

The FBI can also unsolder the components in the phone, make a full image of the content, find the encrypted section and then brute-force. This is what is done for SSDs. They do not power up the drive; they unsolder it, put the memory modules in a special reader, and copy the data before the SSD's controller automatically wipes out data due to automatic optimization after a delete/trim.

[0]: https://xkcd.com/538/


It's kind of hard to social engineer dead people, though.


This is why I said "normally" in my sentence. But the point I really wanted to make is that there is no "impossible" with encryption.


Dead people were once alive. Go through their pockets or their apartment.


You can go through each and every physical object I own or was ever in contact with, but you won't find any of my passwords.


What happens to all your stuff when you die?


I guess my family will take care of my physical stuff. For the online part, some of it can probably be handled through support (facebook, etc...), and the rest will stay as is until it is deleted for lack of use. Or never deleted. Both are okay.

Leaving a physical trace of my passwords is not only bad practice from a security point of view, but quite useless since I know them. Also, my online accounts are useless if I can't use them because I'm dead, so I don't really care if no one can access them anymore. What happens to important things such as banking is already dealt with.


I just got a Facebook birthday notification for my cousin, who died three years ago. So, that has some impact on me and others in our extended family. Maybe that's ok, maybe it'll get weirder at some point; but it's definitely something to think about.


You can have FB set the page as a memorial, and I believe that also gives you the option to disable all those types of notifications.


That's a very real issue indeed. Sorry for your loss.

But I think the right solution here would be for Facebook to have a way of handling deceased people, not giving your password to everyone in case of sudden death.


They do. If someone contacts Facebook and lets them know that a person has died (with some sort of verification), they will memorialize the account.

https://www.facebook.com/help/103897939701143


If facebook doesn't want to delete his account, why don't you unfriend him?


You can have the account memorialized (and/or removed): https://www.facebook.com/help/150486848354038


Would you?



Why care about "stuff" once you are dead?


Because you don't want to make a difficult situation even harder for your relatives?

See, for example, the people who know they're going to die and who leave their iPads to their relatives in their wills. Apple doesn't take grants of probate as sufficient legal documents (everyone else does, e.g. banks) and insists on a court order.

http://www.bbc.co.uk/news/technology-26448158


Significant other? Kids? Relatives?


Well your opinion is kind of moot at that point.


With a $5 wrench? What's that going to get you?


The media already did that :P


From a forensics point of view that is a very risky path. You risk destroying the evidence and you're tampering with it.


It's not a backdoor, it's a frontdoor. In cryptography, there's no way to make repeated attempts more computationally expensive. The lockout is just an extra feature Apple put on, one that Apple could easily remove. If we're going to have 4- and 6-digit PINs, there is no way to stop a dedicated attacker from brute-forcing it. None.


You can't make crypto behave slower on repeated attempts but you can still make each attempt more expensive. For example: https://en.m.wikipedia.org/wiki/Pepper_(cryptography)


True. But Apple, with such a focus on UX, cannot reasonably afford more than ~200 millisec when checking a password; and still it scales linearly, so the solution for concerned users still involves creating a more complex password. Doubling the amount of time it takes to hash a password will have the same effect as adding 1 more bit of entropy to the password, which can easily be beaten by adding a single character to it.
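The arithmetic behind that trade-off:

    import math
    # Doubling the KDF work factor buys the same as one extra bit of entropy;
    # one extra lowercase+digit character buys log2(36) bits.
    print(math.log2(36))    # ~5.17 bits, i.e. one character ~= a 36x slower KDF
    print(2 ** 5.17)        # ~36: the work-factor multiplier a single character is worth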


If you consider caching of keys, there's no reason that the first login attempt after a cold boot couldn't take 1-2s. Each subsequent login would be roughly instant.


"there's no way to make repeated attempts more computationally expensive"

That's not true actually. For example, the industry standard for storing passwords on a server (bcrypt) is specifically designed to slow down password match attempts.


It is true. You're confusing making _repeated_ attempts progressively more expensive with making all attempts more expensive to start with


Ah yes. You are right, I was confusing those two things. Thanks for the clarification!


Bcrypt isn't an industry standard.


  > no way to stop a dedicated attacker from brute-forcing it
Wipe after x incorrect? Can't stop the attacker, but you can make it futile, surely.


Nope. If the attacker (Apple in this case) can replace the OS, they will just do so before the phone gets wiped—replacing the OS will remove that wipe feature.


Not if the check and wiping are done in hardware, as claimed by Apple for newer devices than the one in question here.


Meh. Then the attacker can simply replace the hardware. Remember, our attacker model is Apple; non-cryptographic security measures mean very little to a company with such complete knowledge of the hardware and software involved.


Nope. On newer devices the key is derived from a random key fused into the SE during manufacturing, a key fused into the ARM CPU, and a key randomly generated on-device during setup (derived from accelerometer, gyro, and altitude data) and stored in the SE. The SE's JTAG interface is disabled in production firmware and it won't accept new firmware without the passcode.

You can't swap the SE or CPU around, nor can you run the attempts on a different device.
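Very roughly, the effect the parent describes would look something like the following (a hypothetical construction for illustration; the exact KDF Apple uses isn't public at this level of detail):

    import hashlib

    def derive_unlock_key(passcode, se_uid, cpu_key, setup_secret):
        # Combine the device-bound secrets; none of them ever leave the silicon.
        device_secret = hashlib.sha256(se_uid + cpu_key + setup_secret).digest()
        # Entangle the passcode with the device secret, so every guess has to
        # run on this specific device rather than on an offline cracking rig.
        return hashlib.pbkdf2_hmac("sha256", passcode, device_secret, 100_000)

    key = derive_unlock_key(b"1234", b"se-uid", b"cpu-key", b"setup-secret")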


Can't you? Seems like the kind of problem you can point an electron microscope at, and perhaps some very high precision laser cutting. In any case, I imagine if you are willing to spend resources on it, you could read the on-chip memory somehow and start cryptoanalysing that.

Against a sufficiently capable adversary, tamper-resistance is never infallible, but good crypto can be.


    > Against a sufficiently capable adversary, tamper-
    > resistance is never infallible, but good crypto can be.
Nonsense, it all comes back to "sufficiently capable", every time.

To a sufficiently capable adversary, _all_ crypto is just "mere security by obscurity".

"Oh, mwa-haha, they obscured their password amongst these millions of possible combinations, thinking it gave them security - how quaint. Thankfully I'm sufficiently capable.", she'll say.


The point is that the key is stored there too (part of it burned into the silicon during production) and can't be read or changed.

Sure, if they wanted to they could implement a backdoor. But assuming they correctly created and shipped the secure enclave it shouldn't be possible to circumvent it even for Apple.


It's sounding like that's the problem. They left an opening for this sort of thing by allowing firmware updates to the secure enclave. That basically makes it a fight over whether the FBI can force Apple to use the required key to sign an update meeting the specifications the FBI directs.


Well, I read elsewhere in this thread that updating the firmware for the secure enclave wipes the private key contained within. Which means you've effectively wiped the phone.


You need to unlock the phone before you can update the firmware on the secure enclave.


Secure Enclave is not really ‘hardware’; despite being isolated from the main OS and CPU, it is still software-based and accepts software updates signed by Apple.


If those software updates force it to erase its private key store, though, then it's functionally isolated from that attack vector. An updated enclave that no longer contains the data of interest has no value.


Ok, if that part is updatable you have indeed a backdoor.

In theory it should be possible to make it fixed (which Apple doesn't seem to have done).


Making it fixed just means you can't fix future bugs. The secure approach is to ensure that updates are only possible if the device is either unlocked or wiped completely.


The Secure Enclave enforces time delays between attempts.


Not possible without baking it into iOS with some update that the FBI claims will be used on one particular phone in this particular case.

Apple reasons that there is no way to guarantee that no one will take the same update and apply it to other iOS devices. Or the government could take this a step further by making Apple build it into a future update for the whole user base.


It's not a backdoor to the phone only being unlocked by the passphrase, but a backdoor to the number of attempts limitation.


This limitation would have to be built into the security hardware used by the iPhone so that software couldn't do anything about it. I was under the impression that that's how the iOS security model works. If it's not, and this check is in fact implemented in iOS itself, it's much weaker protection and really looks like an intended backdoor from Apple.


It sounds like it is built into hardware with newer iPhones containing the secure enclave, but not for an older phone like the iPhone 5C.


It's not really built into ‘hardware’, it's enforced by the Secure Enclave, which is software-based and accepts software updates signed by Apple. It's secure against kernel exploits and third-parties, but not against Apple.


I'm really interested to know more about this. Does the Touch ID secure enclave really enforce the password attempt limits?



It's really a pretty impressive design. Android phones are lacking here.


They didn't make any mention of how feasible it would be, just that they wouldn't even try because it would threaten the security of their users.


Let me correct it for you. "Perceived security of their users".

A lot of it is about PR.


> If Apple could in fact write a software backdoor, doesn't it mean that the backdoor exists

Schroedinger's Backdoor? ;)


Only Apple has the ability to sign updates to software (barring jailbreak).


It sounds like it'd be trivial to nop out the timer on repeated passcode attempts. Which makes sense... Leaving any short passcode trivially crackable.


Yes but like where would you nop? You can't statically analyse the code because the image is encrypted at rest (and potentially partially in ram also?)


The code which decrypts the system (and is responsible for wiping the drive on repeated failures) is definitely not encrypted. How would it be able to take the input in order to decrypt the drive?


Yes, but that code is all running in RAM, precluding static analysis. You can still dynamically analyse it, but that is much harder.

The way I understand it (and correct me if I'm wrong) is that the code flows from disk through the AES engine, where it is decrypted, and is then placed in a presumably interesting/hard-to-reverse place in RAM, at which point it is executed. I imagine even more interesting things are done to higher-value data in RAM, but that's not code - because, as you said, code has to be decrypted (at the latest) by the time it reaches the registers.


Their security PDF says that the system has a chain of trust established, anchored at an immutable loader residing inside the chip, and each step verifies the digital signature of the next step against the hardcoded Apple CA certificate.
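Conceptually, that chain of trust is just a loop (a sketch with a stand-in verifier; the real check is against the Apple CA certificate baked into the Boot ROM):

    def boot_chain(stages, verify):
        # stages: ordered (image, signature) pairs, e.g. LLB -> iBoot -> kernel.
        for image, signature in stages:
            if not verify(image, signature):
                raise RuntimeError("signature check failed; refusing to boot")
        return "booted"     # every stage was verified before handing over control

    # Toy demo with a stand-in verifier:
    print(boot_chain([(b"iBoot", "ok"), (b"kernel", "ok")], lambda img, sig: sig == "ok"))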


So to clarify you're saying that you can't just nop instructions, or it's not as simple as that, right?


I agree. I was under the impression that Apple's security was such that even they didn't have the power to decrypt a device because the crypto made it impossible without the password/pin/key. I'm interested to understand the reasons that it was not done this way.


The current implementation is done this way indeed: even Apple cannot decrypt without using the right password.

The right password can be obtained by either knowing it, or by guessing it.

As an additional security measure, the software shipped with the phone prevents brute force attacks by wiping the device after a given number of failed attempts.

Apple has been asked to modify the software so that it won't wipe the phone, thus allowing the authorities to try many passwords.

If anybody could circumvent this additional security measure, actual security would be lower. The authorities are not asking Apple to ship this change to all users: they only want to install it on the device in their possession.

However, Apple is concerned that once they provide the authorities such a modified software, it could be leaked and thus be used by third parties to breach the security of any Apple device.

It should be noted that currently all data encrypted, for example, on your laptop's hard drive is already subject to this kind of brute force attack. It's a well known fact that authorities or malicious users can already attempt brute force attacks on encrypted data if they can access the data on a passive device such as a hard drive.

It's important to understand that Apple (at least not in this case) is not being asked to implement a backdoor in the encryption software. It's also important to understand that even if Apple was forced to install a backdoor, it would affect only the ability to access future data and not help the investigation of the San Bernardino case.

This very request by the Authorities suggests that Apple does not currently install any backdoor on stock phones.

However there is a logical possibility that the Authorities are either not aware of any such backdoor or in the worst case they are publicly requesting this feature just to hide the real reason of a possible future success at decrypting the phone: they can claim that Apple didn't have any backdoor, and they were just lucky at bruteforcing the device; in fact Apple wasn't even cooperating with them at relaxing the brute force prevention limit, so they could claim they did it in house.

(I'm personally not inclined to believe in such improbably well coordinated smoke and mirrors strategies, but they are a logical possibility nevertheless).


Why doesn't the FBI simply clone the current device, make brute force attempts and then clone again if locked out? Yes, lots of work but also doesn't force Apple to participate.


I thought @csoghoian's take was interesting. The FBI doesn't want one specific phone unlocked, they want precedent.

https://twitter.com/csoghoian/status/699841360963108864


The iPhone uses AES encryption which would prevent cloning the flash storage [1]. There was an informative discussion of this over on AppleInsider - http://forums.appleinsider.com/discussion/191851

[1] http://www.darthnull.org/2014/10/06/ios-encryption


From that discussion, super informative details on how the secure enclave is actually implemented:

http://forums.appleinsider.com/discussion/comment/2832533/#C...


Would it be infeasible to construct part of the private key out of hardware-specific id's + time-of-creation hashes? I assume it's not only the PIN?


That's what the A7 (iPhone 5S and later) design does:

“Each Secure Enclave is provisioned during fabrication with its own UID (Unique ID) that is not accessible to other parts of the system and is not known to Apple. When the device starts up, an ephemeral key is created, entangled with its UID, and used to encrypt the Secure Enclave’s portion of the device’s memory space. Additionally, data that is saved to the file system by the Secure Enclave is encrypted with a key entangled with the UID and an anti-replay counter.”

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The device in question is an iPhone 5C, which uses the older A6 design.


Thanks for the link! I knew there had to be more technical information out there but couldn't find it on an initial search.


Yeah, it wasn't exactly unknown before but it wasn't terribly common outside of certain security / compliance circles. I think I've seen more links today than in the previous year.


Each device has a device-specific AES key (UID) burned into it such that you cannot clone devices (or move flash chips between them).

Everything is encrypted with a derivative of this UID, and extracting the UID is not a thing you can do without destroying the device.


I'm really curious as I have been hearing this many times.

"even Apple cannot decrypt without using the right password."

Could you please explain?


You can read more about it in this well written article: https://www.mikeash.com/pyblog/friday-qa-2016-02-19-what-is-...


It sounds like it has been built that way - the data cannot be decrypted without the correct passphrase - but the user secured the phone with a 4-digit PIN, giving a mere 10,000 possible combinations - easily brute-forceable.

The iPhone prevents this by locking up (and potentially erasing the phone) after 10 failed attempts, but this is a restriction created in iOS. If they provision a new backdoored version of iOS to the phone, that restriction wouldn't apply any more, and they could brute-force away.
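To put a number on "brute-force away" (using the ~80 ms per attempt figure from Apple's own security guide):

    # 10,000 four-digit PINs with the retry limit and wipe removed:
    print(10_000 * 0.080 / 60)   # ~13 minutes, worst case, on the device itself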


To clarify, on A6 and earlier this is enforced in software. On A7 and later this is enforced in hardware.


I think they're being asked for a few things—private keys and the ability to automate and not limit passcode entry attempts.


But why would you even think Apple, Google, or Facebook would be a good bet to defend your privacy in the first place? They have a terrible track record of not caring about it.

If you have things that need to be private, don't put them on a smartphone.


I wish people would stop lumping Apple with Google/Facebook with regards to privacy.

Apple has implicitly for a long time, and lately much more vocally, cared about privacy. They don't have the same data-driven business model that Google and FB do.


> Apple has implicitly for a long time, and lately much more vocally, cared about privacy.

They say that. But with closed source software we can't verify that it's true. I'm not saying they don't care about privacy, only that we don't really know if they do or not.


With open source software, it doesn't appear that people can verify things are safe either given the long-term security issues with things like OpenSSL et al.


We found the bug in OpenSSL BECAUSE it was open source. If it weren't, nobody would have seen it.

Plus, with open source you can verify intent, which you can't with Apple.

Which provides a device collecting your fingerprints, all your phone numbers, internet searches, bank details, some payments, network communications, voice communications, text communications, and your location via GPS, WiFi, hotspots, and phone towers, with iHealth devices collecting body metrics soon to come.

And they are profit oriented, not people oriented.


> We found the bug in OpenSSL BECAUSE it was opensource.

Sure but they were there for years before anyone noticed. Same with PHP's Mersenne Twister code. Same with multiple other long-standing bugs. It's disingenuous to toss out "Oh, if only it was open source!" because reality tells us that people just plain -don't- read and verify open source code even when it's critical stuff like OpenSSL.


I never said they could. There is a better chance that they can, but that line of thinking ends up trying to prove a negative.


Actions speak louder than words. The most revealing test of the strength of a company's commitment to privacy is how it handles situations when privacy can conflict with profits. Privacy on the internet relies critically on browsers only trusting trustworthy certificate authorities. When CNNIC breached its trust as a certificate authority last year, Apple sat tight waiting for the furor to subside (https://threatpost.com/apple-leaves-cnnic-root-in-ios-osx-ce...).


I would argue that handling security problems in general has not been Apple's strength historically.

I agree that failing to fix a problem like this in a timely fashion is bad, but sins of omission are generally judged differently than sins of commission, for better or worse. Apple failing to apply proper prioritization to security holes isn't the same as Apple collecting data to be sold to the highest bidder.

So, again, Apple should not be treated as equivalent to Google and Facebook. Feel free to judge them harshly, but don't paint them with the same brush.


"...and disabled Touch ID - which is hackable with a bit of effort to get your fingerprint."

As long as we're on the topic of encryption, phones, and law enforcement it's worth keeping in mind that in the US at least courts can compel you to unlock your phone with Touch ID, even though they can't compel you to give them a password. Communicating a password is considered speech, so self-incriminating speech is protected by the fifth amendment. Physically holding your finger to a device is not considered speech and so it's not protected.

I think this is an interesting, and perhaps underappreciated, aspect of a shift from passwords to biometrics for verifying identity. It would shift the power dynamic between civilians and government a bit - here in the US at least.

Of course, hopefully no one is in a situation where they need to protect themselves against over-reaching or unjust government officials any time soon.

http://www.engadget.com/2014/10/31/court-rules-touch-id-is-n...


When entering any questionable law enforcement situation (TSA, walking near a protest, traveling internationally) I always switch my phone to not use TouchID. Say what you will.


Absolutely. I'm quite amazed they had the guts to go through with this and I applaud it. I will support them with my dollars as much as possible.


Talk is cheap and these internet posts - from Tim Cook, you, and me - are just talk. The security of this nation depends on Apple (and Google et al.) supporting us with its dollars as much as possible.

And I'm not optimistic that the stockholders care about anything more than doing the opposite.


Talk is not cheap when you have to back up such talk with a substantial legal defence.

And any stockholder with an ounce of intelligence will understand that Apple's choices here are both morally right and effective marketing. As the information age matures, privacy is becoming a valuable asset, and Apple is starting to gain a positive reputation in this space.


Totally agree. As someone who is currently an Android fan, I find that Apple's apparent commitment to privacy is appealing, and it could eventually mean I buy an iPhone. Assuming I'm not the only consumer who has made this connection, then Apple has very legitimate business reasons for continuing to trumpet their commitment to privacy.


>And I'm not optimistic that the stockholders care about anything more than doing the opposite

AAPL is one of the most mainstream, widely-held stocks on the planet. Assumptions about the opinion of some homogeneous "stockholder" are useless.

I'm positive HN is filled to the brim with AAPL shareholders who care deeply about this issue.


Some people already complain that the iPhone is too expensive, especially compared to non-equal Android phones. Not enough people complain about any company not fighting for their users' security.

Talk is all we need so far. Publicly saying no to the FBI is a step rarely taken.


I am but one data point, but I recently had the opportunity, as a stockholder in a fund, to cast a vote on a shareholder-led proposition. Voting in favour of the proposition would have removed a set of investment choices that the fund managers could pursue on our behalf. The fund managers (I forget the exact terminology of what they were - maybe the word "board" was used) recommended that shareholders vote against.

I voted for, and to my surprise so did a majority of other stockholders, and that avenue of opportunities was removed.


What it sounds like is they've been asked to prepare a new OS release that allows an unlimited number of attempts to enter the passphrase via some network link. The press release is written to sound like without a software release, it wouldn't be possible to mount this kind of attack, however attacks like this are generally possible regardless of having some specially modified and signed OS image: for example, by cutting power to the hardware precisely when it is clear a password was incorrect, before the hardware has time to implement any destructive actions. Attacks like this have been used against SIM cards since the 90s.

I'm ambivalent regarding Apple's stance. In principle they are doing the right thing, but in practice, it seems they may be kicking up a whole lot of fuss over a relatively minor issue (with the exception that providing an easy means to brute force a phone to the authorities sets a horrible precedent). As for creating a universal backdoor, it seems highly unlikely they couldn't produce a signed OS / coprocessor firmware image locked to one of the various serial numbers associated with this particular device.

edit: as mentioned below, this order entirely originates with Apple's use of DRM to prevent software modification. Had users actual control over the devices they own, the FBI wouldn't need to request a signed firmware in the first place. Please think twice about what Apple might really be defending here before downvoting.


> with the exception that providing an easy means to brute force a phone to the authorities sets a horrible precedent

This is the entire concern (in my opinion and in my reading of Tim Cook's opinion). If the government can force Apple to backdoor this one iPhone (because terrorist), then they can force Apple to backdoor any iPhone for any person given a valid warrant, subpoena or otherwise granted power. Once the flood gates open...


It's worse than that. There's no guarantee that "the government" is "your government".

Imagine this scenario:

1.) Apple creates the custom iOS build for the FBI to use to decrypt this iPhone.

2.) China hacks into either Apple or the FBI and downloads this build. (We know they have the capability, because it's already happened. [1])

3.) A visiting U.S. diplomat, politician, or military officer has his iPhone pickpocketed while in China. (This also happens all the time.)

4.) The Chinese government uses this stolen software to brute-force the encryption on the device, finding access codes for classified U.S. military networks. (Because we know U.S. diplomats never use their personal email for state business [2], right?)

5.) Now a foreign power has access to all sorts of state military secrets.

The problem with backdoors is they let anyone in. Right now, there's a modicum of security for Apple devices because knowledge of how you would bypass the device encryption is locked up in the heads of several engineers there. The FBI is asking Apple to commit it to source code. Source code can be stolen, very easily. Tim Cook's open letter is making the point that once this software exists, there is no guarantee that it will stay only in the hands of the FBI.

[1] https://en.wikipedia.org/wiki/Operation_Aurora

[2] http://graphics.wsj.com/hillary-clinton-email-documents/


> knowledge of how you would bypass the device encryption is locked up in the heads of several engineers there

    WARNING — THIS Apple Engineer IS CLASSIFIED AS A MUNITION
    --rsa--------------------------------8<-------------------------------------
    #!/usr/local/bin/human -s-- -export-a-crypto-system-sig -RSA-in-3-lines-HUMAN
    ($k,$n)=@ARGV;$m=unpack(H.$w,$m."\0"x$w),$_=`echo "16do$w 2+4Oi0$d-^1[d2%
    Sa2/d0<X+dLa1=z\U$n%0]SX$k"[$m]\EszlXx++p|dc`,s/^.|\W//g,print pack('H'
    ,$_)while read(STDIN,$m,($w=2*$d-1+length($n||die"$0 [-d] k n\n")&~1)/2)
    -------------------------------------8<-------------------------------------
    TRY: echo squeamish ossifrage | rsa -e 3 7537d365 | rsa -d 4e243e33 7537d365
    FEDERAL LAW PROHIBITS TRANSFER OF THIS APPLE ENGINEER TO FOREIGNERS


That is the whole point of warrants. If Apple had made the iPhone with an unlocked bootloader, this would have been truly impossible.


It used to be possible to do what you describe (cutting power at the right moment), and there were even physical brute-force devices built to exploit it, but it has all been hardened in iOS 8 and there are no currently known ways to brute-force a passcode. Obviously a software bug allowing it might theoretically exist, but there's no information around, and it sounds like even the FBI couldn't find one if it exists.


Could you elaborate on how that's prevented? I'm quite curious. [Or did someone already explain in some other post somewhere in this thread?]

edit: self-answer: the following post seems to have an answer: https://news.ycombinator.com/item?id=11115579, although it seems to describe a newer device than the one in the case; but I was interested in how such protection is possible at all, so that seems to answer it for me.


> they may be kicking up a whole lot of fuss over a relatively minor issue

I strongly disagree. They are taking a stance in the debate about government mandated backdoors in software.


Did you read the release? They are up front that it's entirely an issue about setting a bad precedent. It's completely and totally about the fact that it would be used over and over again, and nothing to do with the fact that it is possible. Your overly cynical stance on this is misguided, as you seem not to have grasped the information in the letter.


The court order says that the software must only work for the specific device in custody. Apple is not supposed to create a general tool.


If you believe that the FBI wouldn't abuse a tool like that after the past few years of coverage of the security sector, there is really very little hope for you.


Personal attack much?

The order says that Apple's exploit should only work on this specific, already existing device.


The problem is with the legal precedent this would set. Although right now it's just limited to one specific use case, this ruling could and would be used in the future to require Apple (and other tech companies) to compromise security in ever increasing scope.


> sets a horrible precedent

That's a large part of the fuss!


Hypothetically, it should be relatively simple to prevent a power-off-dodges-destructive-action attack, by simply making the operation (incrementing and storing the attempt counter, checking the password) an atomic operation.

So they would still get 10 bites at the cherry, and sure, on the tenth, they could depower the phone and prevent the wipe, but if each attempt is persistently stored before the password-check is carried out, depowering the phone wouldn't give them any more chances.
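Roughly, the idea looks like this (a toy Python sketch with made-up names, not Apple's implementation): the failure counter is committed to non-volatile storage before the guess is checked, so cutting power between the check and any destructive action can't win back an attempt.

    import hashlib, hmac, os

    MAX_ATTEMPTS = 10

    class NVStorage(dict):
        """Stand-in for non-volatile memory."""
        def sync(self):
            pass  # a real implementation would block until the write hits flash

    class PasscodeGate:
        def __init__(self, storage, passcode_hash, salt):
            self.storage = storage
            self.passcode_hash = passcode_hash
            self.salt = salt

        def try_unlock(self, guess):
            attempts = self.storage.get("attempts", 0)
            if attempts >= MAX_ATTEMPTS:
                self.storage["wiped"] = True   # keys destroyed; nothing left to brute force
                return False
            # Commit the pessimistic count first: a power cut from here on
            # still costs the attacker this attempt.
            self.storage["attempts"] = attempts + 1
            self.storage.sync()
            guess_hash = hashlib.pbkdf2_hmac("sha256", guess.encode(), self.salt, 100_000)
            if hmac.compare_digest(guess_hash, self.passcode_hash):
                self.storage["attempts"] = 0   # success resets the counter
                self.storage.sync()
                return True
            return False

    salt = os.urandom(16)
    gate = PasscodeGate(NVStorage(), hashlib.pbkdf2_hmac("sha256", b"1234", salt, 100_000), salt)
    assert gate.try_unlock("0000") is False and gate.try_unlock("1234") is True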


The government wants Apple to disable the auto-erase after so many unlock attempts. Apple argues in this letter that with modern computing power, this amounts to a backdoor.

The details of the gov't request are in another story on the HN front page

https://www.techdirt.com/articles/20160216/17393733617/no-ju...


The encryption key used on the root filesystem is too hard to brute force. It's not based on some crappy password that a human created, it's some hash value stored in the hardware. In a scenario like that it is easy to create a key that would require all of the computers working till the heat death of the universe to crack.
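Back-of-the-envelope arithmetic (assuming, generously, an attacker who can test a trillion keys per second) shows why nobody attacks the raw key:

    GUESSES_PER_SECOND = 1e12        # generous assumption about attacker hardware
    SECONDS_PER_YEAR = 31_557_600

    print("4-digit PIN :", 10**4 / GUESSES_PER_SECOND, "seconds to exhaust")
    print("256-bit key :", 2**256 / GUESSES_PER_SECOND / SECONDS_PER_YEAR, "years to exhaust")
    # ~1e-8 seconds vs ~3.7e57 years -- the PIN, not the key, is the weak point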


They're not trying to brute-force the encryption key. They're attempting to brute-force the PIN lock on the phone. Currently they can't because the settings on the phone cause a timeout after each unsuccessful login, the phone wipes after 10 failed attempts, and the PIN cannot be accepted from anywhere except the device display. These are the security functions that the FBI wants Apple to remove. They want to be able to hook up a computer via the lightning cable to brute-force the PIN and give them access to the phone's contents.


Like you, I appreciate the sentiment - happy to hear Apple speaking up. However, this shouldn't change how we use Apple products. I operate under the assumption that the device is compromised from the factory. Closed source software cannot be trusted; good faith is not enough.


I can't disagree with your point of view; I merely point out that a world where everyone who wants a secure smartphone needs to own their own smartphone factory isn't a very practical world.


One only needs access to the software source code, and presumably a way to verify the binaries match. Not an entire factory.


They also need to verify the hardware does what it's supposed to do, and the CPU doesn't have secret opcodes.


Considering how many massive, gaping security flaws have been found in Open Source software in the last year or so alone that have been in place for years or decades, I think we can say that open source software cannot be trusted either.


Open source software can be trusted as much as we can trust ourselves - and that's as good as trust gets. I'll take that any day over trusting one party with a vested interest in making money.


> here's hoping against hope that Google, Facebook, and Amazon get behind this

I would assume that Google, Facebook, and Amazon are already sending every single keystroke we type straight to all the three letter agencies pretty much in real time. (I also assume the same about Apple, so I'm not sure what to make of this open letter.)


Why would Google and Facebook get behind this? They store their customers' data in a way they can access, and subsequently have to give it to persecutors when there's a court order.


If government agencies have unfettered access to data, it undermines the reputation of these companies. For instance, in Europe, many companies and organisations are likely even more hesitant to use Google's services since the Snowden leaks. This is a large loss of potential revenue.

Google has been (even more) proactive about security and encryption since then, since part of their business model relies on trust.


> Why would Google and Facebook get behind this? They store their customers' data in a way they can access, and subsequently have to give it to persecutors when there's a court order

Exactly like Apple. Or do you think that the emails in iCloud are not given to the prosecutors?


Not exactly like Apple.

Google and Facebook's core competency is using your personal data to sell ads.

Apple's core competency is selling you appliances. Yes, they wind up with some personal data because of the services they also provide, but it's far less valuable to them than Google or Facebook.


> They store their customers data in a way they can access and subsequently have to give it to persecuters

What does your post have to do with this claim?


Yes, exactly like Apple. To quote the link: "When the FBI has requested data that’s in our possession, we have provided it."

The point I was trying to make is that Google and Facebook have direct access to all the data of their customers, and already provide access to government agencies. Unlike Apple, they don't store some of their customers' data safely on the device, which is what this case is about.


> The point I was trying to make is that Google and Facebook have direct access to all the data of their customers, and already provide access to government agencies. Unlike Apple, they don't store some of their customers' data safely on the device, which is what this case is about.

Your point is wrong regarding Google and smartphones if the smartphone is encrypted.


There's a huge distinction between Google (Android) and Apple (iOS) though: Apple affirms they don't have your keys, and this case bears that out (else the FBI would obtain the keys via subpoena to Apple rather than asking the court for a circumvention tool). Google is ambiguous about whether they have your Android keys; they claim they don't, however if you forget your device password it is possible to unlock your device via your Google account on a PC[1], and that alone is telling. If this were an Android device, the FBI would have already unlocked the phone with a simple subpoena.

Beyond that, Google definitely has the keys to your encrypted backups on their servers, so access to the phone might not even be necessary.

[1] http://visihow.com/Recover_Android_Device_in_case_of_Forgot_...


> For either company to unlock the device without the owner’s permission the smartphone or tablet must not be encrypted, according to the report.[0]

[0] http://www.theguardian.com/technology/2015/nov/24/google-can...


From your link:

"The situation is different for Android. Google’s version of Android, which runs on most Android smartphones and tablets in the western world, only implemented encryption by default with the latest version Android 6.0 Marshmallow released in October 2015."

That version of Android is only on a handful of devices, not even a full percentage point of global market share. Even on Lollipop and older devices that do support encryption, it has to explicitly be turned on by the user. And once again, Google is not expressly clear that they don't have your encryption keys on Lollipop and lower; they only claim not to have them for Marshmallow devices. They definitely have the keys to your encrypted data on their servers no matter what, which can include complete backups of your device.


> That version of Android is only on a handful of devices,

And?

> Google is not expressly clear that they don't have your encryption keys on Lollipop and lower;

They have explicitly said that if the device is encrypted they don't have the key.

> They definitely have the keys to your encrypted data on their servers no matter what, which can include complete backups of your device

Source for that?


If you can still access your backups after changing your password, you don't control the keys.

In other words, if you can ever actually use something, it's probably not secure.


Was going to point out that you spelt "prosecutors" as "persecutors", but then realised you might have genuinely meant that spelling!


I'm not a native English speaker and that actually was just a typo.


Don't worry about it. Your misspelling is actually quite apt!


Because their SSL certificate authority is going to be the next one to be legally compelled to sign something.


I'm afraid I'm too skeptical to get the same assurances as you.

Apple accuses the FBI of playing language games with the term "backdoor", but I think Apple has done the same. The fact that they can push weak OS updates to a locked phone is the backdoor. This means that they can already comply with the court order, and they likely will. This letter shields them from PR damage.


I'm not sure you can draw the conclusion that Apple can push OS updates to a locked phone.

What Tim Cook wrote is:

> "install it on an iPhone recovered during the investigation."

> "the potential to unlock any iPhone in someone’s physical possession."

So the FBI has the physical phone already. They can deliver it to Apple, who can disassemble it and either use a JTAG/Flash programmer on an internal connector to manually write new software, or desolder the Flash holding the old OS and fit a new one.

Both of these techniques are common enough in the embedded industry that I expect this is what Apple means. They probably can't push an OTA software update and force the install on a locked device.


The 5C at issue in this case does not have the modern secure enclave like the 5S and newer devices.

The newer devices run a special L4 kernel on the secure enclave. It is not updateable without providing the existing passcode. It enforces the attempt rate limiting and key deletion on too many attempts (if enabled). Special limited communication channels allow the CPU to talk to the SE. In production devices the SE has JTAG disabled. Encryption and decryption of the master keys happen inside the SE with its own private AES engine so even oracle/timing attacks on the main CPU are useless.

Why doesn't Apple just help hack this phone but wash their hands of newer devices and tell customers to upgrade? Because if the FBI and this court get away with using the All Writs act to compel Apple to write new software they'll eventually be forced to add a backdoor to SE-equipped devices too. Courts won't understand or care about the differences.

If the government forced them, Apple could insert a backdoor into the next major version of iOS or the hardware; then everyone inputs their passcode during the upgrade and the backdoor is deployed. Their primary defense against that so far (and the only real one you can have as a corporation) is to never build the capability in the first place. This judge's order is telling them to go build the capability (in theory for this one phone). The fact that you can't retroactively build the backdoor for 5S and newer devices isn't the main issue.

Better to fight every step of the way and draft as many pro-privacy people as possible into the fight to apply political pressure.


> Because if the FBI and this court get away with using the All Writs act to compel Apple to write new software they'll eventually be forced to add a backdoor to SE-equipped devices too. Courts won't understand or care about the differences

The whole point is that it doesn't matter what the court thinks if Apple cannot comply due to the laws of nature. That was their whole argument to begin with. Their argument now is pretty mushy in comparison.


"I'm not sure you can draw the conclusion that Apple can push OS updates to a locked phone."

The iPhone contains a SIM card.

A SIM card is a complete, general-purpose computer with its own CPU and RAM and the ability to run arbitrary Java programs that can be uploaded, without your knowledge, by your carrier.

You are owned. Deeply, profoundly, in ways that you have no way to manage/mitigate.

The real question, for me, is why authorities are dealing with Apple at all and not just working with the carriers who have proven to be their trusted allies.


> You are owned. Deeply, profoundly, in ways that you have no way to manage/mitigate.

The international legal framework of sovereignty basically says you are owned. (Not universally de jure, but pretty much de facto.) Whatever rights you have are effectively granted to you by your country. Unfortunately, this notion is seldom given any thought, and the current most visible proponents of such an idea are unpleasant angry underclass men using it as an excuse to behave badly. There are others who have given thought to this, however, and it is part of the motivation behind such things as The Universal Declaration of Human Rights.

https://en.wikipedia.org/wiki/Universal_Declaration_of_Human...


I think the answer to your question is embedded in your assumption: that updating the SIM card would be sufficient to recover data from this iPhone.

In my experience, law enforcement does not make their own jobs harder on purpose. If there is an easy way to get that data, they would use that way to get it.


With a sensibly-built phone, that SIM card does not have the ability to access anything of value on the device.


Is there a list of sensibly built phones available? I'd like to buy a phone where the modem and SIM do not have access to main memory (AIUI most phones use a single-chip SoC with a built-in modem).


What's the point of accessing main memory in a locked and encrypted phone?


Main memory is rarely encrypted, unless you have special security features in your CPU to do so. Only the disk is encrypted; main memory is vulnerable while running. Also see https://en.wikipedia.org/wiki/Cold_boot_attack

So you don't want any hardware to have access to main memory if it doesn't need to. For instance, you can use an IOMMU to ensure that devices can only access the specific areas the OS wants to allow them to DMA to/from, not all of memory.


> What's the point of accessing main memory in a locked and encrypted phone?

The phone isn't always locked and encrypted; for example, whenever the user is using the phone it's unlocked and decrypted.


The iPhone, for one.


> why authorities are dealing with Apple at all

I'd guess "security by obscurity". Just because they have the device rooted via SIM card doesn't mean they have available a signed build of a multi-gigabyte OS with most security libraries expunged.


One possible answer to your "real question":

I want to disclaim that this is pure speculation. I have no insider knowledge or indeed any particular familiarity with the institutions in question.

The FBI may want this authority and this precedent and think that this is a good chance to get it. They may say, "Well, the San Bernardino case is a high-profile case that may sway people, including judges, who would otherwise be less inclined to back our request. Who knows when the next nationally-publicized case will be in which the likely perpetrator carries an iPhone?" They may also believe that the current political climate is good for their case.

And they probably also believe that there's no harm in trying. If the courts rule against them, they haven't lost anything. If the courts rule for them, they get a brand new tool.


A sim card gets to send messages to the baseband in response to requests from the baseband. It doesn't have arbitrary memory access unless the baseband has really nasty bugs.


They need to break the boot trust chain to load unsigned code. Simply rewriting the flash isn't enough.


Why would the code be unsigned? If Apple wrote the backdoor OS, they could presumably sign it.


I totally misread the OP and thought it was talking about the FBI flashing it themselves, without Apple's help. Yes, of course Apple can sign it. I stand corrected but can't delete my comment.

To clarify, I agree that nothing they ask of Apple is technically impossible or even that difficult for Apple to pull off, probably via simple DFU without touching the flash at all.


Hence the need for a valid OS from Apple.


Why load unsigned code? Can't Apple sign it?


Or anyone else with the Apple key, such as the NSA.


The judge wouldn't need to ask Apple if that were the case.


Or the NSA doesn't want to reveal that they have Apple's code signing key.


True, but a judge wouldn't allow the NSA to decrypt with a stolen key either.


That wouldn't be valid evidence in court.


Pretty sure you can upgrade the OS on a locked phone if you have physical access to it.


Negative. You need the passcode.


If you lose the PIN on an iPhone you need to do a wipe and restore it from backup. You had better hope you remembered the backup password. You can't make a backup of a locked phone either.

The backup is probably easier to attack if you have it, since it doesn't have hardware imposed timeouts on password guesses. It may not be current however.


I'm not sure the backup would be much help even if you could break into it, I believe it's encrypted with the same method used to protect the keychain and is tied to the victim's Apple ID. I attempted to help a coworker restore their device with my Mac and couldn't because my iTunes was using a different Apple ID than the device and the device's backup.


There is no way to recover a phone if you lose the passcode?


If you have access to the iTunes account you can do a physical backup with iTunes and then erase and restore that backup. It won't be pin protected.


How would one connect the phone to iTunes to do this? You must enter the PIN on the phone to connect to iTunes iirc.


Not to do a backup and restore.


Not even sure what you mean by "iTunes account"?


Errr, sorry I think it's called iCloud now :P


Apple ID


Not that I know of. Like others have said you can pull a backup (but the machine backing up had to have been trusted prior or you're SOL) and then restore the phone.

Nothing's bulletproof but the iPhone is the most trustworthy IMHO.


I believe this is to deter theft.


I've locked myself out of my Galaxy S6 using the wrong password and it just wiped and reinstalled by itself, then allowed me to restore everything that was on cloud backup.


Nope. I've been there, did reasonable research and had to start over.


+1. If it is possible to push software updates to a "locked" phone then is this not tantamount to remote code execution with root privileges, and hence the BACKDOOR ALREADY EXISTS?

"Locked" seems like an improper term for such a scenario.

I applaud apple for appealing this case to the public however there is a HUGE HUGE difference between "we can't unlock" and "we shouldn't unlock". This distinction will likely be lost on the general public unfortunately.


Nowhere in this letter do they say that it's possible, and it seems very carefully worded to avoid stating that. They say that, even if it were possible, they wouldn't do it anyway. That's an important legal and moral distinction.

To be fair, they could have stated it explicitly.


It's stated very clearly that they can push an update to an already existing device that would make it possible to retrieve "encrypted" data from said device.

If the data was truly encrypted, the concept of pushing an update or creating a master key would not be possible.


They state that they can push an update that makes brute-forcing possible by disabling software-enforced delays between attempts.

Apple's security PDF says that the iteration count is calibrated so that one attempt takes 80ms in hardware, so that's the hard limit on the brute forcing speed, regardless of any updates Apple releases.

This means that a long alphanumeric passphrase is secure, but a 6-digit passcode could be broken in half a day, and a 4-digit passcode would take just a dozen minutes.
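For illustration, this is roughly how an iteration count gets calibrated to a time budget (a generic PBKDF2 sketch in Python; Apple's actual derivation is tied to the hardware AES engine, so running it on faster software doesn't help an attacker):

    import hashlib, os, time

    def calibrate_iterations(target_seconds=0.08, probe=20_000):
        """Pick an iteration count so one passcode guess costs ~target_seconds here."""
        salt = os.urandom(16)
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"0000", salt, probe)
        per_iteration = (time.perf_counter() - start) / probe
        return max(1, int(target_seconds / per_iteration))

    print(calibrate_iterations(), "iterations for ~80 ms per guess on this machine")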


It's so weird how hard it is for the brain to handle exponential growth. I was amazed that a 4-digit password can be cracked so quickly at 80ms a pop, but you're right. Just for the hell of it, here's how long it would take for different length passcodes for digits, digits plus letters (case insensitive), and digits plus letters (case sensitive):

    # characters  [0-9]         [0-9a-z]            [0-9a-zA-Z]
    1             0.8 seconds   2.9 seconds         5   seconds
    2             8   seconds   1.7 minutes         5.1 minutes
    3             1.3 minutes   1   hour            5.3 hours
    4             13  minutes   1.6 days            2   weeks
    5             2.2 hours     8   weeks           2.3 years
    6             22  hours     5.5 years           140 years
    7             1.3 weeks     200 years           9   thousand years
    8             13  weeks     7   thousand years  550 thousand years
    9             2.5 years     260 thousand years  34  million years
    10            25  years     9   million years   2   billion years
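For reference, the figures above fall out of a few lines of arithmetic at 80 ms per guess (times are for exhausting the whole keyspace; on average the hit comes at half that):

    ATTEMPT_SECONDS = 0.08
    ALPHABETS = {"[0-9]": 10, "[0-9a-z]": 36, "[0-9a-zA-Z]": 62}

    def exhaust_seconds(alphabet_size, length):
        return (alphabet_size ** length) * ATTEMPT_SECONDS

    for length in (4, 6, 10):
        for name, size in ALPHABETS.items():
            years = exhaust_seconds(size, length) / 31_557_600
            print(f"{length:2d} chars {name:12s}: {years:.3g} years")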


Does this consider the "too many incorrect attempts" lockout that iOS imposes though?


Nope, it's just 80ms multiplied by the number of possibilities.


According to Snowden, the NSA can brute force at the speed of over a trillion guesses a second; of course, they would need to be able to disable other security features first.


Ok, I read the entire thing once again. Nowhere in there do they state that they can comply with the request, only the consequences that would result if it were possible. In fact they say they "even put that data out of our own reach".

If it is stated very clearly, can you quote me a sentence?

In the security guide linked here it seems possible for this iPhone model but not later ones.

Edit: According to the discussion below Apple can ship updates to the secure enclave. I don't know if that's possible to a locked phone.


The word "push" appears nowhere in their letter. There is no way (in evidence) of "pushing" anything to the locked phone OTA. Physical access as a requirement? Sure, there's likely some way to get something onto the phone. But any DFU or JTAG-enabled update (likely the only vectors available on a passcode-protected device) would not be able to gain access to any appreciable fraction of the data on the phone, since doing so would invalidate the keys.

I wouldn't be surprised (It isn't stated in their iOS security doc) if the key generation uses a hash of the system files as part of a seed for the entropy source used for keys, though that's pure speculation on my part.

Edited for clarity regarding "push" vs physical access.


No, they state clearly that this is what the court ordered them to do. That doesn't mean it is possible. The court doesn't care whether something is possible or not.


That doesn't sound entirely correct. Assuming the phone holds an encryption key that can read/write local data, a software update could simply command it to decrypt all data and save it as a copy.


A device containing an encryption key that's just protected by a software password check would be absolutely useless. Part or all of the encryption key (maybe even the IV) is derived from the phone passphrase; this is why you can't just pop the NVRAM off a phone and try to find the key.
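A rough sketch of that entanglement (a generic PBKDF2/HMAC construction, purely illustrative; Apple's real derivation runs inside the hardware AES engine): the derived key needs both the passcode and a per-device secret, so copying the flash to a faster machine doesn't help.

    import hashlib, hmac, os

    DEVICE_UID = os.urandom(32)   # stands in for the fused, non-extractable per-device key

    def derive_file_key(passcode, iterations=100_000):
        stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"salt", iterations)
        # Keying the final step with DEVICE_UID means the result can only be
        # reproduced on this physical device.
        return hmac.new(DEVICE_UID, stretched, hashlib.sha256).digest()

    key = derive_file_key("1234")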


yes this is very interesting!


No. The word "remote" is not applicable to an attack that only works with physical possession of the device.

As far as I'm aware there is no known technique to prevent someone with physical access, a bunch of engineers, and the code signing keys from replacing firmware.


"locked" is a relative term. Anything encrypted can be broken with enough effort. But that is the semantic difference between leveraging a back door and brutally busting open the front door. I want a device where there is no back door. I hope you can appreciate that difference.


The iPhones with an A7 or later CPU should be secure against this. This whole thing is only an issue because the phone in question is an iPhone 5C, which uses an older CPU without the "secure enclave" system.


If it only applies to older iPhones, why did Cook write, "this software ... would have the potential to unlock any iPhone in someone’s physical possession"? (emphasis mine)


Because it's likely that it wouldn't end with "unlock this 5C" -- it would eventually extend to the government forcing Apple to either stop providing the additional security features in its newer models, or find ways to ship something that looks kinda like the security feature but isn't really.

Drawing the line in the sand at "the government can't force us to hack this guy's phone this time" thus ends up being "can't force us to provide features to hack anyone else's phone down the line".


I don't know. Either Cook is confused or I am. From everything else I've read, it's Cook. If I'm the one who's confused, then Apple really dropped the ball.


Would you expand upon this?


Sure thing.

Starting with the A7 CPUs, the iPhone CPU has a "secure enclave" which is basically a miniature SoC within the SoC. The secure enclave has its own CPU with its own secure boot chain and runs independently of the rest of the system. It runs a modified L4 microkernel and it does all of low-level key management.

The secure enclave contains a unique ID burned into the hardware. This ID can be loaded as a key into the hardware AES engine, but is otherwise designed to be completely inaccessible. Assuming AES is secure, that means the key can be used to encrypt data but can't be extracted, not even by the supposedly secure software running in the secure enclave. This key is then used to generate other keys, like the ones used to encrypt files. That means you can't extract the flash memory, connect it to a computer, and then try to brute force it from there. Or rather you can, but you'll be brute forcing a 256-bit AES key, not a 4-digit PIN, making it effectively impossible.

One of the secure enclave's tasks is taking a PIN (or fingerprint) and turning it into the encryption key needed to read the user's files. The main system just hands off the user's code to the secure enclave, and gets back either a key or a failure. The escalating delays with successive failures and wipe after too many failures are both done in the secure enclave. That means that updating the device's main OS won't affect it.

All of this is discussed in Apple's security guide here:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The one open question is software updates for the secure enclave. According to that guide, its software can be updated. Does that mean it can be updated with code that removes the restrictions and allows brute-forcing passcodes? The guide doesn't address how the updates work.

My guess, based on how meticulous Apple is about everything else, is that updates are designed to make this scenario impossible. The secure enclave must be unlocked to apply an update, or if updated without unlocking it wipes the master keys. This would be pretty simple to do, and it would fit in with the rest of their approach, so I think it's likely that this is how it works, or something with the same effect.
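If that guess is right, the policy would look something like this toy model (hypothetical, based only on the speculation above, not on anything in the guide):

    class ToySecureEnclave:
        """Toy policy: a firmware update applied while locked wipes the master key."""

        def __init__(self):
            self.master_key = b"\x42" * 32   # placeholder for the UID-derived master key
            self.unlocked = False

        def unlock(self, passcode_is_correct):
            self.unlocked = passcode_is_correct

        def apply_firmware_update(self, signed_image):
            if not self.unlocked:
                self.master_key = None       # update still goes through, but data is unrecoverable
            # (signature verification of signed_image omitted)

    sep = ToySecureEnclave()
    sep.apply_firmware_update(b"new-sep-firmware")
    assert sep.master_key is None            # updating a locked device destroyed the keys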


So after having asked you to expand upon the topic, I ended up reading a few articles on the matter. While they were all very thorough and informative, your summary above was by far the clearest and most succinct. Thanks very much!


Thanks for saying so, and I'm glad you found it helpful. Maybe I should expand this into a real article of my own.


Late reply, but yes, I think you could totally do so. You already have a lot of material off which to build.


You probably meant "this is tantamount". Otherwise, I don't understand what you're saying. Are you claiming the backdoor already exists, or that it does not?


Parent was asking a question, "is this not tantamount to x?" With the expected answer being "no, this is not not tantamount to x" which reduces to "yes, this is tantamount to x."

In this case, the intended answer/conclusion/implication is "yes, this is."


Freedom from forced updates is one of Stallman's motivations for free software.

Just one terrorist attack + PR letter to customers + forced update away from losing encryption on your phone.


On the other hand, if iOS were open source / the iPhone was able to run unsigned code there would be nothing stopping the FBI from just doing what they've asked apple to do themselves (assuming it's actually technically possible).


Open source doesn't preclude securing the boot chain. TianoCore implements UEFI Secure Boot, and it's BSD licensed. I think the bigger issue is the (unreasonably) low trust in things like secure elements and TPMs in open source, but that needs to change, and is rapidly off topic.

More on topic is whether Apple or even Google get out from under this if their on disk encryption mechanism is open source. If everyone owns e.g. LUKS (in a sense no one company owns/controls it) then can any one company be burdened by a court to effectively break everyone else's software by being told to create a backdoor?


In that case the FBI don't get a warrant and compel a company to comply, they put out an RFP and pay a contract company to comply.


It may be that only the 5C or older devices have the ability to push a custom OS update to a locked device.

The real problem is that you don’t want to set any precedent at all. Once it’s possible to do something for the 5C, weasel words can be introduced to make claims like “well: now you must maintain the current level of access by law enforcement”. Next thing you know, that excuse can be used to interfere with all future hardware designs.


And my understanding is that everything after the 5C is less vulnerable to even this attack (Which itself may not be possible, even with a firmware update.)


From what I heard later revisions won't accept a firmware update without either providing the passcode, or erasing the private key.

Arguably, the fact that the 5C accepts a firmware update without the passcode is a security vulnerability and ought to be patched.


The hardware required for enforcing this is not found on 5C or older devices.


This isn't the case with the newer devices though, where the delays and key destruction are enforced in hardware, and tampering would destroy the key.


It's not just PR. It's actually really hurting their company. Weaker security means less data can be stored on the iPhone which means less need for it, which means fewer sales.


Thank you for answering one of my questions: "Why does a multi-billion-dollar company give one lick about personal freedom?"

Companies exist to make money, not to protect our rights. It even crossed my mind that the NSA et al. rooted these devices long ago, and that this whole "debate" is just a staged thing to make it appear as though we had any privacy, with feigned adherence to the democratic process.

But your point that, as a matter of policy, many organizations will simply not use the product if they know it has backdoors, relates it back to their capitalist motivation and makes any conspiracy less likely.


Apple is also looking to foster growth in foreign markets. That's likely going to take a hit if the U.S. Government has special access to the phone.


Same here. I find this doubly reassuring.


I thought the same thing when I read it. If Apple can do this, then it's already a backdoor, and all the publicity of "Even Apple can't hack your phone" was false. But it turns out this is an older 5C, not the new 5S and up. So those old phones were effectively backdoored.

I agree that Tim Cook has spun it as if it's not a backdoor when clearly it is. Still quite a hard-to-use one though. It seems like they need the physical phone and maybe Apple's private key for signing updates.

The idea that the act of writing software makes it insecure is silly though. The security doesn't come from no-one having made the appropriate changes to iOS. If it was, that would be security through obscurity and any motivated hacker or the FBI could modify iOS themselves. It must be about signing the update so that it can actually be installed.


I found this article about this [1] to be rather enlightening.

1) http://blog.trailofbits.com/2016/02/17/apple-can-comply-with...


I'm afraid that I have to agree with you here. But Apple (and the FBI) recognize that this is the first salvo in a battle that will rage for MANY years. These early skirmishes could disproportionately affect the outcome - hence the ensuing PR battle.


This is not apparent from the article. Where could I find out about this ability?


If such an ability doesn't exist, then how does this potential threat follow?

(article) > But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.

I am also missing a key part of the technical details involved in this situation.


I'd also like to point out that Apple has a record of not responding cautiously to security exposure, like _the continued inclusion of the CNNIC root certificate despite the public exposure of a security breach_.

http://apple.slashdot.org/story/15/04/09/1531237/apple-leave...


Exactly. Plus, all backups of iPhones on iCloud are already unencrypted, so half of the phones are already indirectly unlocked.

EDIT: backups are encrypted, but Apple has the keys. See below.


iCloud backups are 100% encrypted. Only some iTunes backups are not, and only if the option to use encryption to protect the backup is not selected.


They are, but Apple has the keys: https://thehackernews.com/2016/01/apple-icloud-imessages.htm...

So basically, they could be in clear text; it's pretty much the same.


Please see this link[1]; Apple explains exactly how keys are stored in their datacenters (hint: it is not in clear text). They use HSMs which destroy the user's key after 10 failed attempts.

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf


This is only for Keychain escrow; device backups are not protected by HSMs.


>The fact that they can push weak OS updates to a locked phone is the backdoor.

This times nine hundred and eleven thousand.


Let's get behind Google, Facebook and Amazon to protect our privacy?


Let's get behind Google, Facebook, and Amazon in their desire not to hand over our data to the government.

Even with as much data as each of them has, it's still better that the data isn't given to yet another party (the government, or each other).


Is it "our" data? Or is it their data? You don't think they share this with their business partners?


What this says to me is that the current iPhone encryption is able to be broken by Apple. Note: "[the FBI is asking us to] make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation." I interpret that as there is already a backdoor, but one that is controlled by Apple. If that is the case, the gov't is asking for dual control of that pre-existing backdoor in addition to Apple. If I encrypt a message with a public key, and Phil Zimmerman can decrypt it (in addition to the holder of the private half of that public key), then there's a backdoor in PGP, regardless of who owns/knows about it. The only secure system is one where the data is well and truly unrecoverable if the private key is gone, missing, or unavailable/unwilling to be applied by the owner.


Does Apple do an amazing job protecting their users' privacy? Yes! But frankly in this case I find the FBI makes more sense than Apple.

Apple says:

> All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission

As I understand, Apple complains about the introduction of this new threat model:

  1. criminal steals someone's iPhone,
  2. gets hold of a special iOS version that Apple keeps internally to assist the government,
  3. pushes the OS update to the iPhone by themselves,
  4. uses some tool to automate brute-forcing the user passcode
At least that's what I get from these excerpts:

> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.

> The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

Now how serious is this threat? And how useful is it to have the FBI be able to look into a phone when they have a warrant? Does the balance between the right to privacy and the need to assist criminal investigation really tilt towards privacy in this specific case?

Meanwhile there are serious threats that do affect a lot of users in practice, where Apple does a good job but could do better still such as:

- Remote code execution on iOS (6 vulnerabilities in 2016 so far[1])

- Phishing, brute force and social engineering attacks (the improved two-factor authentication is not yet available to everyone[2])

Am I wrong in thinking that users are way more likely to be affected by these threats except when the government has a warrant?

If Apple is actually worried about the FBI getting access to the modified iOS version, they should focus their complaint on that and propose to do the whole data extraction in-house.

[1] http://www.cvedetails.com/vulnerability-list/vendor_id-49/pr...

[2] https://support.apple.com/en-us/HT204915


I think the question is less about whether or not a "correct / valid use" of this new technology is acceptable. The problem is that it's impossible (once it exists) to guarantee it won't be used in malicious ways.


Right, but isn't "impossible" too high a bar? Or is the ability to comply with a warrant worth nothing?

It's not like iOS is impossible to hack now and it would be terrible that it becomes possible. There are other aspects of the system that allow malicious exploits more easily than this theoretical threat. So it doesn't make sense to preserve at all costs (e.g. making warrants unenforceable) an "impossibilty" that never was.


> Right, but isn't "impossible" too high a bar?

No, not really.

> Or is the ability to comply with a warrant worth nothing?

What if I issued a warrant for you to give me a 3 headed dog? Is your inability to comply with a warrant worth nothing?


Important to remember: Anyone who has your phone also has an object covered in your fingerprints. Don't rely on Touch ID for actual security.


Important to remember: All iPhones since the 3GS have an oleophobic surface coating (which, combined with cloth pockets, means the screen stays relatively clean). Also important to remember: fingerprints lifted from a random surface are VERY hard to copy onto latex milk or PCB substrate.

Please continue to think.


> Huge props to Apple

They may be doing it for people's right to privacy, but don't forget they might also be doing this because their image would be irrevocably tainted if they complied. Trust in Apple devices would be shattered (among those who currently trust Apple).


What puzzles me is why the government didn't impose a gag order like they sometimes do in cases like this.


Because a gag order would have to be absolutely legally watertight, which is likely impossible in this case.


> Huge props to Apple - here's hoping against hope that Google, Facebook, and Amazon get behind this.

Why against hope?



I know that it is an idiom; I'm asking why he has no hope.


From the link that was posted in reply to you, 'hope against hope' means 'to have hope even when the situation appears to be hopeless'. Which means he does have hope, even though the situation appears to be hopeless.


Why is the situation hopeless? Google, for example, has been undertaking efforts for years to improve security on the Internet, in part for the purpose of protecting privacy. Seven days ago they were featured in an article about how they're going to warn users of Gmail whose incoming emails are not protected by SSL: https://news.ycombinator.com/item?id=11067050 - this is part of their Safer Email initiative. Some time ago they released a Transparency Report as part of that initiative describing which senders to/from Gmail support encryption: https://www.google.com/transparencyreport/saferemail/ - and that's not to mention their efforts promoting HTTPS.

I see these efforts as Google going far out of its way to support privacy and security on the web.


I think Oletros understands that perfectly, but is wondering why OP feels that the situation appears to be hopeless.


Yeees, I'm asking why he thinks there is little hope that they will stand


Sorry, misinterpreted you, in over-literal mind mode right now!


No problem


When a US senator expressed upset with Wikileaks, Amazon shut them down. Not even a court order, just a tantrum from a politician. If you really think Amazon aren't cheerfully inviting the US government into their datacentres, you haven't been paying attention.


[deleted]


You should read your link, because this is not what it says


You're right, the headline is sensationalizing his actual words. I'm retracting that comment.


Google worked with the NSA and Facebook not only worked with the NSA, but now has ties to China and the Chinese government (which have the most advanced spying and censoring Internet technology around).

So, I'm doubtful.


it's probably pretty safe to assume google did the opposite


Isn't it odd that this press release spends so much time discussing encryption? Neither the FBI nor Apple will be touching any encryption functions of the device.

It seems the primary aim of this statement is to prevent rumors that would spook Apple users into thinking their data is fair game.


When I forgot the password for my iPhone, I kept trying until it said "connect to iTunes". When I connected to iTunes nothing happened, so I clicked restore, then realised I had not backed up my iPhone, so I clicked backup, hoping that it would back up, but it said that I needed to sign in. This then got me past the restriction of not letting me type the password in or holding me back. So I have created a program that repeatedly does that. (There is no need for fancy backdoors.)


What do you want those other companies to get behind? They don't manufacture phones like Apple does, right...?

Note that this letter says: "When the FBI has requested data that’s in our possession, we have provided it."

Seems like that detail is getting very little attention in this announcement. Really? If the FBI requests any data, they hand it over...?


Would you complain about Apple handing over information to solve the murder of a loved one? Why is it always the "bad government" argument? It's not that it doesn't happen, but usually those requests are aimed at more "mundane" cases.

For example, I have friends who work in law (though not in the US), and the number 1 data request - which is reviewed by a judge, and only then provided by companies - is call logs from telephones (just from/to, date, time, duration, nothing fancy). And these are extremely helpful and information-rich, if you know how to use them.


I'm all for supporting law enforcement efforts, but the wording seemed to imply that Apple hands over any requested info to the FBI, which seems excessive.

But I was mainly pointing out that quote because it wasn't clear what lesson the parent commenter wanted Google, Facebook, and Amazon to be learning from Apple. I would have guessed it'd have something to do with protection of user data, but the letter says they turn over any user data they have!


They probably mean that they have handed over all the information they could in this specific case. There surely was some kind of authorization (from a judge perhaps -- US law is almost unknown to me) too.

As for the comment of Google, Facebook, etc, learning, I agree with you.


Can the information bring my loved one back to life? Or will it just result in more terrible things happening to someone else's loved one?


Utilitarian philosophy argument. The benefit gained by solving one murdered loved-one case is badly offset by the loss accrued to everyone by no longer having the security protections they expected.

This specific case, in fact, is pretty close to "murder of a loved one;" the phone's owner killed people, and the FBI wants to find out if they were part of a bigger plot.


Google makes Android; Amazon makes FireOS. No idea what the deal with Facebook is.

> Really? If the FYI requests any data, they hand it over...?

They have to, there's no legal wiggle room here. Creating backdoors OTOH seems to be sufficiently legally questionable that Apple can risk noncompliance.

(Of course, Apple could not accumulate all that data in the first place, but that would be silly, obviously. Now please sync your wifi passwords to iCloud.)


... you can't just ignore a subpoena if you wish to do business in the US. Providing a backdoor is different.


With Google's Android, this issue will never arise because Android is open source. Any attempt to plant a backdoor would be spotted by the community.


Not too sure about this. Keep in mind that in most commercially sold Android phones, the closed source, self-updating Google Play Services is installed by the manufacturer with system-level privileges. That alone is enough to create a not-insignificant backdoor.


Or, said differently, Play Services are already a backdoor. They can (and do) install updates or other software pushed by server automatically, without you being able to do anything about it. And they have access to anything on the phone.


And it can reportedly grant itself new permissions without the user's knowledge, bypassing a fundamental security mechanism of Android (any Android devs know how this is done?). http://arstechnica.com/gadgets/2013/09/balky-carriers-and-sl...


Google Play Store manages the permission dialog for apps installed via Play Store (Play Services is one). It just doesn't show it for Play Services and just auto-accepts installation.


F-DROID


But you can always root your phone and install Cyanogenmod from scratch, right? Agreed that this is not too trivial right now because of standardization issues, but it could be done if you really care about privacy.


Not all of CyanogenMod is free software (you still have a bunch of binary blobs, and everyone has to use Google Play Services anyway because every app seems to implicitly require it). Replicant would be a much better alternative if it actually supported anything newer than 2G.


> and everyone has to use Google Play Services anyway

This isn't true. You can stick to app repositories like F-Droid and use Raccoon to download Play Store apps via your desktop without using a Google account on your phone.


It is true that you can get Play Store apps without a Google Account, but the Play Services framework does a lot more than this. Many apps rely upon the framework for certain pieces of functionality from Google's libraries.


You mean Google Apps specifically?

I don't use them personally, but I imagine Google Now, Gmail and Google Maps would need Play Services.

The apps I do use (non-google) tend to function well enough without Play Services though.


Many Android apps include the Google Play Services client libraries themselves, as these provide extra functionality that is not in the baseline Android API, e.g. a JSON parser.


F-Droid is wonderful. They even strip ads from otherwise-FOSS projects, since the licenses are usually not compatible.



Nobody builds their Android from source. Nobody uses Cyanogenmod. And nobody runs Android on a phone where the entire stack is open source and blob free.

Anyone who does is a rounding error.


> And nobody runs Android on a phone where the entire stack is open source and blob free.

> Anyone who does is a rounding error.

I'm actually curious if there is literally anyone who uses no proprietary software, including the radios and the SoC, on their Android device.

My bet is that there's not even a single device out there for which this is possible. (If there is, I'd love to see it.)


Not quite, but you can come close. I have Cyanogen installed on all my Android devices and I try to use as little proprietary software as possible. However I am patiently waiting for the Neo900, which is a free (libre) hardware design based on the Nokia N900: https://neo900.org/

According to them, there are unfortunately no baseband modems on the market that can legally have their firmware distributed as free software. Their workaround is to keep the modem as isolated from the CPU/RAM as possible.


> According to them, there are unfortunately no baseband modems on the market that can legally have their firmware distributed as free software.

What is the legal restriction here? (It sounds like you're referring to some restriction beyond simple copyright protection on some of their components - are there FCC regulations regarding the firmware?)

EDIT: Ah, of course, the FCC needs to certify devices before they can actually be used.


From their FAQ: https://neo900.org/faq#peripherals

>We unfortunately cannot provide free baseband modem firmware, as there is no option available on the market which would be able to fulfil this requirement. Even if it existed, it would bring very little value to the users, as operating a radio device with modified firmware on public networks without recertification is prohibited in most jurisdictions of the world and privacy concerns in cellular networks are mostly related to what happens on the network side, not inside the device.

I don't have any more information than this. If someone can quote specific FCC regulations to back this up, I would find that very interesting :)


> Even if it existed, it would bring very little value to the users, as operating a radio device with modified firmware on public networks without recertification is prohibited in most jurisdictions of the world and privacy concerns in cellular networks are mostly related to what happens on the network side, not inside the device.

Not entirely true: publishing the code isn't the same as allowing its modification. Code signing can be used to limit which versions are allowed to run.

Reproducible builds of the source would allow one to ensure that the binary, certified version of the code their baseband processor is running is legit (i.e. not backdoored). It would also help audit the code and spot security holes.
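
A minimal sketch of that check (in Java, with invented file names - this is just an illustration of the reproducible-builds idea, not any real project's tooling): rebuild the firmware from the published source, hash both your build and the vendor-shipped binary, and compare.

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.security.MessageDigest;
    import java.util.HexFormat;  // Java 17+

    public class ReproCheck {
        public static void main(String[] args) throws Exception {
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            // Hash the binary you built yourself from the published source...
            String local = HexFormat.of().formatHex(
                    sha.digest(Files.readAllBytes(Paths.get("firmware-local-build.bin"))));
            // ...and the binary the vendor actually ships.
            String shipped = HexFormat.of().formatHex(
                    sha.digest(Files.readAllBytes(Paths.get("firmware-shipped.bin"))));
            System.out.println(local.equals(shipped)
                    ? "bit-for-bit identical: shipped binary matches the rebuilt source"
                    : "mismatch: build is not reproducible (or the binary was altered)");
        }
    }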


> Not entirely true, publishing the code isn't the same as allowing its modification. Code signing can be used to limit which versions are allowed to run.

If I can't run my home-compiled versions of your code - whether because of code signing restrictions or because of federal law prohibiting firmware that hasn't been certified - it's not free[0]. So without reproducible builds, providing the source code for the firmware provides very little benefit (since I have no way to prove that the code corresponds to what's actually running on the device, nor any legal way to install and run it on the device myself).

Reproducible builds could in theory work, but actually getting builds to be bit-for-bit reproducible is not an easy feat. I'd be very surprised if firmware were capable of this.

[0] This is a great example of why a free software license doesn't necessarily mean that the software is free. It means that the author has waived his/her ability to restrict your freedom to use/modify/distribute the software, but that doesn't mean that third parties (ie, the government, or a patent troll) have done the same.


Not free, but open. I'd argue the latter is significantly more important than the former if you're trying to protect against the code working against you. At least if the code is open, you can inspect it and verify its operation.


Although Replicant currently can't offer free software for the modem and bootloader, everything else is free: http://www.replicant.us/supported-devices.php


That's the current state of affairs. I know of no baseband manufacturer who has ever offered (nor seemed open to the idea of releasing) source for their chips.

Basebands aside, it's somewhat feasible to imagine the rest of the device being open.


> including the radios

We know for a fact that they don't, because that's illegal in the US.


It's not possible because pretty much all baseband processors (radios) run nonfree software.


What's the closest one can reasonably get?


Not necessarily.

Despite the openness of Android/AOSP, there are still, unfortunately, things like binary blobs for certain graphics chips and closed-source firmware for things like Wi-Fi chipsets. Given what we've seen agencies like NSA are capable of (intercepting hardware in transit to apply backdoors, paying off RSA to make Dual EC the default pRNG in their crypto libraries, etc.), them compelling a manufacturer of a component to include a backdoor in their closed-source blobs is no stretch of the imagination.

Apple even has this problem: basebands in cellular modems are notorious for being the source of exploits in otherwise-secure phones.


I'm surprised that nobody on this thread has commented on the real substance of this response. It has nothing to do with Apple brute-forcing iPhones for the police (which it has done for years, with a simple court order) - instead, it is Apple making it abundantly clear that if they comply (or are forced to comply) with the All Writs Act of 1789 to create this particular back door, then that opens the floodgates moving forward for all sorts of requests to add backdoors/decrease security.

It's entirely possible that the FBI can then use this precedent to simply have Apple remove all security from an iPhone in pursuit of an active investigation, which can be done with a straightforward firmware update - which iOS users tend to install without much thought.


> Apple making it abundantly clear that if they comply (or are forced to comply) with the All Writs Act of 1789 to create this particular back door, then that opens the floodgates moving forward for all sorts of requests to add backdoors/decrease security.

I read it differently. Apple is saying that if they make this particular backdoor, then this very backdoor can also be used in other scenarios, to crack other phones (i.e. the backdoor would apply to all iPhone 5Cs, not just to this one).


I interpret it as the OP does. The court document asks Apple to lock the particular image to a particular serial number, so if all goes according to plan, the same image could not unlock other iPhones. Obviously, writing security code on a short deadline does not make for the best security, so that's one worry.

But Apple's letter uses the expression "technique", which I think means they're worried the government will get another court to make them change the serial number and sign a new image "next time". Before you know it, Apple will have to have an entire department to make these one-off images. Someone will say, "you know, you could save yourself a lot of time if you just made it work on any phone." Then that image will be leaked, and their security guarantees will be dead. (One might also worry about the DRM implications.)


You and OP are both wrong:

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Apple's argument isn't about a deluge of one-off court orders creating a slippery slope to reducing security. Apple is claiming that complying with just this one request would make Apple's other iPhone users significantly less secure. There would be a piece of software, signed by Apple, that could potentially be used to unlock any iPhone you have in your physical possession.


Here's the exact text of the court order:

"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE."

How am I wrong?


You said:

"But Apple's letter uses the expression "technique", which I think means they're worried the government will get another court to make them change the serial number and sign a new image "next time""

Apple's letter directly claims that the particular piece of software created to comply with this request would reduce the security of its users. Obviously this means that Apple does not think that the SIF being hardcoded with the unique identifier of the phone (sufficiently) mitigates the risk.

"make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

Having re-read the OP more carefully, I think ghshephard is making a different claim than you. He is pointing out Apple's argument about the 'unprecedented use of the All Writs Act of 1789'. If Apple can be forced to compromise their security via a court order like this, the FBI gains the power to force Apple and any other US company to insert backdoors / decrease security.

"If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge."


> would only load and execute on the SUBJECT DEVICE

You're wrong because any image that can be installed on the SUBJECT DEVICE can be modified to be installed on OTHER DEVICES.


The iPhone 5C is an older phone that Apple no longer even manufactures, and, for the longest time, Apple had capabilities that allowed it to brute-force iPhones under direction of a court order. Apple clearly has some concern about its customers' security, but doesn't care as much about this particular iPhone model's security as it does about the general principle that, without explicit legislation such as CALEA (47 USC 1001-1010)[1], technology companies such as Apple would come under the whim and direction of all sorts of requests - there is no explicit restraint in the All Writs Act of 1789 - they could be told by police agencies to do effectively anything necessary to the pursuit of an investigation. And you know, with 100% certainty, that once the "good cases" are used as an excuse for this, the crappy follow-on scenarios will also make use of the All Writs Act of 1789.

It's massive, massive overreach - and if Apple doesn't draw the line here, it will quickly spin out of control.

[1] https://en.wikipedia.org/wiki/Communications_Assistance_for_...


I don't understand — if the iPhone 5C is so simple to brute force, why aren't Apple simply doing this for the FBI in this particular case? Why request an entire iOS modification when Apple could do what it has done for previous court orders and just brute force the phone.


My understanding is that the iPhone 5C is not _simple_ to brute force as sold from the store. The FBI is asking Apple to weaken the software on this particular one enough for them to be able to brute-force it, by creating custom software that would allow them to try millions of passcode combinations rapidly.

In the past, law enforcement could use their own tools, and Apple didn't have any legal way to say "it is beyond our ability to break it, so we can't help you" anyway. After their name showed up on that slide in the PRISM leak without their cooperation (meaning they had been stepped around by the FBI; some of the earlier companies had willingly volunteered data), they stepped up their game and deployed end-to-end encryption and the Secure Enclave so they would have the ability to say "we can't help" when forced to.

This technique wouldn't be possible on the iPhone 6 due to the encryption keys being in the hardware Secure Enclave, but they are putting their foot down now so that a legal precedent isn't established forcing them to weaken other models too. That's my understanding of it right now.


> After their name showed up on that slide in the Prism leak without their cooperation (meaning they had been stepped around by the FBI. Some of the earlier companies had willingly volunteered data)

I know it's a bit off topic but I'm really curious about this - do you have a source that shows which companies willingly volunteered data and which were stepped around? I wasn't aware that anything like that had come out.


> I wasn't aware that anything like that had come out.

You're not aware of it because nothing like that ever came out. The guy just made it up (or it's just his wishful thinking). We still don't know which companies cooperated with the NSA.

We do know that Google didn't intend for its traffic between data centers to be scooped up - and that has been fixed in the meantime - but that doesn't prove that Google didn't cooperate in other matters.

Same with Apple. For all we know, they are all gagged due to NSLs.


Because it sets an awful future legal precedent. If Apple did it once and didn't challenge it, you can be sure that all future government cases involving decryption will cite this case as the new standard of law.


You're both accurate here:

> The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Once there's a backdoor, the legal precedent and technical capability will exist to use it on any device. The precedent would also exist to request support in backdooring other parts of the OS.

It's FBI Director Comey's explicit goal[0] to destroy the notion of strongly secured encryption for civilians. From an address to Congress in July 2015:

> Thank you for the opportunity to testify today about the growing challenges to public safety and national security that have eroded our ability to obtain electronic information and evidence pursuant to a court order or warrant. We in law enforcement often refer to this problem as “Going Dark.”

[...]

> We would like to emphasize that the Going Dark problem is, at base, one of technological choices and capability. We are not asking to expand the government’s surveillance authority, but rather we are asking to ensure that we can continue to obtain electronic information and evidence pursuant to the legal authority that Congress has provided to us to keep America safe.

In other words, encryption makes it harder for the FBI to collect people's information. They therefore want to make sure encryption as implemented can't block the FBI.

Further on:

> The debate so far has been a challenging and highly charged discussion, but one that we believe is essential to have. This includes a productive and meaningful dialogue on how encryption as currently implemented poses real barriers to law enforcement’s ability to seek information in specific cases of possible national security threat.

[...]

> We should also continue to invest in developing tools, techniques, and capabilities designed to mitigate the increasing technical challenges associated with the Going Dark problem. In limited circumstances, this investment may help mitigate the risks posed in high priority national security or criminal cases, although it will most likely be unable to provide a timely or scalable solution in terms of addressing the full spectrum of public safety needs.

When encryption is implemented in a way that legitimately secures a person's data from unauthorized access, the FBI can't just get in and take the data. Comey would like Congress to support policy and tools that can get around that, because terrorism.

The Apple situation feels very foot-in-door to me.

0: https://www.fbi.gov/news/testimony/going-dark-encryption-tec...


And it'd seem to open the floodgates for certificate authorities to be compromised as well. What's to stop the FBI from compelling a CA to create a special MITM certificate for a criminal investigation of a Yahoo user?


What makes you think the FBI, or a certain assisting agency, doesn't already have CAs in their pocket? They only need one. The 'rogue CA' threat that encouraged the development of HPKP covers this scenario. Hell, DNS TLSA records (which are part of the now-dead DANE concept) let you pin to any combination of PKI CA, public key, or certificate. One day we'll regret not deploying this stuff.


HPKP at least is growing and getting attention. I would have preferred to do the pinning at the TLS level instead of the HTTP level. Sadly, TACK (see tack.io) was too late; HPKP was already too far along and supported by Google.
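
For reference, an HPKP pin is just an HTTP response header carrying SHA-256 hashes of acceptable public keys (RFC 7469). Something like this, where the base64 values are placeholders and the whole thing is sent as a single header line (wrapped here for readability):

    Public-Key-Pins:
        pin-sha256="d6qzRu9zOECb90Uez27xWltNsj0e1Md7GkYYkVoZWmM=";
        pin-sha256="E9CZ9INDbd+2eRQozYqqbQ2yXLVKB9+xcprMF+44U1g=";
        max-age=5184000; includeSubDomains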

DNSSEC has some value, and DANE does as well, but sadly both are stuck in a strange limbo. Pinning can be deployed now and adds a huge amount of security. Even if we had DANE, we would still want to have pinning.

There are interesting ideas about how you could scan the internet and its pins and publish this information in a secure way. Then you bake these trusted site pins into your browser. It's a similar idea to Certificate Transparency. A browser could then load itself with all the needed pins or verify them on demand. One could also get preloaded pins from a trusted party, or use network vision to check with many different parties on first use. Lots of options once everybody has TOFU.

This combination of network vision and TOFU would be quite nice, and CAs could be replaced, at least for non-EV certificates.
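
A toy sketch of the TOFU half of that (illustrative only; a real client would persist the pins and handle expiry and key rotation): remember the pin you saw the first time you connected to a host, and refuse or warn if a later connection presents a different one.

    import java.util.HashMap;
    import java.util.Map;

    public class TofuPinStore {
        // host -> SPKI pin (base64-encoded SHA-256), remembered on first use
        private final Map<String, String> pins = new HashMap<>();

        // Returns true if the presented pin is acceptable for this host.
        public boolean check(String host, String presentedPin) {
            String known = pins.get(host);
            if (known == null) {
                pins.put(host, presentedPin); // trust on first use
                return true;
            }
            return known.equals(presentedPin); // later connections must match
        }
    }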


I don't doubt that they do, but having the rogue CA threat established as a legal precedent that can be used effortlessly by the police scares me a lot more.


But how do you get the firmware updated on a locked phone? My understanding is all updates (historically) have required the phone to be unlocked and connected to the internet?


If in pursuit of an active investigation (i.e. the device is still in active use), a police agency could invoke the All Writs Act of 1789 and have Apple instructed to introduce security vulnerabilities with the next regular upgrade of iOS, such that after the phone is upgraded, it can be seized by the FBI, or whatever police force is involved, and the data recovered.

A large percentage of users (and presumably the target in question) will willingly (at least today) upgrade their iOS to whatever Apple pushes out - we don't (for the most part) even question whether the purpose of that security patch is to reduce security.

The only thing that secures an iPhone is iOS following the rules of security, such as the Secure Enclave - it can just as easily (in a new release of iOS) be instructed to ignore them.

What Apple/Tim Cook are doing here is standing up for the importance of not being required, hodge-podge, to be at the whims and mercies of police agencies that demand they do whatever is asked of them.


But this request was made specifically for the phone in the San Bernardino case, in which the owner is dead and the phone is locked.

This implies it is possible for Apple themselves to apply an iOS update to a locked phone in order to disable the erase-on-repeated-failure feature.


It implies that the FBI wants Apple to do that.

Tim Cook doesn't offer an opinion about how possible that might be. As a matter of principle, he doesn't believe that Apple should be forced to make the attempt.

That doesn't stop anyone else from doing so, however. And I suppose that the FBI could seek discovery on all requisite information, take depositions, etc, etc. However, I vaguely recall that discovery can't compel production of new work product. But maybe that's just a limitation in civil litigation.


That's a good point. I guess I figured if they simply can't do it why not say "it's impossible by design," rather than argue the principle?

It seems like their stand would be better saved for when a compromise is requested that is actually possible for them implement.


*I guess I figured if they simply can't do it why not say "it's impossible by design," rather than argue the principle?*

Because the larger principle is what's really at play here. Whether Apple can do what the FBI asks, or not, is irrelevant. What is relevant is that what the FBI asks is bad. Even if Apple can unlock this phone because of shortcomings of the 5C (versus later model), it sets a bad precedent for later, especially if Apple truly cannot unlock the 6s and beyond ("you could do it on the 5, why not an iPhone 7?").

I'm no lawyer, but I'm of the opinion that this is a legal crowbar for later cases, and the Feds are using a tragic incident to drum up support ("you don't support terrorists, do you?")


Maybe it's impossible by design in current iPhones. But this is an older one, not so secure.

As Tim Cook says, it would be a bad precedent. And obviously bad PR for Apple to admit vulnerability.


It is, of course, possible; any iPhone can be forced to enter the restore mode or the DFU mode (don't remember which one) and be erased/reflashed via iTunes.

It won't get unlocked by that, because as soon as the new iOS takes over, it will detect the activation lock and require the Apple ID password to be provided. Also, going through iTunes erases all the data.

But presumably a custom DFU update could replace only the OS, without replacing the data. The iPhone doesn't try to protect against valid updates, and will happily run anything signed by Apple; the locked/unlocked state is only about the encrypted user data on the device.


Thanks. I was thinking in FDE terms. So maybe what the FBI wants is doable.


This is my question, too. Can iOS updates be applied to a locked phone? Seems like it should be a simple yes/no answer (and I thought the answer was 'no') but I can't find any clarity here or elsewhere.


If you physically have the phone, de-soldering and moving the flash memory to a custom board for modification is an option.


Well, looks like the ask is for a custom DFU tool:

https://www.theiphonewiki.com/wiki/DFU_Mode


The owner is alive, the phone is owned by the government and always has been.


The FBI asked Apple to install a firmware update in the RAM of the iPhone, by booting the iPhone in the DFU mode.


Absolutely this.

Additionally, we simply don't know on what other (FISA) occasions Apple has been forced to provide feature X to agency Y, where Y is not the FBI...


Ok, so I completely fail to see how a random crazy guy with a gun who shoots up a bunch of unarmed people has "national security implications". This seems to be a "fact" that everyone wants to agree on, but is frankly a load of BS if one considers the government probably already has his entire call/texting history for the last couple years.

I see this as just another "it's for the children" ploy, which I'm completely sick of.

In that light, I fully support Apple etc. for finally gaining a backbone. If more people stood up, then I wouldn't have to be naked-body-scanned at the airport, or endure the dozens of other privacy invasions the government performs on a daily basis simply to give itself something to do. So, rather than admit they won't ever be able to predict or protect the population in any meaningful way from random people willing to give their lives to make a statement, they waste our time and money coming up with ever more invasive ways to peek into everyone's most private possessions.


Good point. You'd think this would be something that local / state law enforcement should be handling, but the reason it's kicked up to the federal level is that the attack is politically motivated, which places it in the "terrorism" category.

The irony is that most people are afraid to stand up (to body scanners, mass surveillance, etc.) because of threats of violence from their own government, which is in itself a form of terrorism.


If it were one of the many shootings perpetrated by non-Muslims every day in the US, they wouldn't have cared. But because these Americans were Muslims, they need to scan their phones and make sure they weren't getting orders from some terrorist network.


Does it really matter? Most terrorist networks would be claiming responsibility for the attack if they were involved.


Even though the matters are slightly different, I couldn't help but think that Cook is giving off a Boards of Canada vibe in this post (in a good way).

"Now that the show is over, and we have jointly exercised our constitutional rights, we would like to leave you with one very important thought: Some time in the future, you may have the opportunity to serve as a juror in a censorship case or a so-called obscenity case. It would be wise to remember that the same people who would stop you from listening to Boards of Canada may be back next year to complain about a book, or even a TV program. If you can be told what you can see or read, then it follows that you can be told what to say or think. Defend your constitutionally protected rights - no one else will do it for you. Thank you."

https://youtu.be/1-FI6D8ZXpc


If the UK record on anti-terror scope creep is anything to go by, not creating this backdoor is a very good idea.

In the UK, laws originally intended for surveilling terrorists were/are routinely used by local councils (similar to districts I think) to monitor whether citizens are putting the correct rubbish/recycling into the correct bin. [1]

This is a pandora's box, and the correct answer is not to debate whether we should open it just this once, it's to encase it in lead and throw it into the nearest volcano. Good on Apple for "wasting" shareholders money and standing up for this.

[1] http://www.telegraph.co.uk/news/uknews/3333366/Half-of-counc... - and lest the source be questioned, this is one of the more reactionary newspapers in the UK.


And that was eight years ago - the state of affairs has since worsened - and the sheer irony is that those same Tory critics are now spearheading the push for even broader surveillance powers. It's like crack for the political class (except Rob Ford) - once they start, they want more and more.


RIPA was never designed purely to fight terror - it was always designed to agglomerate all surveillance powers under one umbrella, including stuff like "putting up a covert camera to catch dog foulers" or HMRC tapping your phone for a serious tax-avoidance investigation. All that has been written directly into the legislation since day 1.


Tim Cook: a really nice guy with blue-whale-sized cojones.

There can be no compromise, because China, Syria and Turkey would also lean on Apple to break into the phones of dissidents, and pretty soon it would be future whistleblowers here in the US too, in order to prevent leaks (iPhone 7 and iCar notwithstanding).

That's the tradeoff in not giving in to faint, vague "maybes" that there was "external coordination," when in all likelihood it was the ultraconservative, Saudi half leading this duo into the kookooland of violent extremism.

The security services will just have to buy exploits, develop malware, cultivate human intelligence sources and monitor everything the old-fashioned way... It's not like that kid in a YouTube video finding a jailbreak exploit for an iPhone and not releasing a tool is going to sit on it; he's going to auction it off to the shop or country with the most $$$.


> Tim Cook: a really nice guy with blue-whale-sized cojones.

... backed by a company that at one point literally had more cash than the US government. A company with a strong, expansive, and experienced legal team. He's not a small fish; he's a major captain of industry and has a lot of political clout. I mean, good on him for his standing on this issue, but he wields a lot of power here.


I'm generally not an Apple supporter (I don't like the closed ecosystem), but I am very pleasantly surprised they posted this.

I am quite disappointed that the US courts are trying to force Apple to do this; in my opinion, it's just to use this case to set a precedent.

I hope Apple can't get it to work, but I'd hate to see what the courts would do if that happened.


There are basically two groups of large software companies around right now: those which make their business by collecting data, and those which make their business by licensing software[1]. The first group has an overwhelming incentive to not support privacy too strongly. The second group has an overwhelming incentive to not allow too much openness. Until a better business model (or zero-knowledge machine learning) is found, no large for profit company can support both goals to their final conclusion[2]. So we are left choosing one evil or the other[3].

[1] Sure, Apple only really sells hardware directly, but the software is a significant part of the reason a lot of people buy Apple hardware (e.g. "Macs don't get viruses", "iPhones have a better user experience").

[2] Sure, Google has some significant internal efforts for supporting better user privacy (e.g. https://googleonlinesecurity.blogspot.com/2014/12/an-update-... ) and Apple maintains some superb open-source software (e.g. http://llvm.org/ ). But in the end, Google can't be a "privacy company" without hurting their business model and Apple can't be an "open source company" for the same reason.

[3] Or the non-trivial inconvenience of being a self-hosting free software purist


Actually, there is a third business model which doesn't exploit its users' privacy, nor does it have to resort to closed-source licensing (instead selling add-on services like support and customization). This is the model adopted by companies like Red Hat, Canonical and, to some extent, Google (in a few products).

The future, in fact, belongs to this third business model that helps a business earn profits without hurting its users in any manner. Ultimately, all the IT companies will have to embrace this model in order to stay in the market and survive. Competition will ensure that they will.


The reason I specified large software companies is that support and customization work well up to a certain scale, but won't give you a "big five"-size company. A software Mittelstand which profits from support and customization might indeed be a possible future, but I think it is far from a certainty. Also, I wouldn't consider that to be without disadvantages.

Two example disadvantages:

1) As it is, support and customization for specific non-technical paying users are among the things many top engineers least like to do, the reason being that it takes away from time spent solving the problems of the large mass of non-paying users. Under the support & customization model, the number of paying users is in general much smaller than under the proprietary model, which creates a smaller "high priority" class of users.

2) Certain features and applications, such as traffic-aware maps or voice recognition engines, are easy to build for huge centralized organizations which hold all the necessary data. They are challenging to implement for loose collectives of smaller software companies, especially in a privacy-aware way.


I can't upvote that excellent summary of the situation of software companies enough.

One way to solve that would be to have governments support and subsidize open source software development, but I don't see that happening in the next 5 years at the very least.


So, it is far from a simple problem. For common infrastructure, one can argue governments could fund open source in the same way they fund highways and bridges and physical fiber optic links. But I am not so sure that model would be the best to innovate in end user applications, and I say so as someone working on publicly funded research software.

There is something to be said about the market's ability to make decentralized decisions and focus on satisfying people's wants[1], so a centralized software economy is also not a good solution. The problem with markets here is that strong privacy and open source are, for the most part, positive externalities. As a user, the benefit you get from having strong privacy yourself is not usually noticeably high, nor is that of having access to the source, especially for a non-technical user, yet society arguably benefits from both. Usually the answer to a problem of unaccounted externalities is government regulation, but in this case, large-enough-to-matter governments have been unanimously on the side of less privacy, rather than more (as is the case in the original article).

[1] Ideally, software should be designed so that it preserves privacy as much as possible while achieving its function, and is open source, and it provides all the million features and reasons people use something like Facebook, Snapchat, Youtube, etc. Just having privacy preserving software written for and by technophiles is not and can never be a complete solution.


For individual projects, it has already happened. BSD development, for example, was sponsored by DARPA in the 1980s. The German government supported the GnuPG project for a while. I am certain there are more examples - but you are right in that it has not been done systematically.

(Which does not invalidate your point one way or the other.)


Not quite the same, I can appreciate, but inroads are being made with the UK Digital efforts being mostly open source.

http://blog.quickpeople.co.uk/2013/05/17/the-uk-government-p...


Keep in mind, it's not just the courts ordering Apple to do this on a whim. The owner of the iPhone in question has literally consented to the search. It's akin to the owner of a safe asking the safe company to open it up because he doesn't have the combination, and the safe company can do it, but won't because of the fear of a slippery slope.


I'm really impressed that Apple is standing up to the government and protecting its users' rights. I've never really considered the iPhone worth the premium price tag, but policies like this have changed my mind.

Could someone answer a question I have though? The government wants Apple to create this backdoor and tailor it to the specific device, so presumably it will have a line that goes

    if (!deviceID.equals("san_b_device_id")) 
        return;
To make the backdoor general purpose, this line would need to be removed. But doing so would invalidate the signature and it can't be resigned afterwards because the attacker won't have Apple's signing key. So is the open letter a matter of principle that they won't build any backdoor, now or in the future, rather than a specific concern about this backdoor?
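
To illustrate why I think simply editing that line doesn't help an attacker - the signature is computed over the entire image, so changing any byte (including a hard-coded device ID) makes verification fail unless you can re-sign with the private key. A rough Java sketch with invented file names (obviously not Apple's actual verification code):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.security.KeyFactory;
    import java.security.PublicKey;
    import java.security.Signature;
    import java.security.spec.X509EncodedKeySpec;

    public class ImageVerifier {
        public static void main(String[] args) throws Exception {
            byte[] image = Files.readAllBytes(Paths.get("firmware.img"));
            byte[] sig = Files.readAllBytes(Paths.get("firmware.sig"));
            byte[] pubDer = Files.readAllBytes(Paths.get("vendor_pub.der"));

            // Public key baked into the boot chain; only the vendor holds the private half.
            PublicKey pub = KeyFactory.getInstance("RSA")
                    .generatePublic(new X509EncodedKeySpec(pubDer));

            Signature verifier = Signature.getInstance("SHA256withRSA");
            verifier.initVerify(pub);
            verifier.update(image); // covers the whole image, device-ID check included
            System.out.println(verifier.verify(sig) ? "signature OK" : "REJECTED");
        }
    }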


Technically you're right; legally, think of the huge precedent. The FBI is using the San Bernardino case as a legal crowbar, and it's awful.


If they beat the order, they could always do what the GP said: crack the password and hand over a decrypted copy of the device. Everyone goes home happy: no precedent set, the FBI gets their data, and going into the future, as old iOS devices die out, Apple won't even be able to pull that stunt again if they want to.


But if they beat the order, given that they've made a big deal about going against the order, why would they go ahead and compromise the device's security? What would be the point? Just to tell the government, "Hey, don't worry about all that stuff we said, we didn't mean it?"

If they do it once, they'll do it again.


1) They're not saying they don't want to help investigate the SB shooters, only that the order illegally expands the use of the All Writs Act and sets a bad precedent for democracy.

2) If they beat the order then the FBI needs to find a new way to compel Apple to help them do shit. That likely means the FBI needs federal legislation passed, which in the current climate will buy Apple considerable time. This is why they want to beat the order (though the feds could further appeal all the way to the... wait for it... 8-judge Supreme Court! Though I think Scalia would have been on our side on this one). It's not about this particular case, since there will be others; it's not about this particular order, since there will eventually be legislation.

3) So why go ahead and do it anyway? Naive (but still valid) reason: they happen to be able to help, and without it being ordered, they can do it without handing over a tool for ad hoc decryption. They can even go to Hawaii afterwards and throw the dev machines into a volcano to satisfy their inner hobbit (I highly recommend this part of the plan) and ensure that no one can abuse the power of the One Ring.

The non-naive answer is that for a quick project they get to show the public that no, we the nerds are not so obsessed with abstract systems-level thinking that we won't help when we can. It gets a lot harder for that moron Comey to hit the morning shows and throw shitty innuendo at the tech industry implying that we're aiding the terrorists.

Both encryption and terrorism are complicated subjects that are scary to the average American and although they distrust the government, they also distrust Silicon Valley. The cryptowars aren't about being right or they'd have stayed dead in the 90s where they belong. Basically Apple makes tech look like the good guy fighting terrorism, and for anyone who cares (smaller audience than the fighting terrorism bit) they also defended your civil liberties.

4) This trick only works on older devices, which will die out soon anyway; newer devices are safe. If they beat the order and do this one case voluntarily, then no precedent is set, so they can't be bullied into doing it again, and old devices stay safe.

One device compromised, all other devices safe, order beaten, PR win, and Comey looks like a prick even to the uninformed next time he insinuates that we're the enemy.


But this all hinges on the naive assumptions that this is a one-off and will never happen again, and that later-generation devices are immune from any circumvention attempt. History says this isn't how it plays out.

If you crack the encryption once, you'll get orders to crack it again and again, and in much lower-profile and lower-stakes cases. Look at the prevalence of espionage tactics such as Stingrays and "parallel construction" by law enforcement. There may not always be someone you can pump up into a crack international terrorist, but there's always some low-level drug courier, or a "quality of life" criminal, to use your new toys on.

Also, you're saying that this doesn't set a precedent, but it does. Sure, there's not a court case to point to, but it's a precedent nonetheless: the company not only has the means, but the will to do it. What's stopping the government from coming back a second time, or a third time about this? What argument do you have, in either a legal court or the court of public opinion, when you stand up and say, "That first time was an exigent situation, and so was the second, and the third... But this time, the fourteenth time, THIS TIME we really mean no more!"

Finally, I don't think this trick only works on older devices. The FBI wants to be able to brute-force the passcode through a USB connection instead of making some sort of robot to tap the screen a bunch of times. Also, presumably the FBI wants the too-many-incorrect-attempts lockout feature disabled as well; otherwise they're just going to be waiting for hours on end. Why wouldn't this rather low-sophistication approach work? From a technical standpoint this is no more complicated than a mouse jiggler[0]. If you're arguing that the iPhone 6 and later have some sort of "Mission: Impossible" self-destruct mechanism, I'm sure it could be disabled given enough resources and motivation.

Finally (for real this time!), making a big stink and then capitulating is never a PR win. You just look like a tool to everyone involved. To the anti-encryption side you're weak and can be rolled, and to the pro-encryption side you're a sellout.

[0] https://www.elie.net/blog/security/what-tools-do-the-fbi-use...


Both Paris and SB were great examples of the terrorists basically not bothering with encryption; if they had actually used and benefited from encryption, we'd be discussing how to get cryptography re-legalized.

I doubt there's even anything on the phone the FBI doesn't have from other sources. The reason they're using the All Writs Act with this case is the publicity of the case: they can point to Apple prioritizing some vague principle most people don't care about over real dead people. Apple's doing a good job making their argument to those who care about said vague principle, but not to the general public.

I think you're underestimating how badly we're going to be taken to the woodshed on this the first time it's opportune. Maybe it'll be some cute little girl who dies in a Nancy Grace-friendly way, and the FBI manages to convince the public that "if only we could have broken these messages," etc. It might even be true in that one freak instance, but then we'll have "[cute_little_girl.name]'s Law," which will make sure that such a tragedy never gets exploi... reported again, by making sure that the government can read messages when it needs to. The US market is too large not to capitulate at that point. That bill passes if the voting public can be stirred up against the greedy and aloof tech sector. It doesn't pass if they see the tech sector and its goals as reasonable, and while we should continue trying to educate people on why encryption is good and important for them, you don't change the number of minds we need to change with rational arguments (or again, we'd have won already).

Edit: very interesting link on the mouse jiggler though, thanks for that!


We're already in the woodshed. We've been there for 15 years.


Yeah, but it'll get worse: https://www.washingtonpost.com/news/the-switch/wp/2016/02/17...

Choice quote: "Apple will become the phone of choice for the pedophile"


wrt to "Of you're arguing that iPhone 6=< have some sort of "Mission: Impossible" self destruct mechanism, I'm sure it could be disabled given enough resources and motivation."

The iPhone 5S and newer have a coprocessor (or co-computer) with a hardware-enforced rate limiter as one of the features of the "Secure Enclave" (which, word on the street is, cannot be overridden by software).


ORIGINAL: Physical access.

EDIT: It's not just physical access. Physical access changes the game entirely, but what the FBI is wanting highlights the physical-access problem even more. They want a custom software solution today, but there's nothing to say they won't want a custom hardware solution tomorrow. Sure, the enclave has some sort of lockout now, but who is to say you can't simply reflash the firmware or perhaps just solder in some jumpers? Make no mistake: the FBI wants manufacturers to modify devices on demand.


Once they build that in for one device, they have opened Pandora's box.

Then it becomes a precedent in the courts that Apple has this ability, so they will issue court orders to make them comply in every single case where a phone is encrypted.


One way around that is for Apple to make it extremely costly for courts to issue many such orders, because after all, Apple is free to charge whatever it likes for performing this service.


Only a reasonable amount though. They could charge $100k or even $1m the first time, but they'd have a hard time justifying similar fees if only an if condition needs to be changed. Worse, they would lose the moral high ground if they even tried it. They'd go from being a good corporation standing up for the Constitutional rights of its users to a scumbag organisation trying to bleed taxpayer money for helping criminal investigations.


Then the court says "lol, make it the default then". It doesn't matter how much they charge when courts can override the decision. What Apple needs to do is eliminate any way for this to happen.


The court can't make an order that a third party is not allowed to recover (valid) expenses in complying with a court order (if it gets to the point that this becomes so ordered).


They could charge a million for this one time, but after that it's just the edit of that if statement. It's not justified to charge a million each time.


I think the US govt would be kinda OK with that idea if Apple didn't park its profits overseas to avoid paying tax...


What are you talking about? Why should Apple have to contribute anything to the safety or wellbeing of the peasant American society? /sarcasm


> To make the backdoor general purpose, this line would need to be removed.

Or the ID on the device could be changed to match "san_b_device_id".


Publicizing the case themselves is a very good move.

However, the iPhone of the attacker is an iPhone 5C, which does not have Touch ID or a Secure Enclave. This means that the time between passcode unlock attempts is not enforced by the cryptographic coprocessor. More generally, there's no software integrity protection, and the encryption key is relatively weak (since it is only based on the user's passcode).

The amount of work needed to turn security into good user experience is phenomenal: https://www.apple.com/business/docs/iOS_Security_Guide.pdf


And to think, just two weeks ago that same enforcement which prevents compromising the Secure Enclave was the target of vitriol right here on HN (Error 53).


You can still enable full disk wipe after 10 failed password attempts[1]. That was available in iOS 7 I believe (but someone on here will correct me if I'm wrong).

[1] - http://i2.wp.com/ioshacker.com/wp-content/uploads/2014/09/Pa...


The FBI's point is that Apple can update the software on the device to not honor that setting, which means it is not effective against a government attacker.


It's been available for quite a while, long before iOS 7. It was at least available in iOS 5, probably available before that.


This may be one of the most important things Apple has done. Whether or not you agree with their position, it's incredibly important that tech companies start publicly explaining things like the fundamental problems with backdoors so that a lay person can understand it. Apple have the credibility to make non-technical people take their argument seriously, and the reach to get the message out to a vast number of people. I'm really pleased they're taking this position.


This is interesting:

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Am I reading this right? Apple, if they chose to, could make a version of iOS that disables security features and encryption and load it onto an existing phone even though the phone is locked and encrypted?


> Apple, if they chose to, can make a version of iOS that disables security features and encryption and load it onto existing phone even though the phone is locked and encrypted?

As I understand it, the FBI wants Apple to create a version of iOS that would disable the current feature where the data is deleted after more than 10 failed passcode attempts. This would allow the FBI to brute-force the passcode.
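
Roughly, the policy at issue looks something like the sketch below - a conceptual illustration only, not Apple's implementation; on real devices the counter, escalating delays, and wipe are enforced far below app-level code:

    public class PasscodePolicy {
        private int failedAttempts = 0;

        boolean tryPasscode(String guess, String actual) {
            if (guess.equals(actual)) {
                failedAttempts = 0;
                return true;
            }
            failedAttempts++;
            if (failedAttempts >= 10) {
                // the auto-erase behavior the requested build would skip
                wipeUserData();
            } else if (failedAttempts >= 5) {
                // escalating delays, also to be skipped
                sleepSeconds(60L << (failedAttempts - 5));
            }
            return false;
        }

        private void wipeUserData() { /* discard the key protecting user data */ }

        private void sleepSeconds(long s) {
            try { Thread.sleep(s * 1000); } catch (InterruptedException ignored) {}
        }
    }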


That doesn't explain how they would get the update onto a locked and encrypted device, even if it existed.


They want to use DFU mode to load a version into RAM and run it from RAM rather than flash.


It seems like it would be easy enough to crack it open and replace the OS boot data.

That being said, I really WANT the data in this case. I hope Apple finds a compromise where they can help get this specific data without risking leaking a compromised OS.


> I hope Apple finds a compromise where they can help get this specific data without risking leaking a compromised OS.

I want pharmaceuticals without side effects, and real doughnuts that don't make you fat.

Seriously, you are asking for "A" and "not-A" in one sentence. Take your pick. Are you willing to get this one phone unlocked so badly that you would be OK with nobody having security? Because that's what you're asking for, whether you realize it or not.


Perhaps you're not technically deep enough to see other solutions.

Perhaps you are trying to disagree with a point by conflating it.


See other comment: https://news.ycombinator.com/item?id=11116439

Potentially, Apple cannot circumvent their own protections on some models (in software anyway), and could in others.


Thanks. This explains it. I was just thinking about my old ThinkPad X41. That had a TPM module and hardware encryption. There's nothing that IBM or the TPM manufacturer could've done to decrypt it (unless the TPM module already had backdoors, haha). Latest iPhones are basically the same?


I don't get it. How would they load the backdoor if the phone is still locked?


iTunes recovery mode? It allows you to restore the OS, but also gives you other means to manipulate the device. Imagine if they simply hot-patched the lock screen to allow any number of password tries.


Recovery mode with DFU and the like results in a device wipe.

There are many ways to redo the firmware, but every single one of them, by design, requires wiping the phone to implement.


The outcome of entering DFU mode for most of us is a device wipe but this doesn't have to be the case for Apple.

[0] https://news.ycombinator.com/item?id=11115945


You can reinstall a phone from recovery mode without erasing it.

https://support.apple.com/en-us/HT201263


What I don't understand is why Apple could create such software but a hacker could not exploit it. I feel like that means there is already a backdoor.


Because you can only deploy it when you have Apple's private key.


And that's one of the main reasons for Apple to say no to that.

There is no way that a backdoor like this could not be exploited by someone else if they found out the way to do it.

The other is that there is no way the government can guarantee to Apple that it would only be used in the "right" cases once they have access to it.


As you say, 'Apple could create'. It doesn't exist; what could the hacker even target?


I think the important word missing in that phrase is "any _future_ iPhone".

For existing devices I imagine that a hypothetical iOSX which removes the protections would require you to enter your passcode so the OS can decrypt the data and then re-encrypt them using the new backdoored option.


The last part actually surprises me too.


I think the ask is for a custom recovery mode tool / image.

See https://www.theiphonewiki.com/wiki/DFU_Mode

Not same thing as an OS update.


Looking at the wording, it seems they never state whether or not it is possible. It says "the FBI wants us to make", not what we can or cannot make. "Potential" doesn't mean certainty.


Very impressive letter. They've expressed their position in language that a layman can understand, there's abundant evidence that they respect the intent of the law authorities, and even clearer evidence that they are drawing a line in the sand based on their principles. They will protect their customers.

I wish more companies could speak so clearly and courageously.


> They will protect their customers

Where is this stated, so that I can claim damages if they break said promise?

I'm sorry, but how can it not be seen that it really is a bad sign that Apple has made this public? They may already have built the backdoor and this is a publicity stunt; or, no matter what you do, owning a smartphone is not that smart.


If you'd like to claim damages, you would have to prove damages.


Love makes you blind.


This is the sort of thing that a professional organization - like what medical doctors have - could help with. Let me explain.

The court order gives Apple an out: "To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief".

Now, imagine if this was court ordering a company to engage in unethical medical procedures, rather than unethical software development. The professional medical community would sanction doctors that cooperated and support those that stood by their ethical principles and refused to cooperate. If there was a similar professional organization for software development, Apple could reasonably rebut that telling their engineers to work on this would be unreasonably expensive (since they'd expect to fire people or have them resign over it).

This is another avenue for fighting the order - have a good chunk of Apple's engineering department sign an open letter saying that they'd resign before working on that project. The incentives seem like they'd work for making it a thing.


Professional ethics in software engineering is definitely something we're going to have to grapple with more and more. Another aspect is being asked to use dark patterns in a UI or build a Skinner box into a game or app. There's evidence that these things do harm to people, and having a professional organization that could help stand up to such things could be part of a solution.


So the FBI is asking Apple to build a tool that will unlock security measures of an existing iPhone, like the one in the San Bernardino shooting, and allow it to be read.

The problem with this is that no such tool should be possible to build. It should not be a matter of yes or no; it should be simply impossible for Apple to build such a tool without the private key of the user, which Apple does not have.

If it is possible to write a piece of software which can circumvent the protections of the iPhone without the user's private key, then Apple wrote its security software incorrectly. Either they wrote it with an appalling lack of security understanding; or they left in important backdoors, either knowingly or through ignorance. But if they wrote the software correctly and did not create backdoors of which they're aware, then the government's request is actually impossible -- cannot be done.

So which is it, Apple? Is the point moot because you did this right? Or have you already placed backdoors in the product which the FBI is now asking you to exploit for their benefit?


The point is there currently is no backdoor. FBI wants Apple to create (and sign) an OS update with a backdoor and install it onto the suspect's phone. Specifically the backdoor is to remove the rate limiting and 10 attempts limitation on trying the passcode.

If you have a very strong passphrase (not a 6-digit code), then it should be unbreakable even with brute force. Of course, most users have the 6-digit code.

If you read the actual court order a lot of your questions are answered. Here: https://www.techdirt.com/articles/20160216/17393733617/no-ju...

Also, the phone is an iPhone 5c. This doesn't have Touch ID and doesn't have the secure enclave. The same approach wouldn't even work on a 6 or 6s. http://blog.trailofbits.com/2016/02/17/apple-can-comply-with...


> The point is there currently is no backdoor. FBI wants Apple to create (and sign) an OS update with a backdoor and install it onto the suspect's phone.

If this is possible without the owner's permission, then the update mechanism is the existing backdoor. It just happens to also be the front door.


I think the missing information here is how the phone is encrypted. If it's done with the 4-digit numeric PIN, then the software could be built; it would take at most 10,000 tries, and at less than 0.1 seconds per try it would be able to crack the code in about 15 minutes. The current iPhone has a protection against this: after some number of tries, it will lock you out for increasing time intervals.
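
As a rough sanity check on that arithmetic (a minimal sketch in Python; the per-guess cost is the same assumed figure as above, not a measurement), at exactly 0.1 seconds per guess the worst case comes out closer to 17 minutes, but the order of magnitude is the same:

    # Back-of-the-envelope timing for brute-forcing a 4-digit PIN once the
    # retry delays and the 10-attempt wipe are out of the way.
    SECONDS_PER_GUESS = 0.1   # assumed upper bound, per the comment above
    KEYSPACE = 10 ** 4        # 0000-9999

    worst_case = KEYSPACE * SECONDS_PER_GUESS
    print(f"worst case: {worst_case / 60:.1f} minutes")      # ~16.7 minutes
    print(f"average:    {worst_case / 2 / 60:.1f} minutes")  # ~8.3 minutes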

This is the only way that their claims might possibly be valid.

And a reminder, then: change your iPhone's password to a more complex one. If Apple doesn't make this fake OS, someone will.

Edit: to expand on this, Apple's PR goal was to take advantage of the NSA mass surveillance scare. On-device encryption is not very relevant to that. iCloud security is much more important, and they've been quietly handing data from it over to the Feds, including iPhone backups, which contain most of the data they're looking for.


Check out this comment where Adam Stew explains how the phones are secured: http://slashdot.org/comments.pl?sid=8756397&cid=51524693


That's really cool, thanks for the link


I mean this sincerely: has the government used one of its 10 tries on the attacker's birth year? I hope the government has burned a couple tries on low-hanging guesses before going through this legal hassle.


You don't want to burn any tries in case the Apple developers need a few?


Is there a way to know how many attempts remain before trying a key?


Also note that new iPhones default to 6 digit pins. Still possible, but harder.


I don't understand your comment.

The iPhone in question is protected with an unknown passcode. Auto erase is enabled, so brute-forcing the passcode will erase the data.

However, a new OS version without auto erase and that accepts passcode input from USB would allow the FBI to try all combinations.

How is Apple at fault because most any passcode scheme can be cracked via brute-forcing all combinations?


> However, a new OS version without auto erase and that accepts passcode input from USB would allow the FBI to try all combinations.

It shouldn't be possible to just add a new OS onto the phone without the restrictions in place, without knowing the passcode first.


I see what you're saying but this has never been done before. BIOS passwords could be bypassed by draining the battery. Encryption is practically the only way to protect your data, because the storage can be taken out of the phone and hooked up to something else if need be.

Making a new OS is just the easiest way for Apple to do this; there are other ways.


While I agree, this is not what OP's comment was about. He makes it sound like Apple is forced to write a decryption tool that exploits existing backdoors into the encrypted content.

Highly ironically, the current "Error 53" hullabaloo is exactly about what happens once security is tightened to the extreme.


iPhone 5s and newer have a Secure Enclave, which limits brute-forceability and cannot be changed via a software update (there is a belief, though undocumented, that any firmware patches would also wipe the stored keys). Apple could not help the FBI get into these phones.

The phone in question, however, is an iPhone 5c, which does not have a Secure Enclave.


It's not possible now. The FBI is asking Apple to change iOS so it will be possible in the future.


The FBI is asking that it be built now and then loaded onto the already recovered phone.

> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.


Why the downvote? That is what I said. I merely used a slew of different words.

> The FBI is asking that it be built now

Because it's not possible _now_.

> and then loaded onto the already recovered phone.

Thus it becomes possible only after they have built the new version of iOS, and since they cannot go back in time and build it, it would indeed be _in_the_future_ that it became available - if Apple complied, that is.


I didn't downvote but Cook's letter suggests that the backdoor needs to apply retrospectively to existing phones (not just future phones).

Hence the parent post's suggestion that the argument is moot -- if Apple has the capability to retrospectively backdoor existing phones it would imply that Apple didn't secure it in a foolproof way in the first place.


I see a lot of people saying they're impressed, admired, etc. at Apple for doing this.

It's not about giving props: Apple is not doing this out of goodwill, or because they believe in protecting privacy. Apple has a competitive advantage against Google/Facebook in that its business model does not depend on violating their customer's privacy.

They are just exploiting that competitive advantage.

Cf. https://ar.al/notes/apple-vs-google-on-privacy-a-tale-of-abs...


What you're saying sounds like a win-win situation for Apple and its customers. So whatever the reason is for them to speak out, it's a good thing that they do speak out. Isn't it?


A man went to his local parish priest and asked "Father, is it permissible to smoke while praying?" "No, my son, when praying you should show the utmost respect and attention to God" he answered. The next day, another man asked the same priest "Father, is it permissible to pray when I smoke?" "Of course, I encourage you to make your whole life one long prayer!" was the priest's answer.

So, is Apple defending rights while advertising, or advertising while defending rights?


It will become obvious, I think, in the following days. If they challenge the court order and fight vigorously to have it lifted, I am willing to believe that it is about the privacy. If they give in and say, "but we really didn't want to, they made us do it", I will consider their resistance more of a PR stunt.

That story, by the way, is really nice. ;-)


I'd place a hefty wager on Apple fighting this until there is no legal recourse.


I sure hope so! And given how much Apple has to lose if they just cave, it is reasonable for them to do so.


Exploiting is a bit loaded. Where I’m from, we call this “a free round”. There’s no downside to doing this, and a sizeable one to complying with the order. Still, it’s nice to see that Apple don’t hesitate, which makes me feel good about any other future challenge the feds will throw at Apple.


If you want to describe Apple's actions in terms of market competition, then you should also describe the consumer (our) reaction in similar terms.

We as consumers should support the companies that have the best market strategy, in terms of product usability as well as business ethics, by buying said company's products and recommending and commending their actions.

So props to Apple for leading an ethical business strategy.


> Apple is not doing this out of goodwill, or because they believe in protecting privacy.

Personally I think this is ridiculous.

If Tim Cook had been born in any of a range of other countries around the world, he would be persecuted and quite likely killed (and not in a pretty way). As such, I am sure he is very aware that for many people privacy is a life and death matter.

Also there would be many people at Apple who would've been dealing with China's aggressive attacks against their users and their own infrastructure and not be too pleased. And to a lesser extent their own government.

I can't imagine the extra few billion that comes from goodwill being a major factor in their efforts to go to all of this trouble.


I don't see why it couldn't be both - seems to me like they both believe in protecting privacy and (possibly even therefore?) follow a business model which deeply respects privacy.

Seems to me both of those things are worthy of respect in today's society.


Their business model just happens to align well with what is best for their customers. It's interesting that it's almost an anomaly. The business models of Google and Facebook, in comparison, do not really align with what is best for their customers.

Apple are a company so they'd be stupid to not take advantage of this and as a consumer I am happy that the premium I pay for Apple products is benefiting me.


We've seen the bad that comes of corporations chasing a bottom line; perhaps this is the good.

And honestly, I don't mind if my digital rights are defended as an advertisement.


> They are just exploiting that competitive advantage.

... and that's what I paid them for, thanks for not taking money both from me and from FBI :)


Why wouldn't they believe in protecting their customers privacy? In Apple's case there is no conflict of interest to make you doubt that statement. If it came from Google or Facebook I would be more skeptical.


So, their motivation is even better and more effective than idealism.


"we believe the contents of your iPhone are none of our business."


You are right. What you say about Facebook is true, but Google's Android is open source, so there is no way they can plant privacy-invading code and get away with it.


No phone on earth runs the open source version of Android that you can download from git. They all run custom versions that include not only closed source personalizations to the system, but they also run lots of closed code as root (play services first and foremost).

The reason why this doesn't happen with Android is much more mundane: most Android phones are not encrypted so the FBI doesn't need help to read all the customer data. They just need to open the phone and dump the flash.


> open source, so there is no way they can plant a privacy-invading code and get away with that

Compare and contrast with today's "Typo in PHP's Mersenne Twister" story - https://twitter.com/i0n1c/status/699860681487708160

Seems to have been broken since 2007 (the 'broken' line appears in the 2007-01-01 commit "Bump year" but not in the 2006-10-06 commit "Mark rand.c functions with U.")


Just because it's open source doesn't mean it's safe. You have no control over what happens to that code before it gets installed on a phone. Samsung, or whoever, can and does modify the code -- and those modifications aren't generally open source.

Ruby on Rails is open source, but that doesn't mean that all applications on Rails are open source.


> Samsung, whomever can and do modify the code -- those modifications aren't generally open source.

That will certainly change quite soon. In the earlier days, FOSS was not a concept that the masses were aware of, but now it is different. There is increasing competition in the smartphone world, and if one of the other manufacturers (say ASUS) makes their Android modifications open-source, they will see a drastic increase in Sales. To keep up with the competition, Samsung, etc. will also have to do the same. In other words, competition will ensure the success of open source.


>if one of the other manufacturers (say ASUS) makes their Android modifications open-source, they will see a drastic increase in Sales

Can you expand on this? I don't think enough people care about source access. RMS's exact hardware choices don't go on to sell millions.

edit: I didn't see your username before now.


There are parts of Android phones that are closed and proprietary now. Even CyanogenMod.


None that are necessary to run it, still (not including drivers, but that's hardware dependent)


> "Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

So there is already a backdoor. Apple are refusing to let the FBI use the backdoor.

The backdoor is the fact that Apple can push a firmware update that disables security features, without the device's password being entered at any point.


I like the position Apple is taking. However, after reading the letter, I noticed it misses a point I consider even more important than just "a dangerous precedent".

Apple is selling devices on the whole planet, not just in the USA. So what the FBI (an American agency) is requesting is dangerous not only for American citizens, but also for iPhone owners in Europe, Asia, Africa, and Oceania. Hell, these people are not even part of the debate, because they don't belong to "American democracy".

If I'm going to be affected by someone else's policies, I would like to be at least allowed in the discussion.


I wouldn't worry too much about non-American apple users. Those of us outside the US know how tangled law and tech can be - and so people guarding information tend to use non-American cloud services for example.

If this goes through I just expect more people internationally to choose something else. Not a big loss to them, but the loss of business to Apple (a US company) might be felt.


As others have noted, this is probably mostly about branding. But that's why it is genius. Tim Cook is committing Apple to this pro-privacy position in a very public way. This means that a reversal of this position or a revelation that Apple has been acting contra it, would be extremely expensive to Apple's reputation with its customers, effectively costing the company a huge amount of money.

By publicly committing Apple to this cause, Cook makes it more likely that internal teams at Apple as well as future versions of the company will adhere to this position. By defining a set of actions which, if made public, would ruin the company's brand, Cook makes it less likely Apple will take those actions.


You're exactly right about the effects on the company as a whole. A CEO is like a cheerleader and ship's captain for a company.

Nilay Patel over at The Verge said on one of their podcasts he once asked Satya Nadella what it was like to be the CEO of a company as large as Microsoft. Nadella told him being CEO meant telling a big-picture vision to the press and the company over and over again until everyone started going in that direction.


I'm clearly in the minority here, but I don't really understand Apple's position here, nor do I understand why everyone is rallying behind them.

Apple built hardware which was not particularly secure. The software defaults to a four-digit PIN. They attempt to mitigate this by adding an escalating interval between entries, and by optionally wiping the phone after too many failed tries, but this is not set in stone and those limits can be removed with a software update.

The government is coming to Apple and saying, "You can remove these limits. Do that for us on this phone." Coming as a legitimate court order, I see no problem with this request. The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.

If Apple didn't want to be put in a position where the government can ask them to compromise their users' privacy, they should have built hardware which even they couldn't crack. And of course they did; starting with the A7 CPUs, the "secure enclave" system prevents even Apple from bypassing these limits. The phone in question just happens to predate that change.

If the government was demanding that Apple do the impossible, I'd be on their side. If the government was demanding that Apple stop manufacturing secure phones, I'd be on their side. But here, all they're asking is for a bit of help to crack an insecure system. They're doing this in the open, with a court order. What's the problem?


> The government isn't even asking them to crack the phone, they just want Apple to remove the limits so the government can try to brute force it. They're even paying Apple for their trouble.

Well, this exact thing isn't THAT big of a deal, but it's a slippery slope. If Apple agreed to this, then what else can the government ask them to do under the banner of "public safety"? And if Apple were to give the government an electronic way to brute force the passcodes, it would break the trust of every iPhone owner.


I don't see the slippery slope here. The government is asking Apple to do something that is both possible and reasonable. I see no slope to that from other typical court orders.

Giving the government a way to brute force PINs wouldn't break the trust of every iPhone owner, merely the owners of iPhones with pre-A7 CPUs. And great, if they trusted Apple on this their trust was misplaced. You can't trust companies not to unlock stuff when the government requests it with a legitimate court order. If you want Apple not to decrypt your data, the only way to ensure that is to make it so they can't.

Again, Apple has (so far as we know) made it so they can't, on newer hardware. But this phone that the FBI is trying to get into is older hardware and built such that Apple can get into it. If you're looking to point fingers, blame Apple for building not terribly secure hardware. But don't point fingers too hard, because they're doing it a lot better now.


The slope is that if they can order Apple to engineer one thing, they can order them to engineer another.

It is possible for Apple to weaken the secure enclave on all future iPhones. It would be reasonable to do so from the point of view of giving law enforcement a useful tool. Therefore, since Apple can be ordered to do engineering to make law enforcement easier, why should they not be ordered to do this?

That is the slippery slope.


> The slope is that if they can order Apple to engineer one thing, they can order them to engineer another.

How does that at all follow? Right now, a cop can lawfully order me to identify myself. Does that mean they can also lawfully order me to go to the nearest coffee shop dressed as Bozo the Clown and shout, "I am in love with the ghost of Princess Diana"?

I don't understand how complying with an order to use an existing security hole to break into someone's device somehow sets a precedent that the FBI can in the future go to Apple and set the parameters for how their products are designed.


Because the hole doesn't actually exist unless Apple engineers custom code for the FBI. If the FBI can force Apple to engineer code to create security holes for them, that establishes a precedent.

Explained better by someone else here: https://news.ycombinator.com/item?id=11120036


I would argue that the hole is the fact that Apple can even load new software that allows this attack. It already exists.

But I'm not sure that distinction is important. The other comment you linked to lays it out pretty nicely, and it doesn't rely on whether a hole already exists or is being created. It's ultimately just about compelling creation.

I wonder, what if the FBI just requested the relevant signing keys and source code? That seems like a much worse outcome, but at the same time less of a reach.


Why is that less of a reach?


Because it's just asking them to turn over information the authorities need for their investigation, which is a pretty normal sort of request. None of this troublesome asking them to build new software.


Fair point.


"Per US versus Apple in the San Bernardino matter, we request a warrant be issued for encryption bypass in this (separate) matter".


They address this at the end of the letter. They say it's "an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority." They go on to talk about what that precedent would mean. It's at the very bottom.


I don't buy it. The FBI is not trying to dictate how Apple builds their devices. They want Apple to take measures to unlock one device. How do they get from that to "[the government] would have the power to reach into anyone’s device to capture their data"?

Apple seems to be saying that if the FBI can ask Apple to install special software on one person's phone, then they can ask Apple to install special software on everyone's phone. But that's not how it works. The whole idea of requiring a court order is to only do this stuff on a limited scale when there's justification for it. It's like saying that the police shouldn't be able to have a warrant to search a suspect's house, because that means they could search everyone's house.


I disagree. This is all about setting precedents. Once one of the dominoes falls, it's much easier for the others to start falling as well.

Apple is saying the FBI is using this law to expand their power to mandate a backdoor in all devices. If this is successful, then the FBI can mandate that all secure hardware/software companies backdoor their products.

Do we roll over and let the FBI do this because "oh, this is just one case, it's fine" or do we set a boundary? If a child does something wrong, you scold them immediately. You don't wait for them to do it the 10th time. By then it's too late.


Apple says the FBI is using this law to mandate a backdoor. And that's exactly what I don't buy. The FBI's demand here is merely to use an existing security hole, which Apple created.

I simply don't see the leap from "this device is insecure, please unlock it for us" to "all devices you make must be insecure."


Your response feels very naive or short sighted. Or both.

If this goes through, you better believe that there will be court orders left and right, which can't be authentically argued against since Apple has already done it before.


Court orders for what, exactly?

If there are legitimate court orders for cracking the security on the phones of criminal suspects, I don't have a problem with that.

The problem would be if:

1. A court orders Apple to crack the security on a phone they cannot actually crack (presumably any A7+ phone), and imposes some punishment for failing to do the impossible.

2. A court orders Apple to modify the design of their phones to make sure they are always crackable.

Those would be huge problems. But I don't see how you get from here to there.


Is it possible to update a phone without the user accepting the change (with the phone locked)?

Do you want such a tool (the one that removes the security around updating) to exist, so that anyone with physical access can replace the OS with something else?


Apparently it is possible to update older phones like this. On newer ones, the OS in general is not an issue, since the security bits are handled by the secure enclave. I'm not sure what the software update policy is (it's only briefly mentioned in Apple's iOS security paper) but I'd wager that it's impossible to push an update to the secure enclave software without either unlocking it or erasing the crypto keys.

I don't understand what you're getting at with the "tool" question. Nobody's talking about building something that lets anyone with physical access replace the OS. The phone's secure boot system will still require updates to be signed by Apple. Apple can replace the OS with physical access. On older hardware, it seems they can do this without wiping the data. Do I want Apple to be able to do this? It doesn't matter what I want, the fact is that they can. Since they can, and since the FBI has a court order, I don't see what's wrong with requiring them to do so. If you don't want them doing this to your phone, buy a newer one with the more secure hardware.


There's a simple way to defeat Apple's argument. The judge could simply ask Apple to flash the new firmware on that phone, let the FBI run the brute force under their supervision and obtain the contents they need, and then flash back a non-compromised version of the OS.

The government would never have access to a phone with a compromised version of an OS that they could use to repeat this trick. Rather, the government would have to obtain court orders and have forensics done under supervision.

This isn't a backdoor and doesn't affect consumers, and sets a really high bar to trying to scale this for the government because it requires Apple as the gatekeeper every time to agree to do the one-off hack.

The cynic in me thinks that this letter is more about brand image. Apple wants to claim they can't hack their own phones, even if the government asks, but clearly in the case of the iPhone 5C it IS possible for them to do it, and this creates a contradiction with their public marketing and privacy position. If they didn't release this open letter, then simply complying with the judges order would make them look bad.


1) Merely creating the software/firmware in the first place is extremely dangerous, allowing for a failure point of any engineer or custodian of the firmware to leak it, destroying everything.

2) The strong precedent will be set that governments can get whatever they want from our devices by whatever means necessary, and it will only get worse.


"The cynic in me thinks that this letter is more about brand image."

The cynic in me agrees. My cynic also notes the sudden indifference to corporate power publicly defying a federal judge. I cynically believe that if the shooters had been fundy white supremacists Apple would have quietly dumped the phone long ago.

Cook is no dummy and picks his battles with care. That's why he makes the big bucks.


1) If Apple doesn't fight this, a precedent will be set, allowing for the government to compel companies to release user information or ways of circumventing encryption in the future.

2) The government doesn't need to have access to the firmware for it to become dangerous; the sheer fact that this can be created is dangerous. What's to stop a determined hacker from doing the same thing in the confines of his/her bedroom once it's confirmed that this is possible?


The problem is that once created, it would be easier for future warrants to ask Apple to simply re-perform the same trick it's done in the past. Apple's core argument is that allowing this once opens the door to doing it repeatedly, because right now Apple doesn't have the toolchain to do this. Once the toolchain exists, its deployment is trivial.


So? At least in American jurisdiction, the 4th Amendment doesn't guarantee the right to unbreakable crypto. It says:

"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

Upon probable cause, the government may issue warrants. The US government, backed by the people of the United States, have a right to compel a private corporation to comply with reasonable searches and seizures on probable cause with a warrant.

The government is not asking Apple to deploy nuclear weapons, nor ship all iPhones with a hack. They are specifically asking for help with one vulnerable phone, an iPhone 5C. They may be asked to do this multiple times, but they can keep whatever engineers and tools they use internally private.

I mean, let's get real for a second. The toolchain already exists. Apple has the source code, hardware simulators, debugging harnesses, and the original engineers. There's no magic. As long as those things exist, the danger of a hack getting public is real, especially if the source for iOS is ever stolen, or one of the core engineers goes rogue. If Apple's own internal security can't keep a more polished tool under wraps, they won't be able to keep the subcomponents of it under wraps.

There's a reasonable middle ground between "government has a backdoor and can scan and read everything" and "it's impossible for the government to even obtain legal warrants on probable cause for a very targeted piece of information" What's being discussed in this case is not Snowden-level drag-net snooping. We're not even talking about a wire-tap. We're literally talking about the government finding a Safe/Vault inside the house of a murderer, and talking to the manufacturer of the Safe/Vault to get them to pick the lock without destroying the evidence inside. The Vault-maker in this scenario doesn't even have to give over the blueprints of the proprietary lock mechanism, they just need to open this one vault.


>I mean, let's get real for a second. The toolchain already exists. Apple has the source code, hardware simulators, debugging harnesses, and the original engineers. There's no magic. As long as those things exist, the danger of a hack getting public is real, especially if the source for iOS is ever stolen, or one of the core engineers goes rogue. If Apple's own internal security can't keep a more polished tool under wraps, they won't be able to keep the subcomponents of it under wraps.

> As long as those things exist

This is false. Apple could hand you all of the things you mentioned and you still wouldn't be able to break an iPhone 5C. You would still need Apple's master private encryption key.

The way you are framing the question, the government should force Apple to hand over its private encryption keys. If that's so, should citizens in other countries be wary of the fact that their data stored on Apple's servers is accessible to the US govt? Or should Americans be worried that China can coerce Apple to hand over encryption keys?

In terms of global politics, the vault maker in this scenario has to worry about other strongmen asking for such a master key, once they realize this is possible, when they need to hunt down gay men or something equally "trivial".


No, I'm saying Apple shouldn't hand over their encryption keys. I'm saying the FBI should hand over the iPhone, and Apple hands back the files, but doesn't give them any hacked phone.

In a paperless world with unbreakable encryption, what is the point of warrants or regulations at all?

If a company that say, committed crimes, financial or criminal, has a warrant served on them, what if the response is, "Hey, we'd love to give you our emails, but all employees use end to end encryption, and every desktop has unbreakable filesystem crypto, and our IT department can't unlock anything, so you must compel the users to hand over keys?"

Can that be a defense against all warrants and crimes? If politicians are suspected of accepting bribes with strong probable cause, do we simply accept that their phones and email communications with lobbyists and corrupt bribers can't be accessed?

What does society resort to then, rubber hose cryptanalysis? Imprisonment on lack of evidence until they turn over the keys?

Transparent democracy is on a crash course with cryptoanarchy. The same people who are chanting for absolute unbreakable cryptography are some of the same people supporting Bernie Sanders and would rail against offshore Cayman or Swiss financial obfuscation by the mega rich.

If we want non-corrupt government and industry, we need a way to investigate serious crimes. In the past, this meant seizing papers, letters, and records under warrant. Nowadays, it may be possible for the entire digital crime trail to be unbreakable with no recourse except catching people in the act. However, when the FBI entraps people with sting operations "in the act", civil libertarians decry that too.

So how do we police the bad? If you look at many third world countries with trouble advancing, a lot of it is due to corruption. Is the danger of the government subpoenaing your email worse than the danger of tens of thousands of corrupt businesses spreading financial risk all over the economy and political system?


>If a company that say, committed crimes, financial or criminal, has a warrant served on them, what if the response is, "Hey, we'd love to give you our emails, but all employees use end to end encryption, and every desktop has unbreakable filesystem crypto, and our IT department can't unlock anything, so you must compel the users to hand over keys?"

Are you American or foreign? In the American Constitution, the 5th Amendment legally protects a party from being forced to incriminate themselves. So if you are still alive and slapped with a warrant, your rights protect you from giving up your private key.


I'm aware of the 5th amendment, but prosecutors have worked around it and journalists have gone to jail to protect sources. http://www.rcfp.org/jailed-journalists

Consider the various ways to police crime:

1: Before the fact, preemptively. Active surveillance and-or entrapment. Widely criticized.

2. After the fact, forensic analysis. Previously, physical evidence collected, warrants for documents. In digital realm, foiled by cryptography. Attempts by government to restore status quo to pre-digital capabilities widely criticized.

3. Compulsion. Prosecutors lean hard on individuals with digital evidence to turn over materials. Runs afoul of 5th amendment and civil libertarians.

At least for many types of crime, especially white collar crime, this leaves the authorities almost no recourse. Your politicians can communicate securely with their paymasters, and receive untraceable payments over bitcoin. Although you may find HUMINT witnesses who can give you probable cause, there may be no way to obtain real evidence.

The Silk Road founder was only caught because of active surveillance - they literally caught him with his computer unlocked. This would be like waiting for the San Bernardino killers to unlock their iPhone, and then seizing it before the auto-timer relocked the screen. Not exactly possible for all types of crimes.

Is active physical surveillance of suspects by the state any less creepy than digital warrants?


> The problem is that once created, it would be easier for future warrants to ask Apple to simply re-perform the same trick it's done in the past.

That's not a problem at all. The issue is how the existence of the tool affects warrantless access.

Where did we get the idea that it's bad in itself for law enforcement agencies to be able to break crypto when they have a warrant?


Because breakable crypto isn't crypto?


Outside the All Writs Act, parties generally can't be compelled by the court to perform activities that they don't already perform in the course of business. So this is significant (warrant or otherwise) because it's the court asking Apple to create a process de novo that didn't exist before.


As far as I'm aware, the most proper attacks here are, in order of cost:

0) Find some errata. Apple presumably knows as much as anyone except NSA. Have plausible deniability/parallel construction.

1) OS level issues, glitching, etc. if the device is powered on (likely not the case). Power stuff seems like a particularly profitable attack on these devices.

2) Get Apple, using their special Apple key, to run a special ramdisk to run "decrypt" without the "10 tries" limit. Still limited by the ~80ms compute time in hardware for each try.

(vs. an iPhone 5S/6/6S with the Secure Enclave:)

3) Using fairly standard hardware QA/test things (at least chip-level shops; there are tens/hundreds in the world who can do this), extract the hardware key. Run a massively parallel cluster to brute force a bunch of passphrases and this hw key, in parallel. I'd bet the jihadizen is using a shortish weak passphrase, but we can do 8-10 character passphrases, too. They may have info about his other passphrases from other sources which could be useful.

While I'm morally against the existence of #3, I'm enough of a horrible person, as well as interested in the technical challenge of #3, that I'd be willing to do it for $25mm, as long as I got to do it openly and retained ownership. In secret one-off, $100mm. I'd then spend most of the profits on building a system which I couldn't break in this way.
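
To put rough numbers on #3 (a minimal sketch; the ~80ms/guess figure is from the list above, while the offline cluster rate after a hypothetical key extraction is purely an assumption for scale):

    # Worst-case brute-force times for a few passphrase shapes.
    SECONDS_PER_YEAR = 3600 * 24 * 365
    ON_DEVICE_SECONDS_PER_GUESS = 0.08      # ~80 ms per try, per the list above
    OFFLINE_GUESSES_PER_SECOND = 1_000_000  # hypothetical cluster throughput

    def years(keyspace: int, seconds_per_guess: float) -> float:
        return keyspace * seconds_per_guess / SECONDS_PER_YEAR

    for label, keyspace in [
        ("6-digit PIN", 10 ** 6),
        ("8-char lowercase", 26 ** 8),
        ("10-char alphanumeric", 62 ** 10),
    ]:
        print(f"{label:22s}"
              f" on-device ~{years(keyspace, ON_DEVICE_SECONDS_PER_GUESS):.2g} yr,"
              f" offline ~{years(keyspace, 1 / OFFLINE_GUESSES_PER_SECOND):.2g} yr")

Which is consistent with the point above: a shortish weak passphrase falls quickly once the per-guess cost drops, while a genuinely strong one stays out of reach even offline.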


I really hope they actually physically can't access the data on this phone. It's entirely possible this could be the case -- I've been trying to consider the vectors they could use:

- lightning cable delivered iOS patch (probably won't work because iOS won't negotiate over USB until you tap a dialog box)

- OTA update (not connected to internet)

- Cracking open the device and accessing the storage directly (encrypted until boot time)

The most likely vector I can think of:

- Lightning cable delivered iOS patch from a trusted computer (i.e. one that the terrorists actually owned)

It's quite impressive that Apple is taking a stand like this, though perhaps unfortunate timing WRT the larger encryption debate.


If you look at past cases where the All Writs Act has been invoked, the Courts have rejected this type of government conscription.

Effectively, the government is forcing Apple to take receipt of a device that it does not own or possess, then perform potentially destructive services on that device, and then perform services that could potentially require Apple to testify at a trial under the Confrontation Clause of the Sixth Amendment.

I really think that Apple's in the clear here, and the AUSAs in the case are pulling out all the stops to get Apple ordered to break the encryption.


The fact that they can create this backdoor, doesn't that mean it already exists?

What Apple needs to do then, instead of writing this letter, is release an update that closes this backdoor.


The letter describes it:

"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer."


> The fact that they can create this backdoor, doesn't that mean it already exists?

Quite likely.

As I posted on the other discussion here: https://news.ycombinator.com/item?id=11116343

> If it's possible to make such a "backdoored" build of iOS, then there are state actors who will be throwing $Millions at doing it already, with or without any willing help from Apple.


The security here is Apple's signing key that is used to sign updates. If you believe RSA is safe (which I assume is being used) and the key has a reasonable length, throwing a couple of million at it won't bring you much.

I guess what the FBI wants is a backdoored iOS version and to have Apple sign it with their signing key (which means that the FBI can use it over and over again).
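
For intuition, the trust model is roughly the following (a minimal sketch using Python's cryptography package; this is not Apple's actual update format or algorithm choice, just the general shape of "the device only installs images that verify under the vendor's key"):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def device_accepts_update(image: bytes, signature: bytes, vendor_pubkey_pem: bytes) -> bool:
        """Model of a locked-down bootloader: install the firmware image only
        if its signature verifies against the vendor's public key baked into
        the device."""
        public_key = serialization.load_pem_public_key(vendor_pubkey_pem)
        try:
            public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

So the FBI cannot produce an acceptable image on its own; the realistic shortcut is having Apple sign one, which is exactly what is being asked for.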


That is hard, we know. But it is not impossible for those with time, resources and willingness to think outside the box.

For instance, signing keys can and have been stolen, on the principle of "if you can't brute-force it, hack in and take it".

http://arstechnica.com/security/2013/02/cooks-steal-security...

http://blogs.adobe.com/security/2012/09/inappropriate-use-of...

http://www.androidauthority.com/ssl-added-removed-google-moc...

http://arstechnica.co.uk/tech-policy/2016/02/its-legal-for-g...


That is hard, we know. But it is not impossible for those with time, resources and willingness to think outside the box.

I assume that Apple has a hardware security module for key generation and storage, perhaps even custom-designed and built, to prevent key extraction/copying.

Of course, in the end you have to trust Apple that only a limited number of employees have access to such hardware, that they have proper auditing procedures, etc.

But the probability of a compromise can never be 0, unless you design and produce your hardware yourself, write all code that this hardware runs yourself, and never leave your computing device unattended. Since that is not practical for most people, you have to put trust in some third party.


If there's one thing that we have learned over the last few years from Snowden et al, we have learned that it is safe to assume that these state actors will be trying all the avenues that you or I can think of, and spend years discovering new ones that we have not thought of.


I have trouble understanding your point. What's the alternative to trying to minimize the probability of crypto system compromises?


You make a good case that Apple are far ahead as industry leaders here; that seems solid. It's less solid that this means they are impervious, or that the thinking "I can't see any holes in this process, therefore there are no holes in it" is sound.


That is worth doing; but I still find it quite likely that such a backdoor already exists or is doable.


This. Is Cook's letter an apology for what they already had to do (but could tell no one about)?


[deleted]


Nope. Once this option is there, a single-button iPhone cracker box with a Lightning connector will be available on AliExpress in about 3 weeks.


It's reducing the complexity of decryption (from O(∞) to O(2^n)), so it's clearly a backdoor.


So what would you call it, a back gate? It still loosens the extant security measures, facilitating compromise. That's pretty backdoor-y in my book.


I don't have an iPhone, so correct me if I'm remembering this incorrectly, but aren't they by default protected by a 4-digit numeric PIN? A 4-digit numeric PIN that a brute force attack can be used against is effectively no security / a backdoor, IMO.


There are escalating time limits on incorrect PIN attempts, which is also enforced in hardware by the Secure Enclave in A7 chips and above. This would mean even breaking a 4-digit pin code without that delay being removed would take a long time. Additionally the device may be set to wipe after 10 incorrect attempts.

Attempts | Delay
1-4      | none
5        | 1 minute
6        | 5 minutes
7-8      | 15 minutes
9        | 1 hour

Source: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
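
To see why that schedule matters, here is a rough worst-case estimate (a minimal sketch; the guide only lists delays up to attempt 9, so treating every later attempt as a further one-hour wait is my assumption, and the wipe-after-10 setting is assumed to be off):

    # Worst-case wall-clock time to walk the whole 4-digit PIN space under
    # the escalating delays, assuming every attempt past the 9th also waits
    # an hour and the 10-attempt wipe is disabled.
    def delay_seconds(attempt: int) -> int:
        if attempt <= 4:
            return 0
        if attempt == 5:
            return 60
        if attempt == 6:
            return 5 * 60
        if attempt in (7, 8):
            return 15 * 60
        return 60 * 60  # attempt 9 and everything after it

    total = sum(delay_seconds(n) for n in range(1, 10_000 + 1))
    print(f"~{total / 3600:.0f} hours, i.e. roughly {total / 3600 / 24 / 365:.1f} years")

That is what the requested software would remove, dropping the same search back to minutes.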


Secure Enclave is _not_ hardware; despite being isolated from the main OS and CPU, it is still software-based and accepts software updates signed by Apple.


I believe the default on new installs is now 6, but options for a more complex PIN, either numeric or straight up alphanumeric, are available.


I use an alphanumeric password on all my iOS devices


I've never been an Apple fan but this was a fantastic and bold move by them. Software security and hacking is already an enormous problem that every single person has to deal with. Even major companies like the NYTimes have been hacked by malicious users in the recent past. We need to take every reasonable action to combat this threat. Building deliberate vulnerabilities (yes, every backdoor is a vulnerability) into our software and devices is going to make all of us less safe, and all of us more vulnerable to unforeseeable attacks in the future.


Apple is doing the right thing, the American way (which has been forgotten): putting freedom over security. Thanks, Apple; don't give in. Their 1984 anti-Big Brother Super Bowl ad has finally come to fruition [1].

[1] https://www.youtube.com/watch?v=VtvjbmoDx-I


I think, as a society, it boils down to this: "And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

Can a private, for profit, company deny the will of an elected government working to solve a heinous crime based not on what they say they will do but because they cannot give a 100% guarantee that this is the only time/way it'll be used? Apple acknowledges that the government is saying it's limited to this case but because there's no guarantee (100% certainty) they feel they can deny it?

If yes, what does that mean as a broader precedent. Are we comfortable with private companies denying an elected government based not on what they agree to, but instead because there's a chance it'll be used in other ways?

However terribly flawed one might feel government is, very few would think it has less accountability than a private company.


I see your point. Defying an order from a democratic government is a big deal.

But Apple isn't saying they won't do it. They just want to make it really clear that they're not happy about it and they want the government to change its mind.

The government still has a monopoly on the legitimate use of force.


> Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government. We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country

I read into that that they are in fact fighting this? And I agree, I think it's great that Apple is forcing this discussion into the public domain. These are key issues for us all.


Apple's encryption appears to be done in such a way that government entities can safely use it as well as "consumers." But what may happen is that Apple will be forced to produce two kinds of iPhones: one for consumers, with strong encryption but a "backdoor" for warrant ("cough") based access, and a second type of iPhone for government use (strong encryption, no backdoors).

They may already have this in place now, but what we are seeing now is a show. They are testing how people/consumers are going to react to this situation. Our government probably figures that nobody will care in the end.

In the USA, we have lost our liberty. It's time to wake up and see what is happening. It's getting worse & the people within our government are working hard to enslave us even more.


This is an interesting chapter in the "Tim will never be Steve" saga that so many people are infatuated with.

This particular hill that Tim Cook has decided to defend is as important as anything Steve Jobs ever did at Apple.


This is just huge hypocrisy and full of lies. First of all, Apple CAN attempt to brute force the password. Compiling whatever new firmware is needed and signing it with their keys will not introduce any new backdoor like they claimed and lied to the public - the backdoor is already there, and it is their private keys. Just as that "backdoor" could somehow end up in some bad guy's hands, so could their private keys.

I would agree with Apple if they wanted the FBI to pre-submit all their guessed passcodes for brute forcing, and for Apple to have the sole responsibility for that, so that getting said "backdoor" (which really is nothing more than a door handle) would be as hard as getting their private keys, and governments would not keep the said backdoor in their hands. I would also agree if Apple claimed they don't want to be able to crack devices at a judge's order (although that would be against the law - so they can't claim that).

But this is NOT what Apple said. This whole letter is just one big piece of PR bullshit. They CAN brute force a passcode. They failed to enforce a significant delay after failed passcode attempts - even though this issue had already been known for YEARS (will give a citation if needed) when Apple designed the iPhone 5C in question - and they also failed to require the passcode in order to update the device. They already have their convenient backdoor in place in the form of their private keys.


> This is just huge hypocrisy and full of lies. First of all, Apple CAN attempt to brute force the password. Compiling whatever new firmware is needed and signing it with their keys will not introduce any new backdoor like they claimed and lied to the public - the backdoor is already there, and it is their private keys. Just like that "backdoor" somehow end up at some bad guy's hand, so could their private keys.

Thank you! This is the most important technical point about this whole thing. All the talking about the SE (fascinating as it may be) is irrelevant. All strong crypto that is based on a private key being kept in a secure vault at some corporation does have a backdoor. The keeper of the key can be compelled to use it to sign something.

This is exactly why this would be such a dangerous precedent: the government giving software specifications that then get signed with a vendor's signing key. In this case it's a one-off, but it's a step in the direction of "upload a screenshot of the phone's display every minute to ftp.nsa.gov with your next iOS update". And no Secure Enclave will protect against that. It will just be an OS update signed by Apple.


If I understand correctly, any piece of software that would be used would need to be signed by Apple. Furthermore, the FBI's warrant(?) says specifically that it would only need to work for one device ID. Thus it would be relatively straightforward to create an update for the FBI that could pretty clearly only be used on the phone in question. Unless the FBI had Apple's signing key they could not reuse the software (assuming they couldn't break the bootloader chain of trust, which they apparently cannot if this is the route they are taking). Capability is not the grounds on which they are arguing.

This is a very clearly political refusal. Apple is saying that about as explicitly as they can in this message. Whether or not they can do it, Apple doesn't want to be caught in the game of being a government surrogate or having to determine for themselves if government requests are legitimate (imagine, say, if the Chinese government asked for data from a dissident's phone - would Apple want to risk that market by denying a request that they have complied with in the US?). It's unfortunate for them that the FBI is making this request while people still own phones like the 5c for which they could theoretically disable security features, as opposed to the newer phones which it is possible they are completely unable to defeat.
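
A purely hypothetical sketch of what "only works for one device ID" could look like inside such a signed build (the identifier, names and settings here are illustrative, not taken from the order or from iOS):

    # Illustrative only: a gate in the custom build that keeps the normal
    # protections on every device except the one named in the order.
    TARGET_DEVICE_ID = "hypothetical-identifier-of-the-subject-phone"

    def security_settings(device_id: str) -> dict:
        if device_id == TARGET_DEVICE_ID:
            # Only for the subject device: no wipe, no delays, and passcode
            # guesses accepted electronically.
            return {"auto_wipe": False, "escalating_delays": False, "usb_passcode_input": True}
        return {"auto_wipe": True, "escalating_delays": True, "usb_passcode_input": False}

The catch, raised in the replies below, is that such a check only holds up as long as the image's signature does: the signing key, not the device-ID check, is doing the real work.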


> would be relatively straightforward to create an update for the FBI that could pretty clearly only be used on the phone in question

How? Wouldn't the FBI reverse engineer it?

It seems like Apple would have to update all phones to a new version (which would prevent the exploit provided to the FBI) before handing it to them.


It's not that reverse engineering is difficult - the FBI could theoretically do that now. It's that it needs to be signed to be accepted by the phone.

Edit: see, for example, here: https://stratechery.com/2016/apple-versus-the-fbi-understand...


I know that because this is on Hacker News, everyone is talking about whether it's possible to access the data, and if possible... how easy or how hard it is. But the focus should be on whether there should be a clear line/understanding between security and privacy, or whether we should keep everything black and white as it is now, just looking at extreme cases.

If they cannot co-exist, I'd rather have more security and less privacy. But ideally, I shouldn't have to choose between them.


I don't really see how you propose to separate these though.

If you have information about something you're planning that harms my personal security, there's always going to be a necessary tradeoff between your privacy and my security.


Link to the FBI order: https://assets.documentcloud.org/documents/2714001/SB-Shoote...

(Edit: deleted part where I was wrong. Thanks robbiet480 for correcting me. It's 2am here and I was tired.)

Also, prediction: if Apple refuses to build a brute forcer, someone else will do it and sell it to the FBI. Just wait and watch.


iPhone brute force hardware already exists [1]. The issue is that when Touch ID and/or a passcode is enabled, the device locks itself for anywhere from a few seconds to a few hours every time an incorrect PIN is entered. So brute forcing would take an extremely long time.

In addition, there is a setting on all iPhones to erase data after 10 failed pin code entry attempts.

The FBI wants Apple to provide a custom iOS build that can be installed on the device that allows for remote (over the network) brute forcing with the increasing timeout/erase data protections totally disabled.

1. http://techcrunch.com/2015/03/19/iphone-bruteforce-pin/


But how can the custom iOS build be installed on the device while it is locked?


"The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI."


DFU/Recovery wipes the device.


You are of course correct. I edited my comment.


Also disables the auto-lockout on entering the wrong PIN enough times, as well as the delays after the 3rd (IIRC) incorrect attempt.


>Also, prediction: if Apple refuses to build a brute forcer, someone else will do it and sell it to the FBI. Just wait and watch.

Just made a downvoted to death comment on this very same thing. This is literally a government creating a market situation.


If you oppose this, please let President Obama know. The FBI is part of the executive branch of the government, and as such, directly reports to the president. In other words, if he tells them to stop, they must comply. Please register your complaint here:

https://www.whitehouse.gov/contact

Here is my letter to them:

Dear President Obama,

I've voted for you in both elections, and have been a firm supporter of all your causes (the Affordable Care Act, and more). However, your FBI has clearly overstepped its authority by demanding that Apple spend engineering resources building a software product that can break the encryption of a terrorist's iPhone.

Seriously, you need to stop this. You are the head of the executive branch of the government, of which the FBI is directly underneath your jurisdiction. Director James Comey is directly within your chain of command.

What the FBI is asking for is a master key to be created that can decrypt any iPhone. This makes all Americans with Apple devices insecure in the face of threats to our personal security and privacy. I hope you can understand that this is clearly unacceptable, and needs to be stopped.

I want to register my complete opposition to the FBI in this circumstance. Please stop this.

Thanks, xxxx


>Seriously, you need to stop this. You are the head of the executive branch of the government, of which the FBI is directly underneath your jurisdiction. Director James Comey is directly within your chain of command.

You don't need to tell the man what he can do in his capacity as president, he knows what he can do. Identify the issue clearly and concisely, but don't try and tell him what he can and cannot do. Whether you intended that or not, that's how it comes across.

> What the FBI is asking for is a master key to be created that can decrypt any iPhone.

Not true, they're asking for a way to submit guesses electronically and the removal of the auto-wipe and 10-guess limit. If you are going to request help on an issue, ensure you know what the issue is.

I wouldn't send this message in, and I'd suggest you restructure what you're writing because this is simply not effective, and if other people begin sending the same message in it's going to make everyone who opposes this issue look misinformed with a superiority complex (that is how the entire second paragraph comes across).


The important thing is to voice your disapproval. I'm confident that my wording could be improved, but it's important that Obama quantify the sheer number of US citizens that disagree with this despicable act.


The important thing is to voice your disapproval in a manner that a) identifies the issue and b) identifies your stance on the issue. Your comment misses the mark on a) by being downright wrong, and your stance on b) comes across as you believing you know more about these matters than the people executing them. The fact that you come across that way without having a clear understanding of the issue is going to lead them to dismiss your disapproval.


The important lesson here is that it is time to design the next phones so that it is either impossible to install a software update without unlocking the device, or the auto-erase functionality is implemented in hardware.

That way for future phones at least, the issue would become moot: there would be no way for Apple to build and/or install a custom software image that allows brute-force password cracking.


Already there with 5S and above.


I'm no security expert, but how would Apple access previously encrypted data with a different version of iOS? Doesn't having that ability imply they already have a "back-door"? Could someone explain what I'm missing here, or is it more that that would be a one-off solution and the FBI is asking for a global, remote, no-Apple-needed solution...


If you read the letter carefully, you'll notice it talks about brute-forcing the password. I guess the assumption is that the password is brute-forceable, but the FBI is unable -- or unwilling -- to write their own software to do it. This seems surprising to me; you'd think the FBI could employ a few hardware hackers for this. Maybe they are only pushing the issue for political reasons -- actual jihadist terrorist attacks on US soil don't happen very often, better milk it for all it's worth?


I remember the old dead community of OpenSource Phone - http://wiki.openmoko.org/wiki/Main_Page


if you are interested in the technical details of iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf


Couldn't be more glad to see Apple's stand on this. Let's not forget what happened with Blackberry some 5 years ago: India, Saudi Arabia and the UAE got monitoring access to its platform:

1)http://timesofindia.indiatimes.com/tech/tech-news/telecom/Go...

2) http://www.reuters.com/article/us-blackberry-saudi-idUSTRE67...

3) http://www.thestar.com/business/2010/08/16/threats_of_blackb...


No sympathy for terrorists, no sympathy for weakening encryption.

I can understand someone outside of tech not understanding how those are comparable statements, but if anything the latter is more important.


OT but this post is well on its way to becoming the most popular post since Steve Jobs died.

https://hn.algolia.com/?q=&query=&sort=byPopularity&prefix&p...


Would this really be true or is this just a decoy, to let you believe there is no backdoor?

I do believe there is no backdoor for when a city court requests it, but I don't really believe that the FBI or CIA doesn't have access to one.

Considering that the iPhone has already existed for a long time, they must have some means to backdoor "iCloud"...


Massive props to Apple, again I am impressed by their commitment to customer privacy.


If I were Cook, I'd draw a line in the sand: if we are forced to comply, we exit the phone business, because we won't make phones that compromise our customers' security.

But that would take more balls than anyone left here in this "Land of the free and home of the brave" seems to have anymore.



And, his letter is still posted on his web site, describing what he went through:

http://lavabit.com/


Thank you, and I'm proud of Lavabit. They're at the top of my Balls list now.


Okay so ignoring the fact that this would be insane from a business perspective.

It also doesn't make sense from a game theory perspective. If you're a Good Company that's going to fight on this issue, it makes sense to stay in business as long as possible to be a pain in the rear for authoritarian jerks. If every company willing to stand up to these guys went out of business immediately afterward, you'd only have the unwilling ones left over.


It's no longer a business issue, and if it is, then Apple is just posing.

Apple's positive effect on our economy, our technology, and even our nation would make such a line one that our government would have to think long and hard about before pursuing its demands.

Imagine if Apple, and the tech community as a whole, stood behind that decision. Of course they won't, because they'd rather be rich than free.


>Imagine if Apple, and the tech community as a whole, stood behind that decision.

You didn't counter his point. Again, it's insane from a game theory perspective: it would require all players in the tech community to participate to be effective, whereas if one defects, it now has access to a huge market where all the incumbents are gone.

Apple doing this would just mean "someone is going to create backdoor'd phones and capture the market, just not us", and we would get nowhere.


Yes, it would be awesome.

> they'd rather be rich than free

But don't most people want to be rich so that they can be free?


The board of directors would string him up. And if they didn't, the stockholders would. No way would Apple give up the phone business.


Apple could also put the software under a free license. The community could remove any backdoors that Apple is required to put in. They could still sell the hardware if they wanted to keep making money, just like Google does now.


He'd be fired immediately.


> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

They really need to put that paragraph closer to this one:

> The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The first paragraph without the second implies that iOS isn't actually secure at all.


I see a lot of discussion about "Secure Enclave" and other hardware security features and such, and I'm not sure I see the relevance. Assuming that the data has already been properly encrypted, stored on disk, and purged from memory (by shutting down the phone) by a version of iOS that did not already contain a backdoor when the data was encrypted, there's no magic combination of hardware and software that can decrypt that data without the password, right? This seems to be supported by Apple's claim that the best they could possibly do is provide a channel for the FBI to brute force the password.

So am I missing something that makes the iPhone's internal security architecture relevant here?


Yes. The target phone, an iPhone 5C, lacks a Secure Enclave.

The password retry delay, and subsequent deletion of keys, is enforced by iOS here. Apple could provide some kind of software to allow for unlimited attempts (and an interface to do so in an automated way, which the FBI is specifically asking for).

On newer phones, the Secure Enclave contains the keys, and enforces both the retry delay and the deletion of its contents. There isn't a way around this without also upgrading/flashing the SE system (and it isn't even clear if Apple can do this in a way that preserves the keys).
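
To make that concrete, here's a rough sketch (Python, purely illustrative; wipe_file_system_keys is a made-up stand-in) of the kind of policy iOS itself enforces on the 5C:

  import time

  MAX_ATTEMPTS = 10        # once exceeded, the data is effectively gone
  failed_attempts = 0

  def wipe_file_system_keys():
      """Hypothetical stand-in: discard the keys protecting the user partition."""
      raise NotImplementedError

  def handle_wrong_passcode():
      global failed_attempts
      failed_attempts += 1
      if failed_attempts >= MAX_ATTEMPTS:
          wipe_file_system_keys()
      else:
          # escalating delay between guesses, capped at an hour
          time.sleep(min(2 ** failed_attempts, 3600))

On the 5C this check lives in iOS, so a re-signed OS image can simply omit it; on later devices the equivalent logic runs inside the Secure Enclave itself.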


Ah, I see, so with newer hardware it wouldn't even be possible for Apple to enable brute forcing with a software update. But even with older hardware, presumably allowing brute forcing is still the worst they can do, right? (Assuming no prior backdoor exists.)


I think it's outstanding that Apple is standing up for this.

Will they, or can they, do anything about data in iCloud as well? While you can turn off iCloud, I'd guess the majority of people are using it. Given that you can access much of it at iCloud.com, it would seem that, whether or not you can unlock an iPhone, most customers' data is available directly from Apple: mail, notes, messages, photos, etc. No idea about other apps' data that gets backed up.

Again, I'm applauding Apple for standing up for encryption. If they could somehow offer the same on iCloud, I'd switch from Google (Google is not encrypted either; my point is I'd switch to a service that offers it).


Apple does deserve the respect they're getting for standing up to the government about this. They're absolutely right that this is an attempt to fatally undermine security for a whole host of devices, and it sets a disturbing precedent.

What I do find interesting is that Apple isn't the first manufacturer the government has ordered to crack a device. An "unnamed smartphone manufacturer" was ordered to crack the lock screen on October 31, 2014.[1] No one made a fuss then, so someone caved.

[1] https://en.wikipedia.org/wiki/All_Writs_Act


It makes sense for them.

If they put a backdoor in the iPhone for the US government, they are effectively thrown out of the Chinese market.

Interestingly enough, what will Apple do if the Chinese government demands they decrypt or put in a backdoor in exchange for staying in that market?


I was about to mention the Chinese case, the Chinese government asking foreign companies to install means of control and access in their products.

Does that mean that Apple will not provide such means, or will disable security, for phones sold on the Chinese market? That would be surprising given the potential size of this market.


They will probably produce some China-only version to please the government. An open letter like this would instantly end their relationship with the government; they won't even attempt it.


I thought they already did this, but I have no proof. It seems reasonable because Google got evicted from China for not playing ball, so I find it hard to believe that Apple would get a free pass.


I don't know if they'll get a free pass or not, but Apple is a lot more popular among the Chinese youth/middle class/intelligentsia than Google. They may have a bit more sway.


Just so you know: when I forgot the passcode to my iPhone, I remembered that I chose 1 from the top section, 2 repeated numbers from the row below, 1 below that, and then a zero. So I tried combinations until it said "iPhone disabled, connect to iTunes"; this is when I found out you can reset the disabled timer by clicking the backup button on a computer. Therefore you could successfully create a program that tries passcodes 5 or 10 times, then tries and fails to back the iPhone up. (There is no need for a backdoor or anything fancy.)
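
Assuming the backup trick really does reset the lockout (I haven't verified it), the loop would be trivial to script. A sketch, with both helpers left hypothetical:

  from itertools import product

  def try_passcode(code):
      """Hypothetical: enter one guess on the device; return True on unlock."""
      raise NotImplementedError

  def reset_lockout_via_backup():
      """Hypothetical: start and abandon an iTunes backup, which supposedly
      resets the 'iPhone disabled' timer."""
      raise NotImplementedError

  for i, digits in enumerate(product("0123456789", repeat=4)):
      code = "".join(digits)
      if try_passcode(code):
          print("found:", code)
          break
      if (i + 1) % 5 == 0:      # clear the lockout every 5 failures
          reset_lockout_via_backup()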


A phone without a backdoor would be illegal in the UK once the Snooper's Charter comes in to full effect. I'm very interested to see how the UK government will react to Apple's stance.


The All Writs Act is a United States federal statute, codified at 28 U.S.C. § 1651, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."

Well, as far as I can see, it is not agreeable to the usages and principles of law to force a company (or a person, since corporations are people) to spend money and waste resources compromising its own security systems, which happens to be something it morally objects to.


  While we believe the FBI’s intentions are good, it would
  be wrong for the government to force us to build a 
  backdoor into our products. And ultimately, we fear that 
  this demand would undermine the very freedoms and liberty 
  our government is meant to protect.

  Tim Cook

Kudos to this guy for standing up for an idea.

Now, on a practical note, this is about security: providing a digitally secure platform to both users and providers, preventing tampering, and keeping data secure.

Microsoft could take a cue.


The court order says Apple should make the software only work on the specific phone in question. Nobody could modify the software to work on other phones any more than they could make the changes to iOS themselves.

Apple is misrepresenting the situation and perhaps it's because they're afraid that in the future the government will come knocking again, but I think it hurts them to not be completely above-board about this.


That's not really possible. There isn't a mechanism to create this condition: "Nobody could modify the software to work on other phones"

Tim Cook is right. Once this is unleashed, there are no limits and all iPhones are insecure.


Thanks for the heads up about these details. As you mention, if you agree to one request you agree to all requests, plus you prove you can do it. A nice side effect of agreeing to this request would be to push an update that closes all the loopholes that allowed the request to be fulfilled in the first place.


I'd love to know the names of the people within the FBI who are pushing this agenda. The only way this foolishness is going to stop is if those people are out of a job.


I'm sure this backdoor request is fully supported by Obama as well as Hillary.


I don't quite understand - what is the actual purpose of being able to push a new version of iOS while locked? Apple doesn't seem to use this - people stick to whichever version they're comfortable with on old devices and accept whatever limitations - so why does the functionality even exist?

Even with the restriction of being plugged in, outside of Apple who needs to push iOS versions at tethered devices and will be hindered too badly by having to unlock them first?


This is the only acceptable response.


Even if Apple created a backdoor, how are they going to install it on a locked phone? Are locked phones able to update without access to the internet or the user's passcode?


I'm curious: is it likely that Apple was under a gag order regarding the backdoor proposals/discussions?

I've always wondered why large tech companies/corporations abide by such orders instead of speaking out. Even if Apple was under a gag order, they've created a PR nightmare for the alphabet agencies; Apple could be pursued in court, but that pursuit would now likely be done in the face of negative public opinion.


Why is there very little talk about the First Amendment in this whole discussion? They are asking Apple to write custom software.

The Supreme Court has ruled in separate cases: 1. that software is speech, and 2. that a person (corporations are people, according to them) cannot be compelled to speak.

It would seem to me that the FBI could perhaps subpoena technical documentation from Apple, but it should be required to hire its own developers to write this software.


The easy solution to this is to have the government send Apple the phone. They break into it themselves and then hand back the phone with the passcode turned off and whatever software they needed to install to do it removed, leaving no trace of how they actually did it.

Win/Win

No software backdoor is created, the FBI gets its data and we all go on with our lives. Why are we spending so much time gnashing teeth over something that has a very simple solution to it?


Chain of custody. The government would need to be able to prove, in a court of law, who touched it and when, for what purpose, etc. Handing it off to Apple in good faith would make it inadmissible as evidence in a trial.

You could argue that Apple could allow an official to be present while work is being done on the phone, but then Apple risks having its (to-be-developed) method viewed and/or captured by said official. Very risky.


A possible compromise would be to add a backdoor to the security module that would unlock the phone in exchange for a proof of work.

It would be relatively easy for the chip to offer a challenge and accept, say, a $100,000 proof of work to unlock the phone. This way, we prevent bulk surveillance but still allow the government to access high value targets' devices.
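
A rough hashcash-style sketch of what such a challenge could look like (the difficulty here is tiny; a "$100,000" unlock would need vastly more leading zero bits):

  import hashlib, os

  def solve_pow(challenge: bytes, difficulty_bits: int) -> int:
      """Find a nonce so sha256(challenge || nonce) has difficulty_bits leading zero bits."""
      target = 1 << (256 - difficulty_bits)
      nonce = 0
      while True:
          digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
          if int.from_bytes(digest, "big") < target:
              return nonce
          nonce += 1

  challenge = os.urandom(16)        # would be issued by the secure element
  print(solve_pow(challenge, 20))   # ~a million hashes at 20 bits of difficulty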


Backdoor is somewhat of a misconception. What they want are two front doors: i.e., we encrypt your message with the recipient's public key, and we make a copy encrypted with our (in this case Apple's) public key. We send both messages over the internet, and Apple or your ISP/cell service provider (we can also assume NSA PRISM has it too) stores the Apple-keyed message, or both. When the government wants access, they can issue a subpoena to the ISP/cell provider for the encrypted data (or just download it from Saratoga Springs), then issue a warrant to Apple to decrypt it with their private key. This is likely the only reasonable and responsible outcome I can see resulting from this debate. Or, pessimistically, it becomes political fodder and we leave it up to politicians who have little to no understanding of the technology to devise some technologically inept solution.
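
For what it's worth, the "two front doors" idea is just encrypting the same plaintext to two public keys. A minimal sketch with PyNaCl sealed boxes (the key names are made up for illustration):

  from nacl.public import PrivateKey, SealedBox

  recipient_key = PrivateKey.generate()   # the person you're messaging
  escrow_key = PrivateKey.generate()      # Apple / carrier / whoever answers warrants

  message = b"meet at noon"
  to_recipient = SealedBox(recipient_key.public_key).encrypt(message)
  to_escrow = SealedBox(escrow_key.public_key).encrypt(message)

  # Each copy opens only with the matching private key:
  assert SealedBox(escrow_key).decrypt(to_escrow) == message

Which is exactly why critics call this a backdoor anyway: whoever holds the escrow private key can read everything, warrant or not.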


What's the potential fallout on this case? I assume Apple is appealing the ruling - what happens if the ruling is upheld and Apple refuses to comply (unlikely IMO, but what if)? Could the DOJ target individual Apple engineers and order them to do it or face contempt of court charges?


Yeah, even if the DOJ ordered Apple the corporation, ultimately they would still have to force one or more Apple engineers to perform the task. That seems messed up. Would the Apple engineer at least have the option to quit?


> Yeah, even if the DOJ ordered Apple the corporation, ultimately they would still have to force one or more Apple engineers to perform the task.

More likely, they would impose consequences against Apple-the-corporation were it not to cause the task to be performed, rather than forcing an Apple engineer to perform the task.


Their iMessage encryption is fascinating. It basically makes it impossible to retroactively decrypt iMessages. With a court order, they can start MITMing conversations, but unless they intentionally generate a MITM keypair they are cryptographically locked out of the conversation.

http://techcrunch.com/2014/02/27/apple-explains-exactly-how-... (Link to Apple's paper is in the article)

(Yes, Apple could add this key for everybody at the beginning, but if their intention is security then it is a brilliant system.)


I hope apple employees + executives read this:

I am now officially an Apple fanboy. That's right, I'm gloating to family and friends about how Apple is standing up to the man, doing the right thing, and refusing to compromise their security.

Keep up the good fight.


> And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Good one, Tim! I mean, how long did law enforcement think they could abuse the Constitution, put spy devices on people's cars without a warrant, use stingrays, and do all sorts of other crazy stuff, including planning and executing false-flag attacks, without any consequences whatsoever? At some point, we the people - for good reason - will lose any and all trust we have in them. And that's what Tim is saying in this one sentence, which, given the overwhelming evidence, the US government would have a hard time arguing against!


Huge respect to Tim Cook for standing up for the personal information security of Apple's users around the world. When a non-technical person demands something as stupid as a backdoor, they do not acknowledge how much they weaken data security.


Privacy is obviously the foremost issue at hand with the Government's request here, but there is also a huge potential impact on the future of the iPhone software. There is a huge difference between granting access to a user's data at the Government's request vs demanding a customized build of the iPhone's OS. Imagine the long-term implications of having a third party tether its misaligned feature requests to every OS update that the iPhone makes. What would be the continued relationship between Apple and the agency behind this? Would this evolve into something analogous to HIPAA compliance?


> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The scary part here is that the iPhone data is really not that secure. If Apple can overwrite the OS and get access to the data, this means the keys are stored on the phone somewhere, and are not password-protected or "fingerprint"-protected.


Given the way a lot of people (and the media) tend to go completely bonkers when somebody says "terrorist", this is commendable.

It remains to be seen, though, what Apple will actually do, in legal terms. Will they flat-out refuse to cooperate, even if this means that they will be fined or Mr. Cook will be imprisoned for contempt or something like that? Will they actually send their lawyers to challenge the court decision? That would be very interesting to watch, and if they succeeded, it would create a precedent for a lot of other companies. But so would their failure.


To play devil's advocate:

Mr. Cook expressed concern that "the government could intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge".

As I read this I wondered, "what harm would actually happen if that occurred"? If the government did read my messages and get my health records & financial data and track my whereabouts, I can't think of anything bad that would actually happen as a result of that.

Is there anything specific that I should be worried about in that scenario?


"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say"


I feel like Apple is intentionally oversimplifying it for the purpose of this letter, or maybe to push back on the FBI's ask more easily.

Apple could propose to secure access for the FBI using the same level of security that it uses to protect access to the phone's content for the owner of the phone himself. Tim Cook only talks about one solution: a "tool" that it could install.

If the same level (and method) of security is used then saying that there is a risk of the backdoor being hacked would be equivalent to saying that there is a similar risk of the user access being hacked.


I don't know if there can be a practical solution based on what you said. If an FBI agent could do it, he could lose that methodology by getting hacked, or he could leak it on purpose with malicious intent (theoretically speaking).


Wouldn't one assume that once the phone is powered up, there is some kind of code, at startup or on a schedule, that queries an Apple update server for updates, fixes, etc.? At that point it is reasonable that a company such as Apple could force certain updates onto the phone whether the customer wanted them or not. All Apple would have to do is direct the phone to a phony update site (for this IMEI only) containing code that would dump RAM to an outside server. No other phones would be affected and the data would be retrieved. World saved!


I wonder how much of that was personally written by Tim Cook, vs. various other people within Apple (I'm sure legal, PR, product, etc. all had input, but this feels like something he wrote himself.)


This is probably the reason why someone else wrote it. Writing in "your voice" but without filler words is incredibly hard. But then, it's not the CEO's job to be a good writer.


It absolutely is the job of the CEO of the world's (second) most valuable company to be a good writer.


Being a clear thinker (the role of the CEO) and being a concise writer are two different things. The latter requires a lot of training. The CEO of the world's (second) most valuable company must be able to clearly articulate his goals and his vision, but he doesn't have to be a wordsmith to put his vision on paper.

Writing != thinking != talking

Writing might make you a better communicator in general, maybe a better thinker, too. But clear thinking and talking don't make you automatically a better writer.


I think CEOs (and even senior managers) end up communicating in writing to most of their teams most of the time, so clarity in written communication is essential to top tier people.


While I think protecting user data is important, I don't understand what the fuss is about. Anyone could (given technical knowledge + tools) take apart a phone, pull the encrypted data out of storage, and then brute force the encryption on a large machine.

The FBI doesn't need the modified iOS code, and whether Apple writes it or not doesn't change anything in the end, since someone else could just as well write the software with some reverse engineering.

[edit: if you downvote because I'm wrong, please explain because I'd love to know why]


Wouldn't one assume that once the phone is powered up, there is some kind of code, at startup or on a schedule, that queries an Apple update server for updates, fixes, etc.? At that point it is reasonable that a company such as Apple could force certain updates onto the phone whether the customer wanted them or not. All Apple would have to do is direct the phone to a phony update site containing code that would dump RAM to an outside server. No other phones would be affected and the data would be retrieved.


This is ugly. If Apple can indeed break into the phone, they need to say, "We have to stop production now. All of our engineers will need to be behind this. It will cost us at least a billion dollars, if we can do it. We will miss deadlines for new products and software. Write us a check for $1 billion and we will start on it. We may need a few billion more. Write the check - we'll do what we can do. And let's hope we don't accidentally destroy the evidence doing it."


It's almost certain that Apple has helped the American government violate customers' privacy before. This looks to me like just a marketing stunt for the post-Snowden era.

If Apple cared about customers' privacy and security so much, how could they sell non-free software that is hard to audit, computers with a baseband processor, and services that rely on a central server, which is a single point of failure?

My understanding is that Apple customers don't much care about their own privacy and security, but they do have a weakness for marketing.


Can someone explain this to me? The FBI requests a new version of iOS to be installed on a single phone that was involved in the attack. What, exactly, does this mean? If the phone is locked, how will they install new software on it without unlocking it? People are suggesting an update to iOS that will get pushed out to all users but contain a backdoor that is specific to that one particular device -- but how will the new iOS version be installed without unlocking first?


I always wondered why Apple took so much trouble with the secure enclave design; I thought it was overkill. Now I see it was really necessary for cases like this.


This is why technology companies have to go farther than implementing proprietary security systems: They have to put the capability to circumvent security out of reach of themselves.

Real data security has to be a mix of services that are friendly to reliable key exchange and strong unbreakable encryption, and verifiably secure endpoint software, which in practice means open source software where the user can control installation, that implements encryption.


Can someone explain this to me: if the data is encrypted, how does switching the operating system out enable one to read the data? I'm a layman in this area but I can only surmise that the data is stored unencrypted and it's the operating system itself that's somehow locked. If a change of operating system can open up encrypted data, then what's the point of encrypting hard drives or data sent over a network?


It doesn't. Changing the OS allows removal of the anti-bruteforcing feature(s): the delays between attempts, which increase exponentially, and the limit of 10 attempts after which the encryption key(s) are deleted, and with them, in effect, all user data. This may not be possible on iPhones with a Secure Enclave, though, since the anti-bruteforcing is in part built into the Secure Enclave; but the iPhone 5c in question in this case doesn't have a Secure Enclave, so it might be possible. Further, they want an automated way to iterate over passcodes. Basically they want a backdoor to make it easier to brute force the phone by guessing the passphrase.

So there is nothing for Apple to hand over. There are no actual keys (they're on the phone itself in a practically inaccessible way). The court order in effect orders them to write a derivative OS without the bruteforce-inhibition features. They probably can do that, and it probably isn't burdensome, but is it legal to compel a company to write code? Can a court order you to write a book? Or a letter? They can make you turn over facts or evidence, but it's specious to claim they can make you create something, even if you have the capacity to create it.
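
In other words, once the inhibition features are gone, the "attack" is just enumerating passcodes. A sketch, assuming the electronic guess-submission interface the order asks for (try_passcode is hypothetical):

  from itertools import product

  def try_passcode(code: str) -> bool:
      """Hypothetical: submit one guess over the requested electronic interface."""
      raise NotImplementedError

  # 10,000 four-digit candidates; at roughly 80 ms per hardware-bound
  # key derivation, that's well under 15 minutes of total work.
  for digits in product("0123456789", repeat=4):
      code = "".join(digits)
      if try_passcode(code):
          print("passcode is", code)
          break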


• Can Apple upgrade iOS on a single device that is locked, from a new untrusted laptop without wiping it?

• Can Apple OTA upgrade iOS when the device is locked?


If they provide to the government what the government wants now, next year the government will come back with an even more ridiculous request. Mr. Cook is right - it'd be great if we can avoid creating a precedent.

Oh wait, they already did, by providing their clients' data. Trying to stop the government now is like trying to stop a high-speed train. Still, good luck to them! Good to know they are not just pushed around without any resistance.


Wow, this made my day. I think my faith in Apple's privacy concerns got a much needed revitalization. Privacy and encryption are the number one reason I stick with the iPhone and with a Mac running FileVault. It was always hard to completely trust them after PRISM. However, that was arguably a different Apple.

This stance against the government reaffirms my faith in the genuineness of Apple's encryption efforts and in Tim Cook specifically.


If this is your main reason for choosing an OS, you clearly should use Linux then ;)


I prefer FreeBSD :)

No code from [redacted] makes it an excellent choice for the privacy conscious.

I use Slackware in the cases where I need a Linux kernel. I think that might give you an idea of what I'm trying to avoid. [redacted]

Anyways, I use *BSD daily in VMWare Fusion for any development that isn't related to iOS. I also do my email and web surfing in OS X because it's simply more pleasant.


Can't they dump all the data from that particular device and then send it to the FBI? Maybe the judge will order that? Obviously they're confessing they can break the encryption but they would not do it, on principle. I don't see how they can win this fight. If it's the iphone of the shooter, and they can decrypt it, they should do it. It is not the same as to give the FBI a tool to unlock any iphone.


I heard this morning on (semi-conservative FM) radio that this is a national security issue, and that Apple is helping terrorists by not bypassing this.

I don't get it - the shooters are dead. How is what's on their phone a matter of national security? We probably have 99% of the information we'll ever have on them. There is no larger plot. I cannot imagine that not having what's on this device puts anyone at risk.


> Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

It's all about precedent.


Surprising the FBI doesn't have a division of highly paid individuals who can crack iPhones... There are plenty of people online with a vested interest in this topic whom I'm sure you could hire to help.

My guess is that this is more about pushing back the law and people's rights than it is about getting access to this device.

But then I'm highly cynical about what the government claim they can do with technology for obvious reasons.


While basically being on Apple's side here: as I understand it, jailbroken devices run unofficial builds of iOS that have some security features removed (e.g. limits on which apps can be installed).

Is it not possible for law enforcement to get what they want from that, if all they want is a custom build of iOS that can be hacked around? And why is it even possible for that to work if the data is supposed to be kept secure?


I think jailbreaking requires you to erase the phone first.


Most previous jailbreaks required an unlocked device with the passcode disabled and Find My iPhone turned off (because the passcode encrypts things).

I'm still waiting for a JB for 9.2.1 or 9.3 when released, but there are already semi-jailbreaks (browser-based; installs a temporary app) and some unreleased PoCs. Cydia, MobileSubstrate, and other tools need to be ported / verified too.

Perhaps if Apple allowed the devices to be officially customer hackable (like flux, springboard replacements, transmission, 3G unrestrictor and changing fonts), there would be less need to develop exploits... Unfortunately, there is great demand from governments to buy exploits and keep those secret (not a conspiracy but tools in a market)


This was his employer's phone, right? As in, it was government-owned property being used in the course of terrorism. Were they using Apple's Mobile Device Management (MDM) framework or some other form of key escrow? If not, why should Apple bail out a government entity, at the expense of its own customers and security, that couldn't even be bothered to follow best practices?


There is one tool to brute force an iPhone, called IP Box. It's a hardware device which can brute force a 4-digit PIN in ~111 hours. http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-screenlock....

But it only works on iOS 8.1 or earlier; it was patched in iOS 8.1.1.
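
Rough arithmetic on the rate that figure implies:

  # 10,000 four-digit PINs in ~111 hours:
  111 * 3600 / 10_000   # ≈ 40 seconds per guess (the box reportedly power-cycles the
                        # phone after each failure so the attempt never gets counted)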


Does anyone know what the consequences for Apple will be if they keep refusing, but the courts say they must?

Massive fines? (we know they have the cash to cover it)

Jail time for execs (whoa!)

?


This is now the most-upvoted story HN has ever had.


And apparently #2 is also by Tim Cook? http://www.hntoplinks.com/all


That's #4, behind 3078128 and 3742902 (though that one cheated).


This is a very disappointing letter for me. It means that Apple can indeed build a backdoor into existing phones; they just don't want to do it (or so they say). I was under the impression that Apple employs security hardware which protects keys and makes it impossible to penetrate that defense. If that's not the case, iOS security is not as good as it could be.


Other commentary suggests this capability is targeted at the iPhone 5C, and that at least some newer models do have protections against this technique.


On their newer models, yes, there are hardware-based protections.

The guy in this case didn't have the newest model iPhone; he had an older 5C which is at least theoretically vulnerable to Apple being forced to push a bad software update to it.

It's hard not to see the effects of these things showing up in the way they design each successive generation of iPhones; each time, they add something which gives them more ways to say "nope, technically impossible to do what that court just ordered us to do".


This is all brave talk until they publicly say the same thing to China; until then, it's political bluster. http://qz.com/332059/apple-is-reportedly-giving-the-chinese-...


Good on them. I was hoping that they'd be able to manage a way to unlock this one without potentially breaking the whole model (by exploiting some bug in the presumably outdated version installed, or something that wouldn't degrade the overall security model), but given that that's not the case, I think they're making the right choice.


I'm betting there are similar vulnerabilities in the current "Apple doesn't have the keys" versions of iOS and the hardware. For instance, do a similar mandated firmware update to the secure enclave, and now you get unlimited guesses at a PIN.

edit:

Ah, I've found a couple of sources claiming that the secure enclave wipes its keys if its firmware is updated. Makes sense.


It may well be that Cook's stand will soon become unworkable in the US. The US is always at war, after all, at least effectively. I wonder if Apple would just leave. It's already earning ~60% of revenue outside the US, after all. And hey, it's sitting on tons of offshore cash. Maybe it could build its own country on an unclaimed reef somewhere.


It is worth pointing out one salient fact: the phone in question did not belong to the shooter, it belonged to the shooter's employer, which in this case is the county government. That makes Apple's position much less tenable because the owner of the phone is (presumably) consenting to -- maybe even actively encouraging -- the recovery of the data.


What I think Apple should (also?) do is appeal to both the law enforcement themselves, and the government - basically go "All secret communications from law enforcement and government figures - up to the President - would be at risk", or something to that effect.

I doubt the ones giving these orders would be comfortable with their own privacy being at risk.


"For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them."

Sounds just like gun control :)


If I were a betting man I would put good money on the bet that a bypass exists and is well known to the government.

What parts of the government is a different matter.

This is a perfect setup. Get all the bad guys to run out and buy iPhones (good for Apple) believing that they are safe from the US surveillance machine.

Then the appropriate agency can slurp up whatever it wants.


Maybe I am missing something here, but the Washington Post says "Federal prosecutors stated in a memo accompanying the order that the software would affect only the seized phone". What is so wrong with that, if they use it only on this phone? Or is that the weapon has been created and could be used?


Apple is saying that the federal prosecutors are wrong in this assertion.

Apple is saying that any solution that is applied to specifically this phone can trivially be generalized to all other iPhones (or, at least other iPhone 5cs). Further, unlocking this phone in response to this order establishes a precedent that this is okay. You are much better off legally if you fight the first request than if you fight the thousandth request.


" Or is that the weapon has been created and could be used?"

I'm pretty sure it's this. Once it's created it's only a matter of time until it's leaked and anyone can use it.


I don't see how this message is reassuring. Are they expecting customers to just take their word? Without Apple showing the world every bit of software that runs on their phones, these statements are, at best, meant to mislead users into thinking Apple is doing something on their behalf.


I wonder why no one pointed out that privacy boils down to trust:

That letter might be the truth or could be some kind of decoy. Maybe the backdoor will come and Apple knows that already and they try to limit the damage to their brand.

Like "we tried to resist having a backdoor installed, but we couldn't do it ultimately".


Am I wrong to think that this brute forcing can still be applied when the raw memory chip is taken off the iPhone? The wipe-all-data feature requires write access to the chip plus some intelligence and monitoring. These capabilities should be physically removable from the actual memory chip, right?


read section 'Hardware Security Features' here: https://www.apple.com/business/docs/iOS_Security_Guide.pdf


Ok, so: "The UID allows data to be cryptographically tied to a particular device. For example, the key hierarchy protecting the file system includes the UID, so if the memory chips are physically moved from one device to another, the files are inaccessible. The UID is not related to any other identifier on the device."

The Secure Enclave must still give up its UID under some circumstances? This still does not appear to be immune to hardware hacking.

Moreover, this UID can also be brute forced, IMO, when the memory chip and Secure Enclave are physically separated. Whatever is needed to decrypt the data must be brute-forceable, especially when the memory is separated from the wipe-all-data initiator, which does not seem impossible if you know the chip design well enough.


Unless there is a bug in their hardware implementation of AES-CCM or (shudder) some sort of crazy disclosure vulnerability in the APIs they provide, there is (presumably) no way to get at the UID. Even if you were to decap the chip and get at the UID physically, you still aren't any better off as it derives the actual encryption key on boot from the UID.

The Secure Enclave is essentially a hardware security module, in more general terms. The only thing that leaves its boundaries are the results of crypto operations, not the parameters that went into calculating them.
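
Roughly, the passcode is never used on its own; it's entangled with the device-unique key so the derivation can only happen on that chip. A loose sketch (PBKDF2 stands in for Apple's actual entanglement function, and the UID value here is obviously made up):

  import hashlib

  DEVICE_UID = bytes.fromhex("aa" * 32)   # fused into the silicon; software never reads it
  passcode = b"1234"

  # The derivation has to run on-device (it needs the UID) and is deliberately
  # slow, so every guess costs real time even with the retry limits removed.
  unlock_key = hashlib.pbkdf2_hmac("sha256", passcode, DEVICE_UID, 1_000_000, dklen=32)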


Thank you. So you mean to say that making an electron micrograph of this chip will not even reveal its secrets? If the data persists after removal of power, some physical structure contains the data. Somewhere a hash of the fingerprint/password needs to be stored, and somewhere the function that computes it into an AES key needs to be stored.

I'm going on and on about this because I see no way in which this problem does not boil down to brute-forcing the password the user puts in.

Actually, Apple admits this much! They can build a workaround! What stops a three-letter agency from building it?

There must at some point be a complex user-entered passphrase if you want to be safe. This can be a fingerprint, of course, but there is always the 4-digit passcode or short passphrase that is the weak point.

I could be completely wrong, so far I'm not convinced I am.


On page 11 of the security white paper, there is a diagram illustrating the key hierarchy.

I believe that 'Hardware Key' refers to the UID. If this is the case, then learning the UID + user passcode gives you the root keys, from which you can decrypt the remaining key hierarchy until you reach the decrypted files.


"The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed by dedicated AES engines implemented in silicon using the UID or GID as a key."

Re: brute forcing, these are AES 256-bit keys. Good luck.
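
"Good luck" in numbers:

  2 ** 256                      # ≈ 1.2e77 possible key values
  (2 ** 256) / 1e12 / 3.15e7    # ≈ 3.7e57 years at a trillion guesses per second

Which is why every realistic attack goes after the short, human-chosen passcode instead of the 256-bit key.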


Where is the reply button under your comment chillaxtian? Edit: hmm, now it suddenly appeared...

How about disabling the wipe signal by cutting some interconnects in the silicon?

Is the function by which the AES keys are calculated from the password really completely inaccessible?


If you are doubting the documents, then exactly what evidence will persuade you? Sounds like your mind is already made up.


Techie question: if Apple can compile a neutered version of iOS to bypass encryption, why can't a hacker (or US government nerd), at least in theory, reverse engineer iOS and patch it accordingly?

(guess answer: iOS needs to be signed. So what they are really asking of Apple is to sign a lobotomized iOS image...)
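
Right - the gatekeeper is the signature check in the boot chain. In spirit, something like this (Ed25519 via PyNaCl purely for illustration; Apple's actual scheme is different):

  from nacl.signing import SigningKey
  from nacl.exceptions import BadSignatureError

  apple_key = SigningKey.generate()       # in reality, locked away in Apple's HSMs
  firmware = b"...iOS image bytes..."
  signed_image = apple_key.sign(firmware)

  # What the boot ROM effectively does with its baked-in public key:
  try:
      apple_key.verify_key.verify(signed_image)   # raises if even one bit was patched
  except BadSignatureError:
      print("refuse to boot")

So a third party can patch the image all they like; without Apple's signing key the device simply won't run it.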


What's all this talk about pushing updates to locked phones? I have to get involved every time there's an OS update for any of my iDevices. That damn red dot on Settings.app just stares at me while I try to find a time I'd like to be without my device for half an hour.


A company with courage. Hard to believe when virtually no institution, government or corporate has it.


> People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes

I wonder if this is a grammar mistake, or if Apple actually considers the private conversations, notes, and photos to be theirs?


I think this is grammatically correct; Tim Cook is just identifying himself as part of the group that stores personal information on their iPhones.


The real security risk is the ability to update the phone's OS without authorized user consent that is at least as strong as the original protection the FBI is trying to break.

Right now it all hinges on Apple's private key, and that's a very thin wire to hang all this privacy off.


Many may hate Apple; however, it's undeniable that they're committed to user security.


Way to go Apple.

And Edward Snowden just tweeted this a few minutes ago in response to another tweet proposing Google back up Tim Cook: "This is the most important tech case in a decade. Silence means @google picked a side, but it's not the public's."


Since Apple is part of PRISM[0], the FBI can just ask the NSA.

[0] https://en.wikipedia.org/wiki/File:PRISM_Collection_Details....


No: data held on Apple servers (iCloud) can be provided with a warrant, but this is about data held on an iPhone but not saved to Apple servers. As a result, neither Apple nor the NSA can immediately provide it, and some means of hacking the iPhone is needed.



"In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Someone who believes in conspiracy theories would make a statement that "now it is official" :)


Being realistic, how many fewer iPhones will Apple sell if they remove the SE? How many people will not buy an iPhone if they are told that their info can be accessed with a judge's warrant? I'm guessing a 0.1% drop in sales.


Cook wrote that "this software ... would have the potential to unlock any iPhone in someone’s physical possession." (emphasis mine)

Is that true? What if it's locked with a secure 128-bit (e.g. 10-word diceware) passphrase?


I believe you are correct. Cook probably meant that it would unlock any iPhone with the standard 4 or 6 number passcode.


Wow. This is the first HN submission to exceed 5,000 points!

To honour Tim, and his advocacy for our industry, I'm going to spend the rest of my week developing privacy/security projects. I encourage everyone else to do likewise.


I wonder what the response of other manufacturers making Android phones will be.


Can they publish a copy of the FBI letter? Otherwise, Apple's description feels a bit circumstantial and opinionated. I feel like I could make a better judgment on this whole issue if the request were made public.



I am happy AAPL is taking this stance. But I can't help but believe that it has very little to do with liberty, and very much to do with the bottom line. Either way, I guess we should be grateful for small mercies.


What are the odds that Apple has been ordered to do this before, but every other time they were asked it was in a FISA court? That would mean that this is the first time they've been allowed to talk about it.


Is it possible for a human to just try all 10,000 passcode combinations? Assuming the 10-failure erasure is switched off -- a bad assumption, I know. Is there an additional slowdown after a lot of failed attempts?
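
Back-of-the-envelope, assuming the erase and the escalating delays really were off:

  10_000 * 5 / 3600   # ≈ 14 hours of nonstop typing at ~5 seconds per guess

So humanly possible, but with the standard escalating lockouts it stretches far longer, and the tenth failure can erase the phone if that setting is on.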


Heads up on something I just recently discovered - if your iPhone has Touch ID enabled, you can go into the Touch ID settings and selectively disable Touch ID for phone unlocking while keeping it for the App Store.


I'm sorry but "Smartphones, led by iPhone"? Bit presumptuous.


Apple's OSs are closed and therefore inherently insecure. When Apple caves, and they undoubtedly will, it will have the beneficial consequence of being a boon to open source communications software.


Google CEO Sundar Pichai has thrown in with Apple in a series of tweets explaining his position.

https://twitter.com/sundarpichai


Kudos to Apple for standing up to the US government and standing by its users.


The court order was posted on HN hours before this letter, and either Tim Cook has not read the order or he's lying about the backdoor. What the court ordered was the removal of the auto-wipe.


Question: is it possible to design a cryptographic system such that, whenever it is accessed by a third party (government), the access is made publicly visible in a log? Can blockchain technology help here?


Interesting, what if the security on the device was reduced purposely and in a manageable way, in exchange for one of the several steps or keys to decryption existing only in an online blockchain of some sort?


No, because I put the device in a faraday cage, with whatever proxies I need, crack it, then put it in a woodchipper. No one ever finds out.


The existence of such a system wouldn't make much difference, because no authority would want to use it to access the data. They would claim that the target's inability to detect a wire is crucial to the ongoing investigation.


But what if you need the blockchain to crack it?


I think there are two orthogonal questions:

* Does Apple pretend the FBI cannot access its devices?

* Can the FBI access its devices?

The only thing we learn here is the answer to the first question. We know nothing more for the second one.


What happened such that companies can now talk about these government requests? The most nefarious thing about these government orders citing terrorism is that the companies were forbidden to discuss them publicly.


"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE."

People hyperventilating that the tool could be used to crack other phones can relax, given the last clause in the quoted text (from the actual order).


The numerical passcode is likely his ATM PIN, or a code from his bank/PayPal or some such. I hope the government can simply subpoena his bank/PayPal, etc., and this will end at that.


"We have no sympathy for terrorists."

They felt the need to state that, huh?


I was surprised that they felt it necessary, but it's a pre-emptive counter to a pretty common argument - that smartphone makers are aiding and abetting terrorists by providing them encryption tools.


Well yes; look how this could play out on Main Street: "Apple supports terrorists by opposing Obama Admin's FBI investigation."


Is there any doubt that when the FBI brings up a law from the 1700's to justify breaking digital encryption in 2016 that they are completely making it up as they go along?


Sure. Laws don't have to be that specific, and they are typically better if they aren't. The law basically says courts can order anything in aid of upholding the law, and encryption is part of "anything."


The way I read this is that Tim Cook hasn't said it can't be done, only that it shouldn't be done. This leads me to suspect that Apple can decrypt your phone, and they know precisely how to do it, but doing so would disrupt their entire marketing campaign around safe and secure encryption.

I'm just a government relations guy, not a security person, so please forgive me, but I'm not sure where I fall on this. I want the FBI to be able to decrypt the San Bernardino attacker's phone. At the same time, I don't want the government to be able to decrypt my phone. This is one hell of a damned-if-you-do, damned-if-you-don't situation, and I'm really stuck.


Everyone in the U.S., please write to your Congressional representatives and also to the Presidential candidates you support. They need to know they can't get away with this.


Possible or not, the FBI seems to have formalized the issue using this opportunity. They are asking the questions they have been wanting to ask since the release of smartphones.


Didn't they try this with another (smaller, less publicized) case about 4 months ago?

https://www.eff.org/deeplinks/2015/10/apples-eula-gives-it-l...


Am I the only one buying a new iPad because of this announcement?


My guess is that it's likely that the FBI can access the data without Apple's help. Based on what we know, how do we distinguish between these two situations, and which seems more likely?

A) Apple has created unbreakable security. The FBI cannot access the data and needs Apple's help.

B) iPhone security, like all other security, is breakable. iPhones are a very high-value target (all data on all iPhones); therefore some national security organizations, probably many of them in many countries, have developed exploits. The FBI, following normal practice, does not want to reveal the exploits or capability and therefore must go through this charade.


I can't recall any previous instance of a mega-corporation opposing the tyrannical US government. I fully expect Apple to lose here, but it is a valiant and rare effort.


Apple's stand is a bunch of BS. The main issue should be the safety and protection of humanity. Terrorists are not humans; they are bent on destroying humanity.


I thought Apple already had backdoors. I feel relieved that my iPhone is not backdoored, and I'm also very happy for a company whose products I use daily.


So this was a work phone owned by his employer. Does that change things? Surprised they didn't have IT software installed already to monitor the device.


Are iPhone disks (and the files within them) and cloud content encrypted based on this single key that is stored in the Secure Enclave on the iPhone?


Wouldn't many countries like Russia and China stop allowing the sale of iPhones or at least their use by government officials if the FBI succeeds?


Actually, someone other than Apple is already able to do the things requested in the court warrant (brute force the passcode on a locked iPhone) - ih8sn0w has an iBoot exploit for the A5 chipset (same as the iPhone 5c), so he can probably boot an unsigned kernel and use already-published public tools to crack the said passcode. If some lone hacker can do it, don't be fooled for a minute that the NSA can't, or that the feds couldn't buy something similar from another hacker. This is Apple covering their ass with the press.


Good for them. Freedom comes with a price; sometimes that price is protecting the privacy of the worst of us in order to protect all of us.


HN is the first place I came to for discussion on this, and I just wanted to thank you all for keeping it civil, intelligent, and objective.


I have never been more proud to have worked for Apple. Tim isn't afraid to give the government the old double forks when it counts!


Does anyone have a decent architectural overview of iPhone (6) security? These enclaves etc. sound good, but the devil is in the details.



This is a great way to build public awareness for this issue. Hopefully this will allow more people to get involved in the fight.


Applying an update to break encryption would violate chain of custody and render the information obtained inadmissible in court.


The fact that Apple indicates that they would be able to produce such a software version is in itself a backdoor in the iPhone.


Correct. What this letter says is that if Apple wants to get around the wipe restriction because they want the data themselves, they can do it. This means that they will build the backdoor as soon as they are legally compelled to, but in the meantime, they can use it for a PR stunt.


In the future, once terrorists have TouchID iPhones, couldn't they just use the corpse's finger to unlock the phone?


Touch ID can't be used if

- The phone was turned off or rebooted

- It's been more than 48 hours since the last time the passcode was entered

- The user didn't set up a touch ID fingerprint


Remember, iPhones are available worldwide. If the US wants to play world police, then I want a vote in the US election.


The threat to humanity should trump all the garbage Apple and its lackeys are spewing out. Terrorists are not human.


As much as I would love to believe in Apple (and any other large tech company), a part of me still thinks that maybe they are working with the government in this letter. The FBI knows that the average US citizen does not want to be hacked. What is to stop the FBI from allowing Apple to say these things and put on a show publicly while simultaneously giving over the 'master key' anyway?


And what happens to the engineer tasked with writing this hack if he fails and ends up bricking the phone?


I'm a libertarian. But an Islamic terrorist's phone is just evidence; Apple must unlock it for the FBI.


It took me a bit, and I believe no one has summarized this very well yet.

FBI: "You've built a device that makes it nation-state-difficult to install custom software without DRM keys. We'd like you to assist us in deploying software signed with your keys."

Apple: "That feels way too much like asking a CA to sign a cert for you, so fuck off."

I'm honestly not sure which side I'm on here.
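One way to picture why the fight is over Apple's signing keys rather than the software itself: the device only runs OS images whose signature verifies under Apple's public key, so "install this for us" really means "sign this for us". A minimal sketch of that acceptance check, assuming an RSA key and using the third-party cryptography package (function names are made up; this is not Apple's actual boot code):

    # Hypothetical sketch of a signature-gated "will the device run this image?" check.
    # Not Apple's boot chain; just the general code-signing idea, assuming an RSA key.
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.exceptions import InvalidSignature

    def device_accepts(firmware_image: bytes, signature: bytes, vendor_pubkey_pem: bytes) -> bool:
        public_key = serialization.load_pem_public_key(vendor_pubkey_pem)
        try:
            # Only images signed with the vendor's private key verify here,
            # which is why possession of the signing key is the whole fight.
            public_key.verify(signature, firmware_image, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False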


So only Apple has the ability to do this...not the US government. So we trust Apple but not USG?


I can't read it from the letter - are they going to refuse to cooperate? Can they do that?


They're not 'refusing to cooperate' in the sense of just ignoring the court or something. They'll file a motion to have the order lifted, with their reasons. If that's refused, they'll presumably appeal the refusal to a higher court, and so on, to the max extent they can. That's not being obstructionist, it's their legal right.


And the government wonders why people from tech don't want to work for it.


All of a sudden I'm starting to think my PiPhone is looking pretty good.


Can't they dump the drive's data to protect it from being erased?


Hmmm. If this pans out in Apple's favor, I may finally buy an iPhone.


You don't own Apple hardware, so you can't protect your device.


I think Apple is trying to prove that they don't give any user data to agencies. A PR stunt. But they got it fundamentally wrong, as this was actually a case of national security following an attack that happened. So it's a huge PR stunt, but an own goal.


Sounds like the backdoor already exists, but only Apple knows how to use it. Same as if Apple knew a master password for this phone but refused to give it out. They are saying they don't want to hand it over because once the FBI has it, they are free to use it anywhere. Pretty strange post from Apple.

Probably they'll try to fight this request by arguing that the government is actually asking them to effectively remove security from all the phones (of this model, at least). They would be happy to help break this one phone as long as it doesn't affect any other phone.

In that case, Apple should just break the phone and give it back to the FBI after removing the backdoor.


Plot twist.

This is actually the result of a barter: the government gets some low-level TOP-SECRET access in trade for this easy-access code, and Apple gets to go public to keep the populace calm and pretend they are fighting this thing.


This is the FBI going after a Parallel Construction path. They already have all the information from the NSA bag o'tricks, but none of those can be used in court. But an unlocked phone unlocks the legal obstacles.


Doesn't the phone belong to San Bernardino County?


Mods: can you please update title to add some context?


Well, no one is protected from thermo-rectal cryptanalysis. The only difference is that the government guys want to keep it hidden from the target.


Seems that this forum is 'moderated'. Views that don't kiss up to the self-proclaimed savior of freedom are deleted.


To all in-love-with-Apple downvoters, please read this sound analysis by Schneier of the same type of situation that RIM (BlackBerry) was met with: https://www.schneier.com/blog/archives/2010/08/uae_to_ban_bl...

/quote: "RIM's carefully worded statements about BlackBerry security are designed to make their customers feel better, while giving the company ample room to screw them." /endquote

I have lost enough points on this thread to simply double down on this issue.

This is not a good sign at all. While Google can't compete with Apple on the principle of "not spying on their users", all Apple has to do is publicize that principle and then ask for forgiveness from its users later.


"This is not a good sign" != "This is a bad sign".

I've re-read your comment several times, and I don't get how it's novel or how it applies here. Of course it could be true in this case, as it could be true in any decision any company makes. But I don't see how it's insightful or proves or suggests that Apple is doing the same.

You're not some kind of martyr for the anti-Apple cause here. I think we all know that Apple could be saying one thing and doing another. That doesn't mean that they are, and it doesn't mean that this open letter is proof that they are.


As a software developer I'm always looking for the real bug. Weapons kill. Not iPhones.


Cry, US Government!


Dear Tim Cook,

Thank you!


A friend of mine at Apple reported multiple black vehicles (Lincoln Town Cars and Escalades), at least one with MD license plates, at the Apple Executive Briefing Center this morning between 11AM and noon. Occupants had earpieces and sunglasses and were accompanied by a CHP (California Highway Patrol) cruiser and three motorcycle escorts. I suppose it's possible this was a quick (less than 1 hour) VIP stop, but given Tim's message last night, as well as the reaction of folks on campus who were bandying about comments like "I don't want to work on this because I don't want to be deposed", the impression certainly was that it was not a friendly visit. Given Tim's very public push-back, I'd think delivery of an NSL with accompanying intimidation is at least possible. I submitted this to HN and updated in real-time. There's a bit more discussion here:

https://news.ycombinator.com/item?id=11120365


Save people like me a trip to the Google: NSL = A national security letter (NSL) is an administrative subpoena issued by the United States federal government to gather information for national security purposes. NSLs do not require prior approval from a judge.


Not only do NSLs not require approval from a judge, they also include a very intimidating gag order that prevents you from discussing the issue with anyone else (including even your own family).

One of the big problems with NSLs is that you can't let anyone know that you've received or acted on one, so there's very little accountability.

Hence the recent trend of some companies including a warrant canary on their websites, under the assumption that an NSL can't prevent you from _not_ saying something (e.g. deleting the canary).


I think deleting something would be considered an active action. The trick with a canary is that you're choosing not to do something, so it can't compel you to act (as compared to, for example, telling you you can't delete the canary).

So for it to work, you need to issue a statement every month that says you haven't been issued an NSL, and then simply not issue a statement the month you finally are issued one. That would then require the government to actually compel speech (compel you to post a new notice saying you haven't received an NSL).

Of course, the above should make it blatantly obvious how absolutely absurd the blanket gag order on NSLs are.


It's not necessarily deleting anything. Reddit states in its transparency reports that it has not received any NSLs. The idea is that if they ever received one they would simply omit that clause from the next report (not remove it from prior ones).
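To make the mechanism concrete, here's a minimal reader-side sketch of monitoring such a canary (the URL, wording, and freshness window are invented for illustration; real canaries vary):

    # Hypothetical sketch of checking a published warrant canary.
    # Assumes the company posts a dated statement like:
    #   "As of 2016-02-01 we have received no national security letters."
    # The URL, wording, and 45-day freshness window are made up.
    from datetime import date, timedelta
    import re
    import urllib.request

    CANARY_URL = "https://example.com/canary.txt"
    MAX_AGE = timedelta(days=45)  # some slack beyond a monthly cadence

    def canary_intact() -> bool:
        text = urllib.request.urlopen(CANARY_URL).read().decode("utf-8")
        match = re.search(
            r"As of (\d{4}-\d{2}-\d{2}) we have received no national security letters",
            text)
        if not match:
            return False  # the clause was silently dropped: treat the canary as tripped
        issued = date.fromisoformat(match.group(1))
        return date.today() - issued <= MAX_AGE  # a stale statement also counts as tripped

    if __name__ == "__main__":
        print("canary intact" if canary_intact() else "canary missing or stale")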


Interestingly, if you accept that code can be copyrighted, and that only things that are expressions (speech) are eligible for copyright ("A copyrighted work must be an original work of authorship which is fixed in a tangible medium of expression"), then code == speech.

So, by compelling Apple to code something that doesn't exist, the government would indeed actually be compelling speech.


Court processes involve compelled speech all the time. Heck, compelling witness testimony, which is one of the most well-established parts of the court process, is nothing but compelling speech.

So I'm not sure what the value of a clever argument that compelling Apple to comply with the order here is "compelling speech" is supposed to be (likewise for the upthread one about NSL canaries).


Fair points that I concede. I'll note I also forgot the nuance mentioned by morsch (that a canary is compelling a lie, which is potentially different).

On a separate note though, I've always thought it would be interesting to see a member of Congress be issued a NSL and then have them read it on the floor of the House/Senate (since they have parliamentary immunity for anything they say on the floor of the House or Senate).


Compelling speech is a thing that happens under certain circumstances. E.g. subpoenas, witnesses in court. Though those are clearly different from a canary, not least because producing such a canary would not just be compelling speech, but compelling a lie.


That's not how canaries work. You don't delete them, you fail to update them.


IANAL, but how is it a subpoena if it doesn't originate from the judiciary?


Nicholas Merrill famously fought an 11-year legal battle and finally won the right to reveal all aspects of a National Security Letter (NSL) served to him. Almost all such letters are accompanied by a complete gag order.

https://www.calyxinstitute.org/news/federal-court-invalidate...

EDITED / CORRECTIONS - Thanks commenters - The battle was won by Nicholas Merrill not Ladar Levison of Lavabit fame as I originally posted.)


It was Nicholas Merrill from a little ISP called Calyx Internet Access that famously challenged the NSL process.

LavaBit's Lamar Levinson is assumed to be under a gag order from some request he was given by the US government, which he declined by folding his company and claiming that he could not comply going forward if he was no longer the middleman for some form of communications.


As a further correction, the Lavabit founder's name is spelled Ladar Levison.

Rather than making assumptions, you can read about the specific kinds of legal process involved in the Lavabit case at

https://en.wikipedia.org/wiki/Lavabit

You can also read the Fourth Circuit decision on his appeal, among other things.


I guess you can argue semantics, but it's an order accompanied by a credible threat of violence if the order is not obeyed.


I know it requires Tim Cook to be willing to martyr himself, but do we really see Obama whisking the CEO of Apple Computer off to Guantanamo or some supermax prison?

I'd maybe call the bluff, and take my political stand.


Cook wouldn't have to go that far. The court order specified Apple, not Tim Cook personally. He can simply resign instead of following the court order. For that matter, so can the engineers that Apple would need to work on this project.


That's interesting.

IANAL but seems like you're missing something.

Apple can simply let Employee B take Employee A's place after A quits. When the authorities come for B, B can quit, and Apple can re-hire A.

Apple never has to comply.


Employee A doesn't have to actually quit, they just have to credibly threaten to. Say, by signing an open letter that says that they'd quit before helping backdoor the iPhone. Apple can then claim that they cannot bring together a team that is willing and able to backdoor that iPhone.

When you break down the process of having a private company comply with an order to create a particular piece of software, there's many failure points.

The counter from the governmental side is "we will give your company massive fines until and unless your company complies".

As a note, the actual text of the court order (https://www.documentcloud.org/documents/2714001-SB-Shooter-O...) explicitly says that Apple can appeal it on grounds that it is an unreasonable request. Uncooperative engineers can make it an unreasonable request, and have the legal right to be as uncooperative as they want to be in this case. And, they're on the same side as the CEO of Apple ethically, so it isn't career suicide.


> The counter from the governmental side is "we will give your company massive fines until and unless your company complies".

At which point it becomes worth it for Apple to pay an engineer to do the job. I doubt it would take much of a bonus to get someone to do it.


> > The counter from the governmental side is "we will give your company massive fines until and unless your company complies".

> At which point it becomes worth it for Apple to pay an engineer to do the job. I doubt it would take much of a bonus to get someone to do it.

What happens if Apple says they aren't paying these unjust fines? Theoretically, court order, law, or what-have-you, Apple can just flat-out refuse to participate (and hopefully other big tech companies would follow suit).

Sure, the gov't can make arrests, issue threats, and seize assets -- but in the end, the gov't still doesn't get what it wants (and it does get a ton of very, very bad PR in the process). At some point, the gov't would have to stop -- destroying the world's most valuable company, and one of America's sweetheart companies, all over this... wouldn't play out well.


> What happens if Apple says they aren't paying these unjust fines?

Then they'll be subject to additional penalties, seizure of property, etc., and quite possibly shareholder lawsuits stemming from the decision to incur those losses.

> Sure the gov't can make arrests, threats, seize assets -- but in the end, the gov't still don't get what they want

Maybe, given the recent discussion of mandatory limits on encrypted communication services without up-front backdoors, what the government wants is a clear demonstration that the operation of those services interferes with evidence and intelligence gathering in terrorism cases to build the case for new laws restricting the operations of such services.


Yeah, I mean it seems like the other commenter is implying that, at the end of the day, the government can't actually do anything to a big company. Clearly that's not true. The US government has completely broken up companies, which is a heck of a lot more intrusive than forcing them to decrypt phones.


> implying that, at the end of the day, the government can't actually do anything to a big company

I wasn't trying to imply this at all - of course the government can destroy companies.

What I was implying is that, given the past few years of heightened public awareness of domestic government programs and efforts, perhaps this time public pressure would be exerted on the government to lay off the issue. Apple is one of the most loved companies in the country, people would be very interested in knowing why it's suddenly being torn down.


I think he's arguing that government's power to destroy big companies is only as strong as the popular distaste for those companies.

A company like Apple, which is practically an icon of "everything America still gets right," would be a very politically dangerous target to go after.


It would be quite unfair for anyone to expect martyrdom of Tim Cook; however, the action taken could send a very powerful message. Apple is one of the most loved companies in the US, and if Cook and company put their foot down firmly, surely the public struggle would generate immense discussion.

People would want to know why their favorite company was drawing a hard line, hopefully leading to a more educated debate.

We couldn't possibly expect this to take place as it would imply huge fallout for Apple. The only real course of action is to keep up the public debate in a loud way.


If an order is directed to Apple, and Apple fails to comply, the court can order sanctions against Apple for contempt.

The authorities don't have to "come for" any person (they might follow up with orders directed at particular persons, in which case those persons would be at risk of personal sanctions, as well.)


I presume it would just be significant fines, or perhaps some FCC or FTC regulations that would harm Apple's business.



This is all very standard for a VIP visit and the president of Indonesia was at Facebook and Twitter today. So there's probably a simpler, less exciting explanation here.


FYI the DOJ/FBI does not utilize Lincoln Town Cars or Escalades for ops. You'll want to keep an eye out for Chevy Impala, Ford Crown Victoria, Chevy Suburban, and in rarer cases Dodge Charger (no spoiler) and GMC Denali. Typically they're darker colors, but the easiest indicator is a massive amount of tint on all windows. Remember, official vehicles will have the default state license plates without any sort of bumper stickers or license plate holders. In Maryland, unmarked LE vehicles typically use an older license plate instead of the newer design.


Many Apple employees are not allowed to refuse. I suggest that any employee, especially those on any sort of work visa or with any connection to the military/intel community, seek legal advice prior to signing or saying anything. There are big forces involved here that will not hesitate to squash/fire/blacklist little people.


Weirdly enough, I saw a blacked out SUV with Virginia plates and a CHP SUV in front of our office at 2nd and Harrison that looked extremely out of place. Among other things, it's a no stop area on our block and there isn't really much here to begin with.

It hung out for a few minutes and then made off taking a left on Harrison (101 South, FWIW). This was around 12:30p.


> at least one having MD License Plates

Don't vehicles used by federal agencies typically bear U.S. Government license plates?


According to @HillRat in the other thread : "Based on open sources, NSA has a fleet of about 100 vehicles at most in CONUS, so they would be more likely to fly vehicles cross-country if DIRNSA needed to use a fleet vehicle. Having said that, NSA doesn't just arbitrarily get involved in domestic law enforcement cases, even ones involving crypto, so let's wait to see if there's more evidence than a random HN post to support this allegation." Good information and an entirely fair statement. I'm hoping someone from Apple can chime in directly.


The majority of federal government travel is done with commercially available rental cars.

The likelihood of the federal government driving a GSA-plate vehicle from Maryland to California to meet with Apple is about zero.


Commercially available rental vehicles seem an extraordinarily bad idea for the NSA to drive to and from sensitive meetings in.

What? They just drive silently to and from the meeting or listen to the radio?


"I don't want to be deposed"

I'm not familiar with that phrase. What does it mean in this context?


A deposition in the American judicial system is a legal proceeding in which a witness provides testimony to the courts. The witness may be examined by members of either side's legal team. This is a routine proceeding in civil cases. [0]

Interestingly, people usually don't want to be deposed when they have something to hide.

[0] http://litigation.findlaw.com/filing-a-lawsuit/what-is-a-dep...


> Interestingly, people usually don't want to be deposed when they have something to hide.

Or because the process is a huge hassle to deal with, it sucks up days and days of your time, and it's very easy to mess something up.


"'I've Got Nothing to Hide' and Other Misunderstandings of Privacy" [0] is a really good paper on why the argument of "only bad people have secrets" is such a fallacy.

[0] http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=1...


> Interestingly, people usually don't want to be deposed when they have something to hide.

You don't need to have done anything wrong to fear a deposition. Go to youtube and watch any number of "do not talk to police" lectures. An innocent person can end up trapping themselves in a lie, or bow to pressure. Or you may say something that you think is harmless but in fact damns you. Depositions should be avoided wherever possible.


> Interestingly, people usually don't want to be deposed when they have something to hide.

Interestingly, people usually don't want to be deposed, period.

Sure, it's a routine proceeding in civil cases; guess what, people generally don't want to be involved in civil cases -- as plaintiffs, defendants, or called witnesses -- either.


> Interestingly, people usually don't want to be deposed when they have something to hide.

That's a pretty... er, dubious claim at best?


I mean, it's an accurate statement. It's just also true that people usually don't want to be deposed even if they have nothing to hide. The "when they have something to hide" part is just superfluous.


It's not just superfluous, it changes the meaning of the statement. It's like saying "I like women when they aren't driving". You can't follow that up with, "oh, but I also like them when they are driving, so it's okay to say that".


No, people don't want to be deposed because they want to do their job and mind their own business, instead of get pulled in to a national spectacle, be inconvenienced with new travel plans that require child care arrangements or other disruptions, and potentially face stressful scrutiny on record.


> Interestingly, people usually don't want to be deposed when they have something to hide.

You know who really talks a lot about their rights? Terrorists.


This is both patently false and the single most deeply un-American mindset you can have.

Your statement is so deeply naive and insulting to the patriots who fought for the rights you enjoy today.

Those who made major strides for:

1) Black Civil Rights

2) Gay Marriage

3) Women's Rights

4) Anyone who fought government intimidation based on speech

5) Democratic Socialists

6) Abortion Activists

7) Environmentalists

8) Your workers rights (i.e. your weekends and pay)

9) Many more... who made sure you could help move society forward and make it better for everyone.

Edit: I assume the downvotes are coming from those who would rather we lock up the above than have a free American society.


I didn't downvote you, but you're probably being downvoted because jonathankoren was sarcastically mocking leesalminen's "the innocent have nothing to hide" attitude.


As sibling says, you're being downvoted because you've failed to recognize sarcasm.


Yes, the Founding Fathers of the USA probably were considered terrorists, and they talked an awful lot about freedom.


Please tell me you said 'Interestingly, people usually don't want to be deposed when they have something to hide.' off-the-cuff.

I would associate this view/tone with a corrupt, small-town sheriff in a movie, rather than a real, human citizen who shares the values of our society, which include due process, the rights of the accused, and the presumption of innocence.



Depositions aren't usually that fun regardless of what you have to hide. I'd probably want to avoid one if I had the chance.


According to my source at Apple HQ a TV van is on the way. We should see some news coverage soon.


[flagged]


Nothing. Bringing a bunch of special agents along with you to a meeting is intimidating, though.

I suppose in their defense, they may be the particular agents working on the San Bernardino case, who arrived to explain exactly why they need the access or whatever.


[flagged]


The way powerful people often maintain control of a situation is through contrivances such as unusual dress, unusual ways of speaking, unusual rituals or by having a large entourage.

If you analyse each element closely it's clear that they're silly. Why do they need earpieces in when visiting Apple? Why must all the cars match? Why must they dress in the same way? Why do they need to bring all those people, what are they all doing?

But the point is, when faced with something new and strange, people often pause and withdraw. This allows you to dominate the situation and prevent dissent.


Earpieces are probably worn routinely, just as many other people you may see walking around.

Large organizations typically procure fleets of vehicles that are all the same model. This reduces development and maintenance costs.

Many institutions, particularly law enforcement, have strict dress regulations and many distribute their own uniforms appropriate to the particular position.


Yes, and those things are done for the exact reasons the parent comment mentioned. Even comedies like Brooklyn Nine Nine recognize this fact (one episode had a bit about a detective refusing to wear a tie). They wouldn't bother with uniforms if they had no effect.


It's intimidating for the same reason that having the KGB show up at a Russian company's meeting would be intimidating.


Special agents who wear earpieces are people who have been trained to respond swiftly and decisively when violence is called for. That's what the earpiece is for, to coordinate tactical action if necessary. Having a swarm of them show up at your place of business is obviously intimidating even if they're not actually planning or expecting violence.


You would expect the higher level officers and directors to be in plain clothes, normally.


under what kind of pressure would Tim write this public letter?


There needs to be a distinction between state security and "retail" security. State security agencies have the legal framework to compel Apple to do anything and not even talk about it. What I call "retail" security is any act by any law enforcement agency in the country. Their requests are bound to come in large numbers and for all kinds of things. On top of that, these requests, apparently, are not yet covered by a legal framework. Hence the need to stretch an old law to try to make Apple comply.

What's at stake for Apple is not only their principles but also one of their marketing pillars: "you, the user, can trust us with your data/privacy." By asking Apple to give that up, and quietly, you are actually asking them to undermine their business model. Shareholders will not appreciate that if they don't get a chance to hear about it first. The Apple brand would lose some of its value, and it would be reflected in the AAPL share price.

My point is that the whole thing needs to have legal backup. And Apple is asking for this exact thing: give me a law to use. And not something from the 1700's.


If it is possible to build the requested OS, then it can be said that the iPhone already has a backdoor.

If the device were truly locked down, there would be no aftermarket solution to unlock it.

My understanding is that Apple was asked to supply software that would prevent their software from destroying evidence on a particular device. They should comply with this order, especially given the device in question.


That's true for the iPhone 5 and earlier, but for the iPhone 5S and later Apple actually made it impossible (see the Secure Enclave). But it's not really about that; it's about the legal implications - it would set a precedent allowing the government to compel basically any company to provide keys to decrypt information, which is a huge blow to privacy.


It's hard for me to have respect for an organization that was built by J. Edgar Hoover, a person who did not respect the law or Americans' rights.

The philosophy of corruption and oppression still echoes throughout the FBI. Even today, there are FBI agents that work for private interests. You can't reform a mafia, you must abolish it and start over.


Cook says any iOS device could be breached if this software were created. But other articles have led me to believe that any iOS model with Touch ID is immune due to the Secure Enclave being in play even for non-touch passcode access. Is this wrong?


Tim Cook admits iOS is already back-doored in the most weaselly worded message I've ever seen.


[flagged]


Please don't post unsubstantive comments.

We detached this comment from https://news.ycombinator.com/item?id=11116803 and marked it off-topic.


Gotta give it to Apple, they sure know how to pull off a PR stunt.


Well, they seem to be saying that the approach they describe, making a modified OS, would actually work to circumvent encryption on a pre-existing device. That means they already know the device is not really secure.

They aren't talking about putting a back door into systems to be used in the future; they are saying it's feasible to place a backdoor on a device already out there and then use that backdoor to access the device. That means the device is not actually secure.


What I'm reading is that Apple can remotely install an update that disables encryption. They don't want to do it.

But the fact that they have the capability is a bit scary.


You're not reading that, you're implying it. It doesn't say anything at all about remote installs, and it's not even suggested.


It doesn't say anything about remote install - in fact it says "physical possession" several times.


Come on. It could be an iOS update that waits until you enter the code or use Touch ID - and then sends the keys or decrypted data somewhere.


It could be, but why would the FBI want that?


Please read carefully. They don't have that capability. The FBI wants them to create it.


if they can create it, they have the capability.


They never say they can create it. They say the FBI wants them to create it, which may or may not be possible and is not addressed in this letter.


You've noticed auto-updates, yes?


it's a slippery slope.


under what kind of pressure would Tim decide to write this public letter?


Apple should be clearer that this is the 5C and not the latest version.


Just unlock the freaking phone for them...


Instead of the FBI paying Apple engineers to hack a phone, why don't they ask their kids!? It would probably save millions of dollars.


What's on the phone that isn't already on at least one cloud server, on NSA spy servers, and in telco records? That's the real question that everyone is carefully avoiding.

I mean, you could break into my Android phone at enormous effort to use my phone to access my Gmail app, but isn't it easier to just ask Google? And I'm sure the telco and the NSA are already logging everything anyway.

You could break into my phone to use my Facebook app to look at my uploaded pictures, but isn't it a million times easier to just contact Facebook to access my Facebook account?

In this new era of dumb terminals, it's like an FBI agent demanding access to the terminal settings screen of a vintage VT102 in order to track terrorists or whatever. Or a demand to know my modem init string. It demonstrates a fundamental lack of understanding of the entire ecosystem from top to bottom.

The purpose of all this drama is to avoid discussion of the insecurity of cloud services.

It might be that Apple people use the cloud a lot less than us Android people. I'd be interested and surprised to learn that.


With the due legal process the police can search property, safety deposit boxes, bank accounts, vehicles, etc. etc. Why should a smartphone be any different just because Apple says it is ?

As much as I value privacy I really don't agree with Apple's stance here - if due legal process has been followed, why shouldn't they be able to read the contents of an iPhone ?

And yes I get that third party encryption can be used, which isn't owned by Apple and that there's little the authorities could do about it - but that's not the case at hand here.


The major difference is that a warrant to access a safety deposit box allows the keys for that specific safety deposit box and no other. What the FBI is asking for is the equivalent of asking for a master key to all the safety deposit boxes to access just the one box. Given what was revealed in the summer of 2013 by Snowden, I think we'd all agree that the FBI and other state agencies (not just American agencies) will use the software as a backdoor to access whichever iPhone they choose. Let's not be naive.


Well, the FBI would have to have the iPhone in their possession to unlock it, I presume. So that's one level of security - I don't think the USA has become a place where property can just be confiscated without reason (I hope I am right here). If Apple were custodians of the unlock process, then only once due legal process had been followed would an iPhone be unlocked, i.e. Apple would own the unlocking mechanism. Maybe in the CEO's safe...


> I don't think the USA has become a place where property can just be confiscated without reason

Civil forfeiture has been a problem for a long time.



Does Apple really have to create a "master" key, though? Couldn't Apple write the backdoor so that it only activates on the iPhone in question? Even if it were something as simple as "if (secure_id == terrorist_phone_id) [accept any pin]", it's not like the FBI could remove the condition without invalidating the signature. If they could, they wouldn't need Apple's help to begin with.
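To flesh out that pseudocode a little, here's a sketch of the commenter's idea of a device-locked build (all identifiers and the policy split are invented; this is not how iOS is actually structured):

    # Hypothetical sketch of a signed build whose weakened passcode policy
    # only activates on one named device. All names here are made up.
    TARGET_DEVICE_ID = "0000FAKE1234ABCD"  # the single phone named in the order

    def passcode_policy(device_unique_id: str) -> str:
        if device_unique_id == TARGET_DEVICE_ID:
            # What the FBI asked for: no escalating delays, no wipe after 10
            # failed tries, and electronic passcode entry so brute force works.
            return "weakened"
        # Every other device keeps the standard protections.
        return "standard"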


This is not the government asking to search a single vehicle or safebox, to take your examples. This is the government asking that every safebox or vehicle in the world be made instantly unlockable by design.


Exactly. I think OP is a bit fast and loose on the definition of area of interest.

It's not that it's bad in this case. It's that it's bad if it applies as a built-in backdoor for whenever they feel like using it.


Am I missing something, or aren't safe-deposit boxes and vehicles already made "unlockable by design"?

I am positive banks can open those things, and vehicles are inherently "openable", so I do not quite get the point here.


> With the due legal process the police can search property, safety deposit boxes, bank accounts, vehicles, etc. etc. Why should a smartphone be any different just because Apple says it is ?

Because the "master key" alluded to in the letter is ethereal and can be duplicated (as opposed to handed over). This means:

- Since the key can be duplicated, there is no serious way to ensure that only the police (or any other legally entitled organ) can do the search. Anyone who can get it will have police-level access to anything and will be able to offer police-level access to anyone. The "anyone" in question can be a former policeman who hates life, some script kiddie, the Chinese and so on -- and none of these people are likely to care much about the "due legal process".

- The police can trivially search it without the owner's knowledge and without leaving evidence. The high costs in time and money are a reasonable deterrent for searching property, vehicles etc. without the due legal process. For an electronic device, that cost is practically zero.

Regarding the first point -- for what it's worth, a while ago, army regulations here required that all doors have a physical lock and key, even if they also had an access code, for precisely this reason. The access code or the card swiping were used to log access (i.e. everyone had their own card, access codes could be logged so that you at least knew when someone was entering, etc.) but when a door was supposed to be locked for good (e.g. labs not in use during the night), it was locked with real keys and sealed with old-fashioned wax seals. The rationale was that breaking the lock required quite a bit of time (and maybe even some door banging), increasing the chances that someone who tried to break in would be discovered, and physical evidence of a break-in was fairly hard to erase, as opposed to a purely electronic break-in which was quick to do (just enter the code). I don't know if this is true anymore, nor how common it was outside this part of the world, but it makes some sense.


"the police can search property, safety deposit boxes, bank accounts, vehicles, etc." - that they can, doesn't mean it is right. If you read the entire letter, you'll see that creating the backdoor means tomorrow the Chinese government can see the nude pics on your phone (with 'due legal process', of course).


The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.


The issue they're talking about is that by creating a back door it's not limited to a single iPhone. Once there's a way in then anyone with that knowledge can use it on any iPhone to access encrypted data.

That seems a lot different from getting a search warrant and having the right to go through your belongings.


I don't think that the police can compel you to not make a very good safety deposit box, just to compel you to open it if you have the capabilities.

I agree that due process should compel Apple to unlock if they have the capabilities to. But no subpoena can beat math, right?

EDIT: I just realised that what the FBI is asking for is a backdoored version of iOS. To me the compromise seems to be writing the backdoored version but leaving it in Apple's hands (so Apple could send the FBI the data but not send them the OS). The only problem with that, of course, is that such a backdoored version could then be demanded by a judge.

Honestly, it seems like the judge could force Apple to hand over the OS update signing keys to the FBI...


> Why should a smartphone be any different just because Apple says it is ?

Because it is possible to create a smartphone that is impossible to break into. For the other things, it is impossible to create one that cannot be broken into.


Why do you assume that a "due legal process" will be followed every time, in every country?

Or maybe you think it's OK to do it just this once?

Sort of the "just the tip" mentality here?

The problem is that once such a capability is added to the OS, there is no going back. And it can then be used with or without your wonderful US due legal process, potentially by criminals and definitely by governments in countries where human and civil rights are a joke.


[In walk the drones]

"Today we celebrate the first glorious anniversary of the Information Purification Directives.

[Apple's hammer-thrower enters, pursued by storm troopers.]

We have created for the first time in all history a garden of pure ideology, where each worker may bloom, secure from the pests of any contradictory true thoughts.

Our Unification of Thoughts is more powerful a weapon than any fleet or army on earth.

We are one people, with one will, one resolve, one cause.

Our enemies shall talk themselves to death and we will bury them with their own confusion.

[Hammer is thrown at the screen]

We shall prevail!

[Boom!]

On January 24th Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'"

------------------------------------------------------------

Apple Superbowl AD "1984"

Transcription courtesy of George Gollin, 1997

Edit: Removed the link to the video. My goal wasn't to draw traffic anywhere; it was just to point out that some of the Big Brother sentences in an ad aired 30 years ago still have strong resonance today.

"Our enemies shall talk themselves to death" Hum... just read yesterday that NSA is believed to use machine learning over cell big-data to determine drone target...


Not getting an iPhone, even secured - Check!

I bet hardware vendors are just salivating at the prospect of having to produce thousands of iPhone-cracking docking stations.


> The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

This is just plain awful; they admit to helping the FBI. How can we trust them?


This is quite unlike Apple. Is this the same company that insists on keeping its source proprietary and is always against FOSS? The idea that you care about your users' privacy yet still keep control over them by not giving them the freedom to modify the source code is not something I buy.


You're talking nonsense. Protecting users' privacy and keeping source code closed aren't mutually exclusive.


Maybe not, but they are strongly correlated. Strongly enough to make an aggressive assertion like "You're talking nonsense" unwarranted at best.


Yeah. Loving your spouse and hiding things from them are also not mutually exclusive. Yet you hardly find people who do both. Not to mention, this situation would never have arisen if the iPhone were open-sourced, since all activity would have been monitored by the community.


You have way too much faith in the idea that open source projects are free of backdoors/exploits and that the community can prevent the creation of those. Aside from that, there is currently no platform where 100% of the code is open source.


Of course we can't. But we are in a vastly better position to positively affect these things if we do have the source.

It is immaterial whether or not any platform is currently 100% open source. rms_returns is right in noting the discrepancy between noble Apple sticking it to the man and walled-garden Apple sticking it to the user.



yes




