Hacker News
Microsoft, Google, Facebook Back Apple in Blocked Phone Case (bloomberg.com)
888 points by sbuk on Feb 25, 2016 | 240 comments



From recent polls, most people think this debate is about:

"Should Apple be forced to give the terrorist's phone data to the FBI?"

When what it's really about is:

"Should the FBI be able to compel technology companies to use their resources against their own and their customers' right to security?"

For the public to be properly informed about what's really at stake, this case needs to be framed as "Technology Industry vs. FBI", not the narrative the US Govt is trying to set: "Apple protecting Terrorism". With this announcement, the list now stands at:

  - Apple
  - Twitter
  - Yahoo
  - Facebook / WhatsApp
  - Microsoft
  - Google (Sundar lukewarm)
Big technology companies with large user bases that are notably missing:

  - Amazon
  - LinkedIn


What about other large semi-tech companies? Starting with Tesla, but also GE, PayPal, Asus, Linksys, Cisco, Sun, Bay Networks, Coca-Cola, IBM, FedEx, Starbucks, Procter & Gamble and all HIPAA-covered medical institutions, schools and universities, McDonald's, Disney, Pixar, Amex, Visa, MasterCard, Costco, Caterpillar, Target, Nike, UPS, USPS, Exxon, Pepsi, Samsung, AT&T, T-Mobile, Verizon, Sprint, Oracle, General Mills, Wells Fargo, investment banking companies, etc.

They all have a stake in tech, and none of them would want their tech backed by a system that anyone can get into once one person leaks the key. Think of the DMV, the police, all those poorly secured systems: they're treated like private address space when they're effectively publicly routable. Just insert your special gummy-bear fingerprint copy, and bam, you're in.

Get all those companies behind this, and people will think the FBI are crooks and thieves.


I wonder if they could make a case that it violates HIPAA, since your heartbeat/activity tracker info is on the device? Kind of a weak argument, but HIPAA is a pretty contentious subject.


I might be wrong, but my understanding is that HIPAA privacy rules only apply to healthcare providers like hospitals and their employees.

https://en.wikipedia.org/wiki/Health_Insurance_Portability_a...


This is correct. We ran a website for diabetics where users volunteered health data and HIPAA didn't apply to us. If it coordinated data exchange with doctors directly then it would be different.


What about doctors using an iPhone to check a patient's files? Would Apple have to make a secure, HIPAA-compliant version for hospital use? What about government use? I bet all those FBI agents are talking to each other on iPhones...


> I wonder if they could make a case that it violates the HIPAA, since your heartbeat/activity tracker info is on the device?

HIPAA privacy protections only apply to data held by HIPAA "covered entities", which are mostly insurers, health care providers, and their business associates, and they restrict disclosures by those covered entities.


AT&T is basically a full voluntary partner in the surveillance state.


You're right.



updated, thx!



I think the video on that page would be far more effective and helpful if it didn't use "encryption" so many times, and instead tried to explain to people what encryption is. As it stands, it just repeatedly uses the term to tell people when, where, or how they use it without realizing. I think the former point is more critical, and the latter less so.

As it stands, meeting encryption feels more like spotting Waldo than actually meeting encryption.

(PS: Hi, James!)


When what it's really about is:

Are remote "software updates" to a buyer's phone, under the sole control of the seller of the phone, at any time (even after purchase), for any reason (e.g. the one being proposed here), a good idea?

Why? Why not?

Should Apple customers be given the option to flash the hardware they purchase from Apple with their own choice of software, whereby they might better control the HW and hence secure it from Apple and other third parties?

Or should Apple have the _exclusive_ right to flash the HW they sell, even _after_ purchase, without the customer's consent? As they do now.

Open hardware. Bad? Good? Necessary?


Heh. Open firmware would not solve this issue. If the firmware were open, the FBI would have already broken into the device. Apple's signing key (or, for the paranoid, the government's unwillingness to admit that they have obtained it through espionage) is primarily what is keeping the FBI's fingers out of the cookie jar.

A better question is how to further lock the phone down in future updates so that Apple can't comply even if they were compelled to, something Apple engineers are no doubt looking into right now. The optimal way would be to require a device to be unlocked to update the firmware, which would require the PIN, which Apple would not have.


Open doesn't mean it's an unlocked door.

Open means that the owner of the device can replace the firmware. There's nothing not-open about requiring that the owner authenticate first.

It's only not open if the vendor, rather than the owner, is the only one who the device will permit to replace its firmware.


Okay, so how do you verify that only the owner can do that, on the OS level, if the firmware is not first totally locked down? The security of iOS devices comes from the fact that Apple has engineered the devices to behave a certain way when using Apple firmware and that that firmware can't be modified or replaced without the signing key of Apple. And Apple is making it clear that they are on the side of the consumer, not the three letter agencies.


You use the exact same design as iOS, but with unique keys in the hands of each device owner (in other words, a PIN or password) rather than one key in the hands of the vendor.
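That design can be sketched in a few lines. This is a toy illustration of the idea only (all names are made up, and a real device would do this inside a secure element, not application code): the firmware-update authorization key is derived from the owner's PIN with a slow KDF, so the vendor never holds a master key.

```python
import hashlib
import hmac

def derive_owner_key(pin: str, device_salt: bytes) -> bytes:
    # Slow key derivation so short PINs can't be ground through offline.
    # The iteration count here is illustrative, not a recommendation.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_salt, 200_000)

def authorize_flash(pin_attempt: str, device_salt: bytes, enrolled_key: bytes) -> bool:
    # The device accepts new firmware only if the flasher proves knowledge
    # of the owner's PIN; constant-time compare avoids a timing side channel.
    candidate = derive_owner_key(pin_attempt, device_salt)
    return hmac.compare_digest(candidate, enrolled_key)

salt = b"per-device-unique-salt"            # burned in at manufacture
owner_key = derive_owner_key("4921", salt)  # set by the owner at first boot

print(authorize_flash("4921", salt, owner_key))  # True: owner unlocks the updater
print(authorize_flash("0000", salt, owner_key))  # False: vendor (or anyone else) can't
```

The point is that "open" and "locked to the owner" are compatible: the gate is still cryptographic, it's just keyed to the owner instead of the vendor.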


> means that the owner of the device can replace the firmware

It means that a small minority of owners of the device can replace the firmware. Most people don't have the resources to do something like that.


All it takes to replace the firmware on a non-crippled Android device is Internet access from that device. This is not a big ask. Most people who own an iPhone have a data plan and/or access to WiFi.

Sure, only a small minority of device owners can actually make new firmware, but fortunately only one of them needs to make and distribute some firmware for everyone to be able to install it, if the device is open.


Are you saying this as someone who has helped non-hobbyists install new firmware onto their phones? I haven't, but helped many people install desktop linux around 2008-2009 and was often surprised by the problems they ran into.


Most people don't have the resources, knowledge, or desire to replace their car's engine parts. Does that mean it's ok to sell a car with a locked hood, that requires a manufacturer-controlled key before the engine can be serviced?

No, even if they personally don't want to touch the engine, they turn to knowledgeable friends/family or pay a professional mechanic to do the work.

This unfortunately-common view that most people should never even have the option to do anything technical is insulting and maybe even monopolistic (it bans 3rd party repairs/parts).


This strikes me as a great opportunity to use the "think of the children" argument against the government. Headlines should read like this:

"FBI wants ability to steal dick pics from any iPhone, including children's phones!"


Sure, but that goes both ways. It could easily be said that the FBI needs to be able to look through the phones of predators in order to convict.


I think they're talking about something similar to GCHQ's intercept.

For those who are new, previously on hacker news https://news.ycombinator.com/item?id=7312212


I think you may want to rethink your soundbite. Heh.

But yeah, remember the scandal about school admins potentially being able to use laptops' anti-theft programs to spy on children? Imagine that for all children worldwide.


> "Should the FBI be able to compel technology companies to use their resources against their own and their customers' right to security?"

For Europe and the rest of the world, the answer was a resounding YES from the American government, its intelligence agencies and the American people at large, and it came years ago. So this is not about their customers or their rights or any other abstract moral universals (which are valid only within the US). I wouldn't read too much into it, although I think it has to do with an attempt to restore the shattered image of US companies to their international audience.


So you are assuming the court will find against Apple? And that America will watch this and say they are cool with that?

The first is a possibility, of course. I don't think the second is likely. I personally do not plan to stop shouting about this, and informing my representatives on my position.


> "Should the FBI be able to compel technology companies to use their resources against their own and their customers' right to security?"

That's far too long and complicated for average citizens to parse and grasp implications immediately and intuitively. The second half of your question veers heavily into leading a person toward the conclusion you want them to have. Not ideal. Moreover, I think an average poll responder would get confused by the second half of the question as you've phrased it.

Perhaps try to phrase it like a simple headline or poll question that gets to the heart of the matter:

"Should government be able to force companies to change or disable your passwords?"

Here, I think, you have a much greater chance of grabbing the attention of the casual onlooker. Admittedly, it's a mostly true statement of the core issue here, while leveraging the need to hook a reader at a glance. An article/reporter/anchor can dive deep into the case particulars or not, cover and present its view of what's at stake in asking the question, include discussion of a "right to security" that may or may not be enjoyed by companies and individuals. But you want citizens to immediately grok what you're asking, and be able to, on their own, extrapolate possible implications that matter to them. This is going to be far more powerful in getting people to invest themselves in the issue--they must be able to imagine, see, understand that what's at stake could affect them.

The first half of the battle is this: You've got to get people to fucking care. And you can't get them there if you confuse or lose them at the start.


> "Should government be able to force companies to change or disable your passwords?"

And phrased like that, everybody will answer yes.

The dire implications of this case are not that easy to explain to someone who doesn't understand the basics of encryption/security. Welcome to give it another try, though...


> And phrased like that, everybody will answer yes.

The normal, non-technologists I know would most certainly find the idea that their passwords could be changed against their will--or, in the absence of their assent after death--pretty unsettling and inherently wrong. It's relatively trivial to provide analogous (though not entirely homologous) situations--e.g., should the government be able to force your mortgage company to change the locks on your house so they can get in to execute a search warrant?--that people would have immediate pause and find it worth considering. I don't know many people who would just thumbs up the suggestion that Facebook could be forced by the government to change/disable their account password. Or their banking passwords, email password, etc. If people became aware of a national conversation focused on the government trying to secure the power/precedent to force Facebook, Google, Apple, etc. to change/disable their passwords via a court order, I think very many more people will refuse to accept such an issue at face-value.

> The dire implications of this case are not that easy to explain to someone who doesn't understand the basics of encryption/security.

HN is a predictably poor place to discuss gaining widespread understanding by the masses on certain things. We too often fall into the trap of thinking that people must understand technical details of some thing X in order to understand implications of related thing Y. I think this is nonsense. People don't need to understand all the math and physics behind rocket science to understand that an unexpected explosion during liftoff has dire implications. We can explain that to people without ever showing them an equation, or talking about scientific laws/principles. There is no reason we cannot have a meaningful and productive conversation on this issue without losing the majority of American citizens as soon as we bring up the technical details.

Technologists haven't gotten much of anywhere getting the average American to understand, much less know why they should care about, encryption. Hell, we still engage in arguments here on HN about why you shouldn't roll your own encryption, even though all the programmers out there should already know that. We simply aren't going to get anywhere saying we first have to educate normal people on encryption. And it's plain wrong to think that an issue like this needs to be framed within a context that requires people to understand the basics of encryption. That's a no-win trajectory for carrying on a public conversation. You simply have to appeal to people using language that isn't overly biased, presents issues in the simplest possible ways (and builds up from there), and actually gets them involved in the conversation. It would, without any doubt, be fantastic if more people understood encryption, digital security, digital privacy, etc. But Average Joes and Janes don't really think about the devices and services they use in that way. I know a great many people who are [what I think are] normal, non-technical citizens, and their eyes just glaze over at talk of encryption--much like how many people's eyes glaze over at seeing mathematical equations, as if they're looking at Japanese.

The reason there is a higher propensity among potential Average American poll respondents to support the FBI on this issue is the way it's being framed. All I'm saying is that it's just as possible, and actually necessary, to frame the issue in a way that helps average Americans understand the dire implications of this case without having any need for understanding the basics of encryption, iOS, the Secure Enclave, trusted/signed code from Apple, and everything else that most HNers are discussing here.


> We too often fall into the trap of thinking that people must understand technical details of some thing X in order to understand implications of related thing Y.

Is that what you got from my post? Because it's not what I meant at all. I don't believe people have to understand the technical details behind it, but they do have to understand that the FBI forcing an unlock sets a very bad precedent which would massively diminish the security of the iphone/next iphone and this would apply to everybody, not just the tewwowist, AND that they're using a super old obscure shady law to back all that. There is a lot of data to understand and you can't easily grasp that from a single, short sentence.


Ah, my apologies then. This explanation sounds very different from your original suggestion that people need a basic understanding of encryption/security. I think we are quite close in agreement here.


I can see how my comment was misleading. My point was that those with a ground knowledge of encryption/security would most likely easily deduce this - but that doesn't mean you need such knowledge to understand what's going on.


I think Tim Cook's letter put it well. We should be focusing on the specific law invoked here. The All Writs Act says that people need to hand over all information pertinent to a crime investigation. Traditionally, this has been interpreted to mean all documents that already exist. Should we expand our interpretation to include information that doesn't exist but could be created?

If you had the ability to create information that could help in an investigation, should the government have the power to compel you to create it?


Oh, I absolutely agree. This is somewhat closer to what I'm trying to invoke here, but it's slightly different, and requires a different approach to engage the average American citizen in the conversation.

I think getting people involved in understanding this aspect of the case is critically important, as this is the part of the case that would be precedent-setting and [I believe rightly] concerns Apple and others with the requisite technical background. The danger is this precedent could be extended in untold tangential ways outside the realm of preventing device-wiping after n failed password attempts.

The trouble here, to me, is that most average citizens who wonder what the All Writs Act is aren't going to bother finding out. They'll just accept the government's position that it allows them to ask for what they're asking. The smaller minority of those same citizens who do go look up the All Writs Act are probably not going to understand what it means. "What's a writ?" Hmmm. "What's 'an alternative writ or rule nisi'? I don't even understand what this means, much less how it applies. Oh well." Average people aren't going to become armchair legal (or encryption) experts overnight to resolve in their heads where they stand on this issue. Thus, they're probably going to choose the path of least mental resistance, which will look something like, "Well, it is a terrorist's phone ... and I'm not a terrorist, so this probably doesn't actually affect me in any way ..."

Anyway, I think you're definitely on the right track with this part of the order, and the specific law invoked. We just have to get that into simpler terms an average American can understand, minus the expert/legal/technical jargon, and then get them to understand the implications. Your questions and phrasing are pretty good starters, I think. I can imagine, with perhaps a wording change or two for personal taste, I could ask a non-lawyer, non-technologist that question and they'd react with something like, "Why the hell should the government be able to force me to create something that doesn't exist so they can conduct an investigation?"

And that's exactly where we want people to be on this part of the issue at hand.


It's an interesting case because the people who are most worried about terrorists tend to be the people who are also most concerned with the government expanding power, especially by reinterpreting old laws. It needs to be clear that this is about expanding government powers to force people to work for them because so far the War on Terror has been a way to get past the opposition to expanding powers in most cases. But there has to be a line at some point where people think that it's been taken too far, no matter how scared of terrorists people might be. This idea of forced work might be that line.


"From recent polls most people think this debate is about..."

...or maybe people understand the debate, and they simply disagree with your opinion on the matter.

For example, even as phrased in the second form, my answer is "yes, sometimes", with an additional "there is no such thing as a 'customer's right to security'", added for good measure.

Even if I believed there were an absolute "right" to security, I don't think it's Apple's job to enforce it.


The thing is, if privacy advocates lose this fight, we have a greater chance of losing the next one.

I agree it should be the user's job and not Apple's to enforce security; however, I also think that is something we can focus on later. We can both support Apple now in this fight against government overreach and put pressure on them later to make their phones securable by the user. Those two ideas are not mutually exclusive, despite appearing to be both pro-Apple and anti-Apple.


"The thing is, if privacy advocates lose this fight, we have a greater chance of losing the next one."

I don't think that's true, but even if I'm wrong, you could say that about any skirmish in the privacy debate. Every legal case sets some precedent that affects the next one.

Also, I don't agree that this is government overreach. This case seems about as clear-cut an example of a "good" investigatory behavior as we're likely to see. If you can't get behind the idea of compromising privacy when there's a valid warrant, in the case of a known mass murderer with probable ties to international terrorist groups, well...you've set an exceptionally high bar. I'm a reasonable person, I see your position, and I disagree with you on the merits of your argument.


>> "The thing is, if privacy advocates lose this fight, we have a greater chance of losing the next one."

> you could say that about any skirmish in the privacy debate. Every legal case sets some precedent that affects the next one.

Yes and no. The role of the judiciary is to interpret and implement the law. They are not supposed to create law. That is the job of Congress.

This case is so unprecedented that there's a chance the precedent here will effectively create a law. It will come down to a matter of interpretation by the justices. If they rule against Apple, the message is, "This is covered by the AWA, and the AWA was intended to compel companies to weaken their products' security". In that case, many of us in the public feel that the justices will have unfairly created a new law.

If they rule for Apple, the message is, "This is an undue burden being forced upon Apple. That is not the intention of the AWA. If you want them to comply, go to Congress and ask for a law"

Obama was a lawyer; he knows this, and in light of the public's disapproval of mass surveillance, he chose the path (the AWA) that is most likely to get him what he wants, which is to grant law enforcement access to all iPhones. The public currently will not re-elect congressmen and -women who push through new surveillance laws.

P.S. I like your site vayable.com


>This case is so unprecedented that there's a chance the precedent here will effectively create a law.

This case actually has plenty of precedent, if you ignore the digital nature of things. Apple has keys to a safe (the digital signing key for iOS updates), the FBI has a warrant to search the safe, the court orders the keys to the safe. The Fifth Amendment doesn't apply because Apple can't incriminate themselves by handing over signing keys.

There is the whole "asking Apple to write the OS changes and sign them", instead of just "asking Apple for the signing keys." I would call this a compromise ruling to avoid making Apple hand over the actual, for-real skeleton key.

Asking a judge not to force Apple to surrender the digital signing keys requires more subtle arguments about burden. Asking for any ruling on encryption itself is asking a court to legislate from the bench.


The precedent you reference doesn't really apply because, as you say, the FBI isn't actually asking for the signing keys.

The big difference between what's being asked and the signing keys is that Apple already have the signing keys. There is certainly plenty of precedent for a court to ask Apple to produce something they already have, but less precedent for a court to ask Apple to make something that doesn't exist yet.

In the analogy, the FBI doesn't have a warrant to "search the safe". What is protected by the signing keys is the ability to distribute iOS updates, but the court wants the data on the phone, which is protected by a different key Apple don't have.

A more correct non-IT analogy is to imagine that the safe has two locks: one pickable, the other unpickable. Apple don't have a matching key for either, but knows how to cut a key for the unpickable one. The FBI has asked Apple to set up a key-cutting facility, definitely only for use on this one occasion, so they can make a key for the unpickable lock.


> if you ignore the digital nature of things

That is a gigantic "if"


There isn't a valid warrant in this case though: instead there's a writ (aka an order by the government) under the All Writs Act. This act allows the government to compel a person/company to do something when there isn't any specific law on the books for that particular situation.

However, the writ in question is to compel Apple to write a new version of the OS (not to hand over keys, not to hand over data in their possession, not to provide technical support to the FBI - all of which are things Apple has done when presented with a valid warrant).

From Apple's motion to vacate:

"In the section of CALEA entitled “Design of features and systems configurations,” 47 U.S.C. § 1002(b)(1), the statute says that it “does not authorize any law enforcement agency or officer —

(1) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.

(2) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services."

So the writ appears to be in direct contradiction to other law, which would make it an invalid use of the All Writs Act.


No, you're completely wrong. There's a valid warrant. The owner of the phone also consents to the search. Apple's refusal has nothing to do with any of that.

What's happening here is that Apple is claiming that it can't comply with the warrant, because encryption. The FBI is invoking the All Writs Act to try to compel Apple to brute force the weak password on the phone.

http://fortune.com/2016/02/18/fbi-iphone/
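Worth spelling out why the password is "weak": a 4-digit PIN has only 10,000 possibilities, so iOS's escalating delays and 10-attempt wipe are the only things between the passcode and a trivial exhaustive search. A toy sketch of that point (a plain hash stands in for the device's PIN check; real iPhones entangle the PIN with a hardware-fused key, which is exactly what forces the guessing to happen on the device):

```python
import hashlib

def unlock(pin: str, stored_hash: bytes) -> bool:
    # Stand-in for the device's PIN check. With no retry limit or delay,
    # nothing stops an attacker from simply trying every PIN.
    return hashlib.sha256(pin.encode()).digest() == stored_hash

secret = hashlib.sha256(b"7291").digest()  # the "unknown" PIN, hashed

# Enumerate the entire 4-digit keyspace: just 10,000 guesses.
found = next(p for p in (f"{i:04d}" for i in range(10_000)) if unlock(p, secret))
print(found)  # 7291
```

This is why the order targets the retry-limit and auto-wipe features specifically: remove them, and the PIN itself offers essentially no protection.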


But that's just an outrage line. Not a necessity line. There's no ticking time-bomb.

Nobody expects to get anything from the phone. It's the guy's work phone and they already have all the call logs, etc.

I hear that you're willing to trade liberty for safety, but what safety do you expect to get and are you considering any costs?


> Even if I believed there were an absolute "right" to security, I don't think it's Apple's job to enforce it.

Well, do you not believe the First Amendment protects against compelled speech?

Because the FBI's argument here boils down to "we have the right to require people to cryptographically sign things against their will"


Justice Scalia, 1987: “There is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.”


I don't see anyone anywhere suggesting it is Apple's job to enforce security. Perhaps I misunderstand what you mean by that phrasing.


A good point. This is about where public and private ends on the internet, and that debate is still very fluid.


> "Should the FBI be able to compel technology companies to use their resources against their own and their customers' right to security?"

We can put the case into an even larger context and ask: should the FBI be able to compel [banks, accounting firms, insurance companies] to use their resources against their own and their customers' right to financial privacy? The answer, to date, has been "yes." The justice system's right to "every man's evidence"[1] has so far been thought to trump the interests of the customer relationship.

For decades, government has had the power to compel accounting firms and banks to search through their records (often at significant expense) to deliver information relevant to an investigation. I imagine those companies would love to be able to advertise that they won't cooperate with a government investigation of your taxes or financial transactions.

In my view, there are two different questions:

1) Can the government ask Apple to unlock one iPhone?

2) Can the government ask Apple to weaken security for all iPhones?[2]

The first question, at issue in this case, is something companies deal with all the time in many different contexts. The second question, on the other hand, may well be sui generis. I understand why Apple is taking a stand on this particular hill--in their shoes I'd do it too. Better to "fight them over there so you don't have to fight them over here," so to speak.

I'm a little worried about the potential for blowback, though. The tech community is forced, in a way, to spread FUD to justify their argument as to (1). There are a lot of people who are confused by the reporting and think that this case involves the FBI requesting Apple to put backdoors in all of their phones, rather than leveraging an existing security weakness. That's not necessarily a great place to be in if you're found out.

[1] United States v. Nixon, 418 U.S. 683, 709 (1974).

[2] I don't think it's technically justifiable to say that (2) is implied by (1). Apple apparently already has the capability to subvert anyone's phones by using their signing keys to load weakened software. At a purely technical level, signing an "evil build of iOS" using an existing "backdoor" is different than introducing a backdoor into everyone's phone.
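To make footnote [2]'s distinction concrete: every phone already ships trusting a single vendor key, so the question is only what gets signed with it. A toy sketch of that trust model (HMAC stands in for the asymmetric signatures real firmware signing uses, and all names are invented):

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"  # the single secret every device trusts

def sign(firmware: bytes) -> bytes:
    return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()

def device_will_install(firmware: bytes, signature: bytes) -> bool:
    # The device checks the signature, not the firmware's intent: a
    # security-weakening build installs just as readily as a bug fix.
    return hmac.compare_digest(sign(firmware), signature)

good = b"update: bug fixes"
evil = b"update: no retry limit, no auto-wipe"

print(device_will_install(good, sign(good)))    # True
print(device_will_install(evil, sign(evil)))    # True -- signed is signed
print(device_will_install(evil, b"\x00" * 32))  # False without the key
```

In that sense the "backdoor" already exists as the key itself; signing an evil build just exercises it, which is exactly the distinction footnote [2] draws.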


(2) is implied by (1) because once Apple has created the tool to do it once (1), the government has access to it and can (and will) use that tool endlessly (2). This is why Apple refuses to create this tool to unlock 1 iPhone.

History has shown we cannot restrict this by implementing laws/rules, just look at the news to see how these laws are abused right now, accessing all our private data.

EDIT: OK, I get your point now; I imagine the reason this 'technical backdoor' exists in the first place is that Apple wants to be able to update that part of the OS in case of bugs. If they made it non-updateable the backdoor would be closed, but then if problems arose, bugs couldn't be fixed (for example the recent Error 53/home button bug).


You are right about the government being able to compel some action. Where it gets tricky with Apple is around the argument that code is speech. The government can compel an entity to do something, but they typically can't compel them to say something.

Even though this is incredibly unlikely, say for a moment that Apple has only hired developers that feel strongly that privacy is an inalienable right. If Tim Cook concedes to the FBI and gives the order that a hacked version of iOS is to be created, what happens if his developers refuse? Would Tim Cook be put in jail, or would the developers be jailed on contempt of court charges?


What about bank employees who feel strongly that financial privacy is an inalienable right? Do they get to avoid complying with government investigations into tax evasion or money laundering?

I see your code is speech point. I'm skeptical about it, because usually code is just code. It's meant to do something, not to express an idea. But I think there is some merit to the idea that code signing combines doing something (enabling code to be loaded) and communicating an idea (that the signer trusts the code). It's an interesting argument, certainly.


There are many issues here which have thus far been conducted in a legal or nebulous grey area of the law or flown under the notice of the general public for far too long:

  - Can the government compel the creation of original tools and IP outside the typical operations of a business?
  - If so, should the government be required to compensate the business?
  - Does the business then own the IP, or does the government?
  - What happens when compelling a company to create IP or perform a process outside of its typical operations is a net loss regardless of compensation?
  - Should the company be required to notify stockholders?
  - Is it legal to make investments based on this particular kind of inside government information?
  - What if the entire company's operation may be plausibly undermined by the operation?

I think a more relevant example here would be whether the government could compel a bank or financial company to create or alter its financial reporting tools to falsify data or misreport to the subject of an investigation.


My understanding is the government is generally supposed to pay for the cost of external requests to private entities, should they choose to bill for it.... Unfortunately, the cost of compliance for Apple is lost future revenue, not just the technical costs of the gimped iOS version.


The specifics are important. If I said, "If a police officer gives you a lawful order do you have to comply?" you would immediately say, "yes". Now what if I said, "If a police officer demands that an entire neighborhood be subject to lengthy interrogation and be held for 24 hours in custody in order to solve a murder that took place there." You'd probably say that's ridiculous.

That's essentially what I view to be the case here. In theory the procedure may be perfectly "lawful" but in practice is clearly overreach. We need case law around this. I believe this is as clear a case of overreach as Apple could wish for. Finally, I believe it should be legal precedent set in public for once.


I'm not saying it isn't overreach, I agree that it is... the 4th and 5th amendments indicate this, as well as other reasonable interpretations of the law.

I was only responding to the gp, in that it would be billable, but the real "cost" may be incalculable... not to mention the damage to near term and long term security, and future incursions into privacy built upon yielding here.


Requesting a trivial modification to a piece of software is overreach?


It becomes non-trivial due to the marketing of their devices as secure systems. Just as the burden of proof in defamation actions is higher in the case of public figures, the "triviality" of the modification in this case is held to a higher standard due to the purpose of the product.

* I can't reply below this threshold but I'm enjoying the back and forth. People buy iphones because they believe that there are no ways to "trivially" access their personal information. Even for Apple to do so would be a breach of that trust. See: "Uber God Mode"


Apple can't access your personal information unless they are in physical possession of the phone. And, whether they have built the GovOS piece of software already or not, is essentially irrelevant to how easy it is.

If the only reason they can't get at your data is because they haven't compiled some code yet that's ALMOST NO PROTECTION AT ALL! Anyone making a buying decision based on Apple's failure to compile a simple piece of code is nuts!

Like, if you gave me a private document while I had my eyes closed and I said "Trust me, I can't see your data, I have my eyes closed. And I'm not going to open them." that would be crazy right? Building GovOS is the equivalent of opening my eyes.


I think you underestimate the value of trust in business and government. It's a common failing in the hacker ethos. Just because you CAN mess with the programming of an elevator the president uses every day doesn't mean you SHOULD.


People buy iPhones because they think that the FBI can't ever compel Apple to extract data from them?

And even if that's true (which I really don't think it is) it's not a good enough reason.

Say I started a bank and marketed my bank as an especially secure one. "Even the FBI will never be able to see your bank records. I won't let them!" I bet some customers might be interested in that. Just because I had marketed my bank that way doesn't mean my bank would gain some special legal protection against warrants.


The big difference between the two is that the banks have the information and are already subject to lots of record keeping regulations and so it is not an undue burden for them to comply with a subpoena demanding information. Apple does not currently have what the FBI is asking for and to comply would mean creating that software.

If the government wants to pass a law requiring phone makers like Apple to include a back door, then that's what they should do.

For me, this boils down to whether or not people are allowed to have strong encryption.


> For me, this boils down to whether or not people are allowed to have strong encryption.

It's easy to see why the DOD originally classified it as a munition. Video games have been a powerful tool for storytelling, education, performance art, and immersive experiences for a long time, but they started out as artillery simulators and a means to practice war games. That mentality pervades, and we still see Jump, Shoot, and Run as the primary dynamic by which we function in these amazing experiences. You're fighting against human instinct and hundreds of thousands of years of evolution. Against Dunbar's number, the core mechanic of trust and social engagement for millennia. I hope you're right, but I don't think it will be the sea change you're expecting. Look at the adoption of Bitcoin, and remember that law enforcement is a retrograde function of society: underfunded and always behind the zeitgeist.

We don't ask the secret service to build the president's merchandising website, and we shouldn't expect the FBI to be able to articulate or comprehend the, "triviality" of undermining strong encryption.


I think it was classified as a munition for practical reasons - they wanted export controls.

I think it's great that you name specific parts of the government because not all parts of the government think strong encryption is a problem.


All I'm concerned about is a bottle. Take a little walk to the edge of town across the track. Where the viaduct looms as it shifts and cracks. Where the border lies. Across the stack. On a gathering storm comes a handsome man with a red right hand.

4PM in Chicago. I'm out of here.

https://www.youtube.com/watch?v=EdhoX1Xu6ZI


> For me, this boils down to whether or not people are allowed to have strong encryption.

That's an absurd point of view because the phone in question doesn't have strong encryption(1). If it did, there would be no way for Apple to trivially bypass it.

(1) Well, it kind of does, but by encouraging the use of an incredibly weak password it defeats the purpose of having any encryption at all, really. If the San Bernardino shooters had used a real password we wouldn't be having this discussion.


We have no idea what kind of password is set on that phone. If the shooter used a good password (he probably didn't), then no matter what Apple does, the FBI is still going to be locked out.

The chance of there being anything valuable on that phone is pretty slim. Especially when you consider how much the government is willing to spend on it. I was just watching CNN and Apple has apparently detailed what it would take before they would consider writing the compromised version of the OS. The reporter claims it would cost about $50 million and the government has apparently said "no problem, we can pay that". Ugh.


It won't cost 50 million dollars to comment out a few lines of code and recompile the OS. That reporter has no idea what they're talking about.


To be fair, neither do you.


Sure I do. I've been a professional programmer for almost 20 years. I have a pretty good idea about what sorts of things are hard to do and what sorts of things are easy to do. The FBI's requests are very easy. 50M is ~200 man years of work. That number is insane.
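For what it's worth, the arithmetic behind that last figure is simple. A quick sketch in Python, assuming a fully loaded cost of roughly $250k per engineer-year (my assumption, not a number from the thread or the reporting):

```python
# Back-of-the-envelope: how many engineer-years does $50M buy?
# The $250k fully loaded cost per engineer-year is an assumption.
cost = 50_000_000
per_engineer_year = 250_000
print(cost / per_engineer_year)  # -> 200.0
```

Even if you double or halve the per-engineer cost, the order of magnitude of the claim doesn't change.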


Have you ever worked some place that requires a security clearance?

Well, for Apple to develop this software, they would insist on it being developed in a locked down facility that doesn't currently exist (what Apple has is in use). I think they have said it's only about $100,000 of developer time (so maybe 2-3 months of work), but the infrastructure that would be necessary for the work to be done in a way that doesn't risk any leaks is expensive.


Can you make up some more facts about Apple's real estate situation and software methodology?



https://www.washingtonpost.com/news/the-switch/wp/2016/02/26...

Apple has made the claim that code-is-speech. From that article:

> Wayne Giampietro, a Chicago-based lawyer and longtime member of the First Amendment Lawyer's Association, said that Apple is absolutely on solid ground with its argument.


Isn't the difference that the banks already have access to and collect this data and Apple does not currently have access to this data and it would not normally collected in the course of business activity?


You keep using banks as an analogy. It fails because banks are used to drilling out locks on flimsy safe-deposit boxes; Apple is not used to hacking old phones via firmware updates.

Nobody is saying the government can't attempt to unlock the phone - or even compel Apple to provide technical information. But that's not what the FBI wants. They want to force Apple to take all the risk.


Like it says in the video, it's mostly a business decision. All the companies in the top list are customer/user facing, while Amazon's AWS is not customer facing at all. It's about the customer's impression of the platform they are interacting with on a daily basis (presumably not Linkedin). Standing up with Apple on this issue shows their users they are (also) serious about their privacy.

Turn the tables for a second:

What would the US do if a foreign government had access to their citizens' information?

Would it become an issue of nationalism to use your own country's homegrown smart phone, browser, social network, etc?


It's mostly a business and PR decision.

"Apple had asked the FBI to issue application for iPhone passcode cracking tool under seal, but government made it public, prompting Tim Cook's public remarks"[1]

So no, Apple doesn't really have the security of users in mind; it's just turning this on its head and putting on a show.

[1]http://www.nytimes.com/2016/02/19/technology/how-tim-cook-be...


Twitter, Yahoo and LinkedIn aren't imperative, as their beacon data is likely the most important piece and the rest already cover that - and Twitter is the only one of them that will still be a company in a year.

Amazon, Google, MS and Apple are the real players, but Facebook is closing in as well. Google is the main linchpin; within 1-2 degrees everything clears through it: CA, DNS, search, MX, analytics beacons, OS, hosting, etc.

They believe data should be used to solve problems, and it should. But the consolidation risk is too high now: they own the whole stack and all the connections. Hopefully they have integrity here.


No, it's not just that. It is also:

- Should companies reveal what type of information they have on their consumers.

- Should and can companies protect their data on their consumers as their own property.


Unfortunately those have already been decided and are unlikely to be reversed by this case. This case is currently about the AWA. A previous decision using the AWA, United States v. New York Telephone Co., notes it can compel companies to act, given some constraints:

"We agree that the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed"

"The order provided that the Company be fully reimbursed at prevailing rates, and compliance with it required minimal effort on the part of the Company and no disruption to its operations." [1]

We already know law enforcement can get a warrant for access to information, and that people cannot reasonably expect privacy of information given to a 3rd party, including information on computers. Those decisions, unfortunately, have already been made and that's not what's at stake here.

[1] https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...


Thanks, you have reminded me of some huge IT companies that can also be related to this case: Amazon and the other IT infrastructure companies. Why are they missing or remaining silent? Does it imply anything?


If the government wins this case, I will permanently destroy my personal cell phone and will never have one again. I will maintain a cell phone for work only.


That's a bit extreme. Just because there is a weakness in the implementation details of the iPhone PIN mechanism does not mean that all phones are weak. Apple already uses extremely strong encryption, and when coupled with a largish alphanumeric passcode, the phone would not fall to such easy attacks as the phone in question (4-6 digit PIN).

This case is not about weakening encryption in any way, something that I think is not well understood outside a technical audience.
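A rough sketch of why the passcode length matters so much, assuming the widely reported figure of roughly 80 ms per guess for iOS's hardware-entangled key derivation (the exact delay is an assumption here):

```python
# Worst-case brute-force time, assuming ~80 ms per guess for the
# hardware-entangled key derivation (a reported figure, not verified).
GUESS_SECONDS = 0.08

def worst_case(alphabet_size, length):
    """Seconds needed to try every possible passcode."""
    return (alphabet_size ** length) * GUESS_SECONDS

print(f"4-digit PIN: {worst_case(10, 4) / 60:.0f} minutes")
print(f"6-digit PIN: {worst_case(10, 6) / 3600:.1f} hours")
years = worst_case(62, 10) / (3600 * 24 * 365)
print(f"10-char alphanumeric: {years:.2e} years")
```

The point stands regardless of the exact per-guess delay: a short PIN falls in minutes to hours once the retry limits are gone, while a long alphanumeric passcode stays out of reach of any brute-force attack.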


That's a bit far. I'll just stop using mine to plot major crimes.


Snapchat?


mm


I didn't see anything wrong with 'Should Apple be forced to give the Terrorist's phone data to the FBI'

It is what it is. You guys focus more on how to limit the FBI's ability to abuse the terrorist card to violate privacy. But in the case of a true terrorist, the answer should not be that hard.


This is exactly what the FBI wants you to think, and why they've carefully handpicked this case to try to set this legal precedent - ridding us of our fundamental rights to privacy and security.

Out of all 13 pending cases the FBI has against Apple under the All Writs Act (they've also said they have another 175 iPhones to unlock), this one has the least chance of yielding anything meaningful: it's his consent-to-monitor work phone, not his personal phone (which he destroyed); it wasn't used at the scene of the crime; the FBI already has six-week-old iCloud backups of it; and they would've had all of the phone's iCloud backups if they hadn't disabled them by instructing that the phone's password be reset.

Yet of all the pending cases under seal this is the one they've chosen to break character and go public on, exploiting this tragic event and using the fear around terror and emotions of victims for maximum PR and political effect to further their agenda of violating the security of millions of customers - which is both deceitful and clever, as apparently it's working.

I've always watched in bewilderment at the US Govt's response to terror. They're prepared to spend trillions fighting wars and billions changing airport security, yet aren't prepared to do anything about the easy access to the semi-automatic weapons that were actually used to kill all the victims. But they are prepared to compromise the security and privacy of millions of people's lives, violating 1st and 5th amendment constitutional rights, for the theater of accessing useless data on a dead terrorist's work phone - over something US citizens have less chance of dying from than their own furniture. ...and the US public majority approves.


> trillions fighting wars, billions changing Airport security

That's only the means, not the goal, because...

> bewilderment

Spending massive amounts of money on wars makes a lot more sense once you realize that it's intended to channel money to their friends (and their own post-public-service jobs) in the military-industrial complex. Eisenhower warned[1] us, but the public didn't listen.

[1] his farewell address was amazingly prescient: https://www.youtube.com/watch?v=CWiIYW_fBfY


> But in the case of a true terrorist, the answer should not be that hard.

In the cases of true terrorists, having access to phone records and even having the father of a terrorist report on his own son hasn't been enough to stop attacks.

So, you should be asking whether or not you should be giving up your right to privacy and awarding the government broad powers that, based on historical data, won't reliably help prevent terrorist attacks, but that will be used far outside the original scope for which they were granted.

You should also not just be asking about the current government, but all future governments, and be able to trust that they will only act in the best interest of their citizens.


> I didn't see anything wrong with 'Should Apple be forced to give the Terrorist's phone data to the FBI'

That's really not even remotely a legitimate question in this instance. If Apple had the data, I'm sure they would give it up readily (and probably already have, in this and other instances). But they simply do not have the data to give, physically or digitally.


But the FBI's request isn't limited to phones used by categorically proven terrorists. They want to be able to unlock and decrypt phones that pass a much lower bar regarding evidence of criminal activity - arguably to the point where they have a backdoor to any phone they want to look inside. That's why Apple has to fight this case. It isn't about terrorists; that's just the PR spin the FBI are using. It's about much more than that.



Harry Shearer did an excellent mini-series three years ago called "Nixon's the One". It's just a verbatim reenactment of conversations from Nixon's tapes ( https://www.youtube.com/watch?v=f9HtoWea72A&list=RDf9HtoWea7... ).

There is more than one scene where Nixon talks about how he wants to get rid of J. Edgar Hoover but can't. In another the Attorney General calls FBI agents "the Gestapo".

It reminded me of another tape - a phone conversation LBJ recorded of him talking to another Attorney General - Robert Kennedy. Kennedy talks about how his subordinate, Hoover, was not only not taking orders from him, but was having agents monitoring, writing reports and spreading disinformation about him ( https://www.youtube.com/watch?v=aVHnkIPGC6M ).

Forget the FBI monitoring non-violent political figures like Martin Luther King Jr., or Vietnam peace groups - presidents and their attorneys general began to fear the Bureau's power.

The Church committee was supposed to fix this, but it was stonewalled in many ways, and by the 1980s we saw the FBI revive these political witchhunts again against groups like CISPES, and even groups run by Catholic nuns concerned about the rapes and killings of Catholic nuns in Central America.


What's the worst that could happen for any of these companies or their senior executives if they refuse to cooperate?

Most corporate criminal penalties we hear about these days have become minor costs of doing business. And it's not like the olden days where senior execs often ended up in jail for corporate wrongdoing.[0]

The largest monetary criminal penalty I could find after a quick search was the 2013 settlement between JP Morgan and the Justice Department for $13 billion.[1] That's about 25% of Apple's net income last year, hardly a death sentence.

EDIT: I am not a lawyer so I'd appreciate any insight on this question.

[0]: http://www.nytimes.com/2015/02/20/business/in-corporate-crim...

[1]: http://blogs.wsj.com/moneybeat/2014/06/23/a-list-of-the-bigg...


I'm not a lawyer but the powers of Federal court are very broad when it comes to contempt:

https://www.law.cornell.edu/uscode/text/18/401

http://www.bafirm.com/publication/federal-contempt-of-court/

From the law: "A court of the United States shall have power to punish by fine or imprisonment, or both, at its discretion [contempt of court]"

From the article:

- "While § 401 specifies neither the minimum nor maximum penalty that may be imposed, it does prohibit a federal judge from legally imposing both a fine and a sentence of imprisonment."

- "Although there is no statutory maximum limit regulating the amount of time a contemnor can be ordered to spend in confinement, the requirement that a jury trial be granted in criminal contempt cases involving sentences over six months in jail acts as a check on this power."

- "In turn, Section 2X5.1 states that the court should apply “the most analogous offense guideline.” As a result, a court will be required to make a highly fact-specific inquiry when determining the appropriate punitive sentence. As an example, courts have equated the refusal of a witness to testify with all of the following: Obstruction of Justice; Misprision of Felony; and Failure to Appear by Material Witness."

While the powers of the court aren't unlimited, they could certainly damage Apple to force it to comply, since there's no hard limit on the penalties that can be imposed and Apple continually refusing to comply could justify harsher and harsher penalties.


Why do you assume the penalty would only be monetary?


Who would they throw in jail? Tim Cook? Continuing on down the line until Apple has 0 employees?


At some point Apple will act to protect their own employees...


I am not sure how realistic this is, but couldn't they move relevant parts of their business overseas? E.g., hardware and software development still happens in the US, but updates are vetted and signed by a separate entity in a privacy-friendly country?

The US could block iPhone sales, though I think the probability of that happening is extremely low, since it would have a major economic impact.


Trump will now have to boycott Apple, Google, FB and Microsoft.


Finally! Somebody rooting for Free Software!


Trump/RMS 2016!


Is it even possible to tweet without touching products from any of the above companies?

I guess Trump will have to buy one of the discontinued BB10 devices...


Hey, our prime minister (Mark Rutte) still uses a Nokia from the pre-smartphone age.


If only Twitter joined them.


They were the first to show strong support: https://twitter.com/jack/status/700457149227360256


The tech industry seems to be a bit like Hollywood in how it supports the military-industrial complex - the pendulum swings between 'pro-totalitarianism' (where everyone signs up to the Patriot Act of the day without hesitation) and something more 'pro-democratic' (when a Snowden-type situation comes along and freedom stuff gains traction).

In a way we are lucky that money talks in the U.S.A., it allows for this dynamic. If too much 'totalitarianism' hurts the bottom line then things get pushed back a bit and the noose loosens a bit. All the government needs is another big polarizing incident and I am sure Microsoft will be back doing as much of the government's dirty work as possible.


> In a way we are lucky that money talks in the U.S.A.

Money talks, but votes are king. When we collectively vote a politician out of office, his or her money dries up. In a way, votes are made more powerful by money.

Check out This American Life's episode on "Take the Money and Run for Office" [1]

Dick Durbin: I think most Americans would be shocked-- not surprised, but shocked-- if they knew how much time a United States senator spends raising money. And how much time we spend talking about raising money, and thinking about raising money, and planning to raise money. And, you know, going off on little retreats and conjuring up new ideas on how to raise money. [2]

...

Barney Frank: If the voters have a position, the votes will kick money's rear end any time. I've never met a politician-- I've been in the legislative bodies for 40 years now-- who, choosing between a significant opinion in his or her district and a number of campaign contributors, doesn't go with the district. [2]

[1] http://www.thisamericanlife.org/radio-archives/episode/461/t... [2] http://www.thisamericanlife.org/radio-archives/episode/461/t...


I don't think it's as much "money" as it is "power". "Money", unfortunately, seems to have no opinion on this case. If it did, we'd see more wealthy people chiming in. However, Apple is a very powerful player - not only in the tech community but in the global cultural context. They rarely wield this power in that larger context. I am glad they are doing so in this case.

Microsoft is a bit late to the user security game. Since they don't make their own CPUs, they can't play in the same league.


So there's an encrypted blob. The way you input a password has nothing to do with that blob or the encryption that was used to create it.

The argument that this would be some kind of cryptographic back door seems a little specious to me, and I'm usually the biggest edgelord imaginable when it comes to decrying government surveillance.

If we were talking about a safe somewhere, and the government asked the safe manufacturer to create a physical device that makes it easier to open existing safes, that would not seem reasonable to me, but it also would be ludicrous to claim that represents a "back door." If that's a back door, then the back door already existed.

So in Apple's case, the government says "we need the data out of this device" and that's apparently a solvable problem. Can someone please remind me why it's so cut and dry for Apple to just say "no" and that's supposed to be OK?
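To make the relationship between the passcode and the "encrypted blob" concrete: the file data is encrypted under a strong random key, and the passcode only gates access to that key. A minimal sketch (not Apple's actual scheme; the KDF parameters and the XOR wrapping are illustrative assumptions):

```python
import hashlib, os, secrets

def wrap_key(passcode: str, salt: bytes) -> bytes:
    # Deliberately slow derivation; iOS additionally entangles a
    # per-device UID so guesses must run on the phone itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = os.urandom(16)
file_key = secrets.token_bytes(32)       # strong, random data key
wrapping_key = wrap_key("1234", salt)    # gated only by the PIN
wrapped = bytes(a ^ b for a, b in zip(file_key, wrapping_key))

# Without guess throttling or the erase-after-10 counter, a weak PIN
# lets an attacker recover the strong key by exhaustive search:
recovered = bytes(a ^ b for a, b in zip(wrapped, wrap_key("1234", salt)))
assert recovered == file_key
```

On this view the encryption itself isn't being weakened; what the FBI wants removed is the software that limits how fast passcodes can be guessed, which is the only thing making a short PIN meaningful protection.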


1) It's a backdoor because it's software, not a physical device like your example. They are not asking them to make a robot finger to push the buttons. It's not only more trivial to unlock the device, it's significantly more likely the software is copied and leaked into malicious hands.

2) The outrage, in my opinion, is much more about your exact example -- that Apple is being compelled to develop this software themselves. That they are being legally required to weaken the value of their own product.


> 1) It's a backdoor because it's software, not a physical device like your example. They are not asking them to make a robot finger to push the buttons. It's not only more trivial to unlock the device, it's significantly more likely the software is copied and leaked into malicious hands.

If I wanted to get into my neighbors house, and I had some way to compel a construction crew to come install a new door in that house so I can walk through it, that would not mean houses now have back doors. That would be a very disingenuous way of putting it.

"But what if the software is copied" is the only argument that holds any weight, and I don't find it very convincing. If the "copyability" of sensitive software/data is such a huge concern, then we should probably start asking how we intend to survive as a species.

> 2) The outrage, in my opinion, is much more about your exact example -- that Apple is being compelled to develop this software themselves. That they are being legally required to weaken the value of their own product.

Perhaps, but that's not what Tim Cook is saying.


Did you read their motion?[1] It is very much part of their argument. It sets a precedent that not only will other US courts compel them to unlock devices, but other countries will as well. Even if you trust the US government to not use the software maliciously, do you trust other governments?

1) https://www.documentcloud.org/documents/2722199-5-15-MJ-0045...


What's stopping other governments from compelling them to unlock phones now, given they've effectively said they can?

What's stopping rogue employees from selling that capability - is Apple Computer's internal security good enough to see off espionage by nation states?


They've already asked and Apple has had a strong bargaining position.

"Beijing last year backed off on some proposals that would have required foreign companies to provide encryption keys for devices sold in the country after facing pressure from foreign trade groups."

http://www.nytimes.com/2016/02/21/technology/apple-sees-valu...

If Apple were to allow this for the US government other governments will want the same.


> If we were talking about a safe somewhere

This has been discussed. See https://news.ycombinator.com/item?id=11178475


Exactly. The backdoor already exists - what Apple are being asked to create is a specific exploit.


Google has been awfully silent about the whole thing, how secure are Android phones (Nexus phones in particular) against a similar attack?

I'm wondering if they are so easy to crack that this hasn't been a problem for Google. Either that, or their phones are just as secure as iPhones, and they caved to FBI demands immediately.


I found this surprising as well. There is some information in this article:

https://itsecuritything.com/google-nexus-6p-security-teardow...

To be honest, I think for non-Nexus phones, the situation is pretty bad. Even if a device vendor would not cooperate, security updates are so glacial (if they happen at all) that a state actor probably won't have much trouble finding attack vectors.

(I am pretty frustrated by the lack of updates for my Moto X. Motorola used to be pretty good at it. Now I have Marshmallow, but they didn't roll out any security updates beyond November. It seems that they just want to tick the Marshmallow box and that's it.)


Definitely tepid, only visible response from Sundar's tweets: https://twitter.com/sundarpichai/status/700104298600886272


One thing that I can't help thinking about in all of this... would Apple have contested this case had the Snowden leaks never occurred?


Yes. But it wouldn't have been in the public sphere. It would have happened behind closed doors like what happened to Yahoo in 2008 (which we didn't find out about until 2014). And in the secret court, my guess is Apple loses. Law enforcement would get what it wants because privacy advocates only seem to exist in the public.

Although many are saying this will be decided in court, the public opinion matters a lot here. We sway action in government through our votes. Thankfully this time we have a say in the matter.


No quote from Apple itself, of course, but:

Former CIA Agent Says Edward Snowden Revelations Emboldened Apple to Push Back Against FBI: http://www.democracynow.org/2016/2/25/former_cia_agent_says_...


I have a question for the lawyers on HN. Why doesn't the judge in the case hold Apple in contempt of court until they modify the OS, in the same way they would hold a reporter in contempt until they revealed a source?

In both cases, you have a third party who is being compelled to assist an investigation against their will. My only guess is that there is a precedent set for reporters, but not for computer programmers.


IANAL, that said, at the end of the order, it says

"To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make application to this Court for relief within five business days of receipt of the Order."

Since Apple has presented such an application, they are now supposed to return to court next month to argue their position.


I think that doesn't speak to the expansiveness of Apple's response. It's my understanding that the excessive burden of undertaking this effort is only one of their grounds for argument. They are additionally arguing that it violates free speech principles, and that the request goes beyond their initial compliance efforts to aid in the investigation, to the point of infringing on Fifth Amendment principles by asking them to go above and beyond simple compliance.

To my untrained eyes, they're basically saying that instead of having a key and being asked to turn over the key, and doing so - they're being asked to devote efforts to create an entirely new key that would unlock many or all doors. Which could then be used in many other unrelated "need to open the door" situations.


It's an ex parte order -- meaning, it was issued to the government without anyone from Apple present to argue against it, so instead Apple gets a chance to respond arguing against it after the fact.


I think at best, the judge could hold corporate officers in contempt, not any individual computer programmer or developer. However, given Apple's stance, they would likely respond to the contempt order in a similar way they are now: accrue the fines, don't pay them, appeal as high as it takes to have this ruling overturned.

I'm not a lawyer, at all. But my guess would be that if Apple reached a point in court where the original ruling was overturned, the contempt penalties would be invalidated.


How far could the FBI use this case as a precedent if it goes through? Could they demand to get stuff signed and put on Canonicals repos? Could they demand to get a MS-signed EFI?

Or could this case just be used as a precedent in cases that clearly only involved one (1) specific device?


Nobody knows. That would be for the unelected judge to decide, which is what makes this all so scary.


Much to the FBI's chagrin, the White House agrees with Apple, which is a great sign that someone in this country is sane:

http://www.nytimes.com/2015/10/11/us/politics/obama-wont-see...


I don't think that's accurate to say. The White House Press Secretary Josh Earnest said in his briefing a day or two ago that the administration backs Comey's efforts. US attorney general Loretta Lynch said she backs Comey too. Obama is the only one who hasn't made a public statement about this issue.

Chain of command, by the way, is Obama > Lynch > Comey. They're certainly in communication and agreement about this issue.

You might think Obama would feel differently given his rebuke of China last year for attempting to set the same kind of precedent. But no, he fails to see the similarity. It's scary. The highest ranking official who understands this issue is Ted Lieu.


> The highest ranking official who understands this issue is Ted Lieu.

You mean the highest ranking official who agrees with us. It's easy to see how, if you trust the US judicial system, you'd be okay with this. If you think warrants are only given in valid and justified situations, this wouldn't bother you, because this'd only be used for "good".

We know better, but that's probably not the position the head of one of the branches of government (and a scholar of the highest law of the judicial branch, even) finds himself in. Obama probably very much trusts the US justice system.


> You mean the highest ranking official who agrees with us

Yup that's what I meant.

> Obama probably very much trusts the US justice system

Hmm, Obama certainly wants Apple to comply here. However, he has no power to sway the justices.

Obama didn't want to go to congress for a new law surrounding this issue, and seems to have directed Comey to use the AWA. Obama hasn't said why he didn't go to congress. It seems like he wants to set precedent while acting like it's only for this one phone. Pretty sneaky if that's the case.

Personally I trust our right to express our views both in speech and through our votes. I trust that elected officials can represent those of us who voted and wish to be represented. The problem is we have not yet elected enough people who understand computers. And, that's not entirely our fault. It's a new field and there are not many people with exposure to it who can represent our views.


I honestly (and non-facetiously) don't get this. Aren't all of the above-listed giants the ones going out of their way to track us, spam us, and use our personal lives as objects of advertisement in the first place?

How or why are they petitioning for the government not to be able to do what they're already doing? Isn't that completely ironic? Is this a case of "it's okay if /WE/ steal your data, but we don't want the government to (so that only we have total access to it)"?

Someone more enlightened than me, please help me out.


It's quite simple.

Google, Facebook, Microsoft, Apple, et al. do not have the power to arrest you, and never will. They can't find out you said or did something in the past and put you in a gulag for it 10 or 20 years from now when you're considered a political enemy of the state.

It's the critical difference between economic and political power. Google has no guns, much less the all-powerful legal authority - and then some - possessed by the government.

Microsoft can sell you software. Amazon can sell you a blender or ebook. Apple wants to sell you a phone. The US Government can kill you, destroy your life, put you on a no-fly list, proclaim you a terrorist to be constantly surveilled to the highest degree, punish you for speaking out, sic the IRS on you, blackmail you by tracking every single thing you do digitally without any consequences to themselves and then use it against you at their convenience or for their benefit (Google execs would go to prison for that), and dozens of other things using their countless agencies (and if you happen to be a leaker or journalist, the context is that much more amplified). And that's just what they can do to you today, the fascism in America is blatantly going to keep getting worse, they keep reaching for more and more power. Tomorrow, the things you do today, will be held against you. There isn't an example of a fascist system in which that hasn't been the case.

Google et al. are the absolute least of your worries. The free market profit motive is predictable and functions at its most efficient under systems of high degrees of freedom (thus free market). Increasing political power of the sort going on in the US however is always violent and always trends toward an ultimate restriction of liberty. One need understand only the very basics of history to grasp that.

A quick look at what governments have done over just the last 200 years, versus what corporations have done, tells you everything you need to know. It isn't a close comparison. Show me one modern big corporation like Google that has done the kind of evil things that eg Mugabe in Zimbabwe has done for example - there's one simple example, and hardly the worst I could reference (how about Pol Pot?), and it demonstrates how clearly absurd it is to be frightened by the 'big bad corporations.' The notion we need to be afraid of corporations is almost entirely a myth, typically pushed by people that then turn an intentional blind eye to the endless murder, war and abuse by governments. It's power-seeking governments you should be absolutely terrified of.


The whole Apple/FBI fracas is about promoting smartphones as the centre of our universe: the confluence of medical, financial, and all other personal information. If Apple can convince us that smartphones are secure, there are lucrative opportunities for smartphone technology to become ubiquitous and pervasive in our lives.

Unfortunately, it will always be necessary for law enforcement to have access to bad guys' stuff. The Fourth Amendment guarantees your privacy, but police can break down your door and seize your financial and medical records if they have reasonable grounds to suspect you. Why should your smartphone be different?

The smartphone is not an inalienable right. We are confusing convenience with fundamental rights; technology with entitlement.

If you don't trust your smartphone to be impregnable, then guess what? Don't put your medical & financial info on your smartphone. There was an epoch when we actually did banking & medicine without smartphones.

Apple, unlock the terrorist's iPhone, and resign yourself to a reduction in sales.


> The Fourth Amendment guarantees your privacy, but police can break down your door and seize your financial and medical records if they have reasonable grounds to suspect you. Why should your smartphone be different?

This is a false dichotomy, simply because technology can create a situation where governments can't access a user's data without the user's consent. Apple's technology hadn't reached that point in this case (i.e. end-to-end encryption), which is why we're even having this conversation, but we've arguably passed that point as an industry, and that will shape the future legal environment.

It's entirely plausible Apple could create an iPhone that they can't unlock, or iMessages they can't read. Then it's no longer about coercing middlemen but about the US government vs. user consent. Outside of self-incrimination, this changes the legal question to be about unlocking every person's medical/financial records, not about unlocking a single person's.

The only path the government has is to coerce Apple into making backdoors or purposefully weakening their encryption for all devices, which affects every American's Fourth Amendment rights - as well as public safety.

Therefore this is not just about one person in a criminal trial - since a backdoor can never be made only for a single court case, it will by nature unlock the phones to any party who can create or get access to the backdoor.

So the only legal path for the government is either to coerce suspects into self-incrimination by forcing them to unlock their phones, or to prevent Apple/Google/etc. customers from being able to meaningfully lock their phones in the first place. The problem with the latter is that criminals/terrorists won't be forced to use the backdoored Apple/Google/etc. technology; they can use open-source versions with encryption to side-step law enforcement's efforts - making it ultimately ineffective as a legal strategy.


> The Fourth Amendment guarantees your privacy, but police can break down your door and seize your financial and medical records if they have reasonable grounds to suspect you. Why should your smartphone be different?

One problem with this analogy is that the police don't mandate that everyone's door be easy to break down and thereby make all houses easier to burglarize. Another is that you know when someone has broken down your door.

If phones must have back doors, those back doors will become known to others besides the US Government. Imagine if every time you traveled abroad, you knew that airport security could take your phone and get all the data on it, or maybe add malware.

> If you don't trust your smartphone to be impregnable, then guess what? Don't put your medical & financial info on your smartphone. There was an epoch when we actually did banking & medicine without smartphones.

You can say the same thing about laptops, or even paper files. "You're not allowed to take effective security measures to protect your info because we might want it" is a bad policy.


> but police can break down your door and seize your financial and medical records if they have reasonable grounds to suspect you.

But this isn't the police's right, it's our right(s) that we're voluntarily ceding to the government such that it can practically provide us our basic security.

> Why should your smartphone be different?

Because it is different. I can't backup a car, or encrypt one, etc. Traditional possessions didn't record your GPS location and didn't contain all your private and banking info and couldn't be turned into remote listening devices and so forth.

So far we're seeing traditional law enforcement having a pretty good record against even high-tech criminals. They caught DPR with legwork and correlating multiple tails to figure out who was who. Even though he took extreme measures most of the time a few mistakes took him down.

So, with the thought in mind that nobody fights harder than for their budget, we should have discussions with our security forces and reevaluate our needs; see just how much of the physical world's laws usefully translate and what needs to be revamped. Evaluate what we can reasonably hope to achieve at what cost.

The laws don't own us, we own them. The discussion needs to be "What should the laws say?".


No one builds doors to make them especially vulnerable to the police. Why should phones be any different?


It's still a muddle, and the tendentious Bloomberg headline does not help. Microsoft's position is extremely cautious, but they had to side with Apple because opening the "all writs" can of worms to compel software creation and signing binaries is going to jack up law enforcement compliance costs to unlimited levels. Gates, in the quote in the article, is also, still, trying to have it both ways: “The extreme view that government always gets everything, nobody supports that. Having the government be blind, people don’t support that.” Evidently he thinks conditional privacy is sufficient and that the issue can be separated from encryption bans and an "all writs" model that compels new things to be created at the whim of FBI agents.


This just makes the Gates story even stranger. Did the FT misquote him because they're incompetent, or do they have an ulterior motive at play?


It was stated at the end of the Bloomberg article that Bill Gates' position on the issue was misreported:

Earlier this week, Microsoft co-founder Bill Gates told Bloomberg Television he was “disappointed” by reports that he supports the U.S. government in this dispute, saying it doesn’t accurately reflect his opinion.

“That doesn’t state my view on this,” he said in an interview on “Bloomberg Go.” “The extreme view that government always gets everything, nobody supports that. Having the government be blind, people don’t support that.”


Note that the article doesn't actually go on to clarify or accurately state what his opinion is.



Sure, but then he's saying he was misrepresented by that?

So do the quoted questions and answers written on that page accurately represent what Gates said? Because if they do, then although he might not be 'siding with the government', it seems he sees nothing problematic with the government making this sort of request - either in compelling Apple to produce a version of iOS that allows brute-forcing, or in setting any sort of precedent for what the government can ask companies to do via the All Writs Act.

Actually, if the quotes in that article are accurate, it seems that Gates doesn't understand the issue at all, specifically:

> Apple has access to the information. They’re just refusing to provide the access and the courts will tell them whether to provide the access or not

Which is not the case here. Apple doesn't have access to the information and they will need to build a customised and security-compromised version of iOS in order to allow the FBI to attempt to retrieve that information.


I thought it was an alleged terrorist's phone? Or has innocent until proven guilty gone out the window?


"Innocent until proven guilty" only applies in a criminal trial. The man in question died in a shootout with police; he will never be proven guilty, because you can't put a dead man on trial. You can't libel or defame a dead man, from the point of view of the law.

There has to be a point where media can call someone a terrorist short of conviction in a court of law. You can argue that that point hasn't been reached in this case, but there has to be a point, since otherwise we'd have to call Bin Laden an "alleged terrorist."


As a very general comment as I do not know anything about the San Bernardino case, I think we need to apply extreme caution when using the word "terrorist" because of extreme, far-reaching legislation based on the broad definition of terrorism: https://en.wikipedia.org/wiki/Terrorism_Acts#List_of_legisla...


Good point. "Terrorist" is almost always misused anyway, since no one in the US fears being blown up tomorrow. The only thing to fear is the power of the government, its officers, and the low-level education of their thugs. The thugs with the law on their side are the ones who currently reign with terror - not like terrorists, but like Nazi law enforcement officers in 1936. Wow. I might have escalated quickly, but I'll leave it here to see what you people think.


Terrorists cannot be innocent. /s



This is all great, but one thing bothers me: they still comply with some government requests to hand over some data [1].

I know that adding a backdoor is different, but why not also protect for example emails on a server?

http://www.apple.com/privacy/government-information-requests...


Let's see, what's worse...

Giving access to data in supposedly limited cases to an authority that's supposed to do it for the protection of the people, or;

selling the data to numerous third parties, who further make money off the data, while there's absolutely no assurance the data isn't bought by the aforementioned authority anyway?


Isn't Microsoft working with the NSA? Is this just a publicity stunt?


Are you implying that Apple, Facebook, and Google aren't?


The skeptic in me wants to say they all are, but everyone is praising Apple for some reason.


Apple has to protect customer privacy. I oppose Apple giving any information to the FBI.


Hey look, they took the fear-mongering word "terrorist" out of the headline.

There's hope after all.


Plot twist: US companies and the US gov play a game whose only goal is to reestablish lost trust since Snowden to make more $$$ again and sell more hardware with US backdoors again.


Notice how the headline makes sure to specify "over terrorist's phone" and not "over user privacy" or "against federal conscription by the FBI."

Take a second and re-read the headline. What does it say? To me, it spells out Microsoft joins Apple to back terrorist's privacy against the FBI.

The government couldn't have chosen a better case to publicize in search of a precedent in their favor. And the media isn't helping.

Edit: In case it wasn't clear, I think it's actually disingenuous to mention the word "terrorist" in the headline at all. Nothing about why Apple is resisting (which is the crux of this news cycle) has to do with the fact that they are, bizarrely, fighting for this (dead) user (who is undeniably a madman murderer and potentially a terrorist)'s right to privacy.

Microsoft, Apple, Facebook, and everyone here in the comments isn't defending this person's right to privacy. It's 100% about the principle and the precedent and it's a million times about the future users and nothing to do with this horrible person. For that reason, it's unfair (in fact, you could call it purposeful misrepresentation) to say that they are against the FBI unlocking "terrorist's phone" or defending the "terrorist's privacy" in any way. What they are doing, if one excludes self-interest, is protecting the principle for everyone else out there.


I disagree. "Terrorist's Phone" is simply the most efficient way to identify the case. "User's Privacy" would mean absolutely nothing to someone who hasn't heard of the case as much as we have, especially considering most of Apple's marketing nowadays is through that lens…

> The government couldn't have chosen a better case to publicize in search of a precedent in their favor.

If you'll allow me to be pedantic, they could have; we can be pretty sure this phone has none of the evidence they want it to, and it's not like anyone at Apple will lose sleep feeling responsible for aiding terrorism.

But my point is, this is simply the headline that will get the most clicks. Any political bias in it is coincidental. (And of course, it seems reasonable for most people to agree that, in a vacuum, it would be OK for Apple to unlock the phone, as long as it never affected other users.)


I'm not sure that it's easy to dismiss political bias as coincidental given that the people publishing the article are professional wordsmiths who understand the impact of every word they choose.


There's an easier explanation.

"...around user privacy." Boring. No one will click that. "...against federal conscription." Sounds like a conspiracy rag. "...terrorist phone." Interesting! Let's click.


Reminds me of the rule "don't ascribe to malice what incompetence can explain."

Don't assume nefarious intent when there is a profit motive involved.


As someone that used to work as a "professional wordsmith," I think you vastly overestimate how much they care. Does it work? Do I hit my SEO metrics? Can I go to lunch now? And remember, headline writers often aren't the people who write the articles, and may not be people who have READ the articles.


Why not just "phone"? Terrorism shouldn't have anything to do with the case. I don't believe the fourth amendment makes a special provision for terrorists or other "scary" people. For a good reason—terrorism is a polemic term.


Like he said, for the clicks. It makes it seem like a more important case.

It's also more descriptive for people who may be hearing about the story for the first time. Personally, I call it the San Bernardino iPhone case.


"The Capitalists will sell us the rope with which we will hang them." - Lenin

Can't say I agree with him on much, but he had a point with that quote.


Actually, he probably never did say that.


How so?


Perhaps the very tactics currently sustaining the reporting industry will lead to its downfall.


If efficiency and pedantry are going to be invoked, wouldn't it be easier to have the headline read "unlock phones," since it turns out that Apple is currently dealing with multiple requests from the government?


Also that this is an order under the All Writs Act and not a search warrant. If I had a nickel for every time this is described in the press as Apple resisting a "warrant" or a "subpoena" I'd be able to make a seven figure donation to the EFF!


The order was issued to effectuate a search warrant for the phone: https://www.documentcloud.org/documents/2714001-SB-Shooter-O....


Right, but Apple's not challenging the warrant, they're challenging the related order.


It wouldn't make any sense for Apple to challenge the warrant--the government isn't searching its property.

The media is reporting this correctly: they're noting that the government is acting pursuant to a warrant to clarify that they're not asking Apple to assist with a warrant-less search.


It was also ex parte; Apple wasn't even there to represent their interests in the dispute.


The All Writs Act is what the government claims gives it the authority to obtain this order. This is the government's tactic for obscuring what is really relevant: mention a law that nobody will recognize, which amounts to nothing more than the authority to issue a writ.

A good detailed discussion on that subject is here https://www.youtube.com/watch?v=Cvh0YwpnPQo


I was about to post that this is disingenuous and that they are backing Apple due to the case at hand, which a software person knows relates to the intrusion matter at large, but... the majority of the population only cares about headlines and political platitudes they can consume in 10 seconds or less. "TLDR" is not just a meme, but a sad reality.

[edit] As in some might interpret this to be: "Wow the headline says Microsoft supports Terrorism too!!!" instead of appreciating that the request has much farther reaching implications.


Those aren't neutral headlines either. The first presupposes that Apple's assistance with this specific phone will endanger all users' privacy. The second characterizes what is in most cases routine cooperation with legal process as "conscription."


Sure the case isn't perfect, but is it ever going to be, given that the government can choose it?

The details of this case, however, aren't too detrimental. Basically, a couple buys guns, goes next door, and starts shooting. It's then a terrorism case because... well, their name sounds Arabic and possibly the guy visited some internet pages.

For me, this case only demonstrates that if you make guns available for purchase in every supermarket, chances are that one day a mentally unstable person wakes up, buys one, and just starts shooting. And there's really not much you can do about that. But there isn't a smoking gun hidden here that would somehow make a strong emotional argument for compelling Apple to decrypt the phone of a dead guy.


At least it is Apple the press is attacking, they can survive this.

Any other company would be torn to shreds by the purposeful fear-mongering the press does because of the clicks and eyeballs it generates.

I am "pleased" to see Apple's lawyers arguing against conscription though. It was exactly my thought the first time I heard about this, with Apple being ordered to create something that didn't exist before, exclusively for the government. Kind of fits that 227-year-old law the FBI is trying to use.


As long as laws are not absolute truth, and are written by misguided representatives, and representatives are intimidated by the possibility of losing their power, and fear can be generated through misinformation, then when it comes to freedom of speech and thinking, we are all domestic terrorists by "definition" [1].

[1] https://www.fbi.gov/about-us/investigate/terrorism/terrorism...


Thanks for citing. You've mis-interpreted the cite. Here's what it says. Notice domestic terrorism requires all three characteristics, not just one, and the first is "Involve acts dangerous to human life...."

So you have to start by doing something dangerous to human life. Like cut down a tree, or give someone a glass of water, or sell them a ski pass.

"Domestic terrorism" means activities with the following three characteristics:

    Involve acts dangerous to human life that violate federal or state law;
    Appear intended (i) to intimidate or coerce a civilian population; (ii) to influence the policy of a government by intimidation or coercion; or (iii) to affect the conduct of a government by mass destruction, assassination. or kidnapping; and
    Occur primarily within the territorial jurisdiction of the U.S.


The case is very much about a phone. The case has implications for other phones, but the most direct point is about hacking one specific terrorist's phone.


True. But there are other also true headlines that are more descriptive.

"MS backs Apple in debate over phone unlocking"


Bloomberg changed their title, so we changed it too.


Thanks. I must admit I thought the original title a little inflammatory. The key point is the amicus brief supporting Apple, which in hindsight should have formed the title.


> I thought the original title a little inflammatory

In that case it's good to change it. From the guidelines: "Please use the original title, unless it is misleading or linkbait." Inflammatory is usually linkbait.

When changing a title to make it more accurate or neutral, it's best to use a subtitle or a representative phrase from the article.

https://news.ycombinator.com/newsguidelines.html


I clicked on comments to say exactly the same thing. I will not be surprised to learn later that the FBI was arm-twisting media outlets against Apple.

I need to read Orwell's books again. They are like a handbook for life in modern America.


To some extent. Some of us like to keep in mind that we've simultaneously realized Huxley's Brave New World in many respects--especially with the advent of smartphones, and the work we technologists do to keep people entertainingly distracted.


I concur. This is trivial to rewrite in more neutral language. Try this: "Microsoft Says It Backs Apple in Case Over Alleged Insurgent's Phone."


“Alleged insurgent” is a terrible description for “some crazy couple with guns”. They clearly weren’t part of any serious organized rebellion against the state (cf. any dictionary definition of “insurgent”), however much they had been inspired by ISIS or whatever.

How about just “San Bernardino shooter”?


What about a dead-man's-switch app on any phone, where if you don't type in the code within a time window definable by you, the phone is wiped?
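As a toy sketch (purely hypothetical - a real implementation would need OS-level hooks to securely erase the encryption keys, not an app), the idea might look like:

```python
import time

class DeadMansSwitch:
    """Wipe the device if the correct code isn't entered within the window."""

    def __init__(self, code: str, window_seconds: float):
        self.code = code
        self.window = window_seconds
        self.last_checkin = time.monotonic()
        self.wiped = False

    def enter_code(self, attempt: str) -> bool:
        # A correct entry resets the countdown.
        if attempt == self.code:
            self.last_checkin = time.monotonic()
            return True
        return False

    def tick(self) -> None:
        # Called periodically; fires once the window has elapsed.
        if not self.wiped and time.monotonic() - self.last_checkin > self.window:
            self.wiped = True  # in reality: erase the data-protection keys

switch = DeadMansSwitch("1234", window_seconds=0.1)
switch.tick()                 # within the window: nothing happens
time.sleep(0.2)               # let the window lapse without a check-in
switch.tick()
print("wiped:", switch.wiped)
```

Of course, as the reply below notes, deliberately wiping a device under investigation has legal problems of its own.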


Destruction of evidence, a felony!

Except the person in question is already dead, so there isn't much punishment the government can mete out here.


A few reporters, and law folks, might need to re-read To Kill A Mockingbird and think on what they're doing.


Indeed. s/Terrorist/Nutjob/g


Would "Terrorist User's" have been better?


No it wouldn't be better. 'user' would be better. That's literally what OP wrote. OP would like terrorist not mentioned in the headline. Why isn't that obvious to you?


"MS backs Apple in debate over phone unlocking"


I feel like the same thing here is happening to the "Uber Driver" shooting incident.

The media is using it in a way that just makes the article more recognizable and clickbaity.


Well, "over terrorist's phone" is enough to tell me which issue it is referring to. "Over user privacy" or "against federal conscription by the FBI" would not, and honestly feel more editorialized.


"over phone unlocking"

Anyone that has seen the case would know what its talking about. Anyone that hasn't can read the article. "over phone unlocking" is a more descriptive title for the case.


Nothing but a PR stunt.


At least we've come full circle on PR. Capitalism does often respond to consumer wishes. I tend to think the money I spend is the biggest 'vote' I have in this country.

Besides, Apple was notably missing in the PRISM program during Jobs' reign. I'm not as negative on Apple as some other companies.


This reminds me of China condemning Russia's annexation of Crimea.

I wonder why that is.


Isn't Microsoft who built PRISM?


I predict that Apple will win big and that their victory will be very public and greatly celebrated. The purpose of this public display is to convince criminals to buy iPhones.

They think they are safe. That's the whole point.


Hmm, I'm confused. Is the FBI asking to unlock just this terrorist's phone, or to set up a backdoor to all iPhones? If they are asking to unlock only this terrorist's phone, I don't see why Apple should be fighting a case over this.


The FBI is asking Apple to build a tool to backdoor this particular iPhone. Once the tool is built, it gets much easier for them to ask Apple to use it on other phones. "We'll only ask for it this one time, honest" isn't a believable promise - Apple winds up with a much less defensible position later, since they already have the tools necessary.


"The FBI has asked Apple to crack at least 17 devices since October" http://mashable.com/2016/02/23/fbi-apple-requests/


Think of it more like going to a company that builds safes - not asking them to blow the door open, but to invent some magical safe-breaking tool (which has never existed) that would then allow them to open whatever safe they want.

Never mind the fact that the safe company's success relied on the trust that no one can break into their safes in any way.


Couldn't Apple hand over the data from the terrorist's phone without creating a backdoor? I.e., making it an exceptional case rather than something that could easily be done again and again?

To be clear, I'm against the backdoor, but as an engineer I know that Apple could give the data back if they wanted. I guess it's bad PR from them to say so though.

I'm also very surprised at how open the FBI is about its inability to crack the phone. Or maybe this is also just for PR reasons and the phone is already cracked?


> To be clear, I'm against the backdoor, but as an engineer I know that Apple could give the data back if they wanted.

What do you know? How are you suggesting they can extract encrypted data that requires a passcode to decrypt, when no existing version of iOS allows electronic brute-forcing?

Tim's already attested that they don't know any other way to access the data: http://abcnews.go.com/Technology/exclusive-apple-ceo-tim-coo...


What does "cancer software" even mean? Seems like PR bullshit to me. I've got a hard time believing that Apple can't crack their own phone.

To be clear, do they mean that THIS phone is impossible to crack? I guess the part I don't understand is how creating a backdoor for THIS phone makes it so that ALL users would get a backdoor..? Couldn't they push the software just to that phone for exceptional reasons? If the FBI wants to get other phones opened, they'd have to get a court order by the law and that would be handled on a case by case basis. Similarly to how the FBI could get a court order to come in my house if they suspect criminal activities?

(Edited for clarity)


> What does "cancer software" even mean? Seems like PR bullshit to me.

Tim's addressing the wider public (i.e. not tech professionals) and explains what he means in context: it's bad software, something they would never write and should never be forced to write.

> I've got a hard time believing that apple can't crack their own phone.

You've said you know they can, and now you're saying you don't believe that they can't. So how exactly should they decrypt encrypted data without the passcode it was encrypted with?

> do they mean that THIS phone is impossible to crack, but that would let other phones be crackable in the future?

No, it's been stated a number of times, including in the Court Order itself, that Apple can create a new version of iOS stripped of the software protections, allowing the FBI to brute force the passcode electronically. That is the whole legal question: whether the FBI can compel Apple to write software that doesn't yet exist — a new version of iOS with a backdoor the FBI can exploit.

But they won't be able to do this with future iPhones that are hardware-protected by the Secure Enclave.
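Some back-of-the-envelope arithmetic shows why those software protections matter so much. A minimal sketch — the ~80 ms per attempt is the hardware key-derivation cost Apple's iOS security guide describes, and the other figures are assumptions for illustration; the escalating delays and the optional wipe-after-10-failures are the parts enforced in software, which is what the order asks Apple to remove:

```python
# Rough worst-case brute-force times once the software-enforced
# delay/wipe protections are gone, leaving only the ~80 ms that
# the key-derivation function itself takes per passcode attempt.

def time_to_exhaust(keyspace: int, per_try_s: float) -> float:
    """Worst-case seconds to try every candidate passcode."""
    return keyspace * per_try_s

PER_TRY_S = 0.08            # ~80 ms per attempt (hardware KDF cost)

four_digit = 10 ** 4        # 0000..9999
six_alnum = 36 ** 6         # 6 chars, lowercase letters + digits

print(f"4-digit PIN: {time_to_exhaust(four_digit, PER_TRY_S) / 60:.1f} minutes")
print(f"6-char alnum: {time_to_exhaust(six_alnum, PER_TRY_S) / 86400:.0f} days")
```

In other words, with the software limits stripped, a 4-digit passcode falls in well under an hour; only a long alphanumeric passcode, or the hardware-level rate limiting of a Secure Enclave, keeps electronic brute forcing impractical.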


>> You've said you know they can, and are now saying you don't believe that they can't. So how exactly should they decrypt encrypted data without the passcode it was encrypted with?

Sorry, I meant to say "As an engineer, I've got a hard time believing that Apple can't crack their own phone". Thanks for clearing it up.


If you ask yourself whether the contents of the phone could reasonably aid in saving other lives and bring about justice for those lost, then many would without a doubt say yes, make it public. Equally, if it were some politician's emails, would they be just as happy for Apple to deny access? Ask purely about privacy and you get very different answers from different people.

Privacy is important, but as with any rule there have to be exceptions, and I would say lives clearly at stake would be one such case.

But equally, given how precedent works, I can totally understand Apple's stance and those supporting it: in law, once the door is open, it's a lot harder to close. If better provision could be made on a case-by-case basis, without blanket abuse or some blanket erosion of privacy, then again many would be happier.

Alas, the way the government has approached this, they would in effect be handed a method usable not just on this device but on any other such device, with no need even to speak to Apple — and that, I feel, is the real crux.

We agree that terrorism, and indeed any crime that cuts short other people's lives, is not the kind of thing we want hiding behind encryption or law, whether by individuals or by governments themselves. But in a world where terrorism definitions and their associated laws have been, and are being, applied on a scale that transcends what a terrorist actually is, every use of the term to justify a law or action comes under growing public scrutiny and suspicion about how it will be abused.

So it's an area that is very much damned if you do and potentially damned if you don't. But for now Apple, as others equally say, is making the right moves — albeit I hope a fair and acceptable compromise is established, and no blanket precedent that does more harm than good goes through.


Wow, brilliant: Bill Gates comes out pro-government intrusion, and then the company backs Apple.

Update:

This was not meant to troll. I fell for the quotes mis-attributed to Bill and thought he actually backed the FBI. I stand corrected.


It looks like you're probably just trying to troll, but to head off further discussion: some of what Gates said was mis-attributed to him, and the very next day after the initial reports he disputed claims that he backs the FBI in the Apple dispute:

http://www.bloomberg.com/news/videos/2016-02-23/gates-disput...

A better troll might've pointed out the disparity between Microsoft supporting Apple's stance and the circumstantial evidence that Microsoft actively worked with the NSA to backdoor its services (Outlook.com, Skype, etc.)


Gates disputed those reports, but he never states he doesn't support the FBI in this case.

I think the key take away from that video is:

> Gates: With the right safeguards in place there are cases where the government, on our behalf [...]

He also dodges the direct question about his position entirely.

He's not making his position very clear, but he is certainly angling to support the FBI's side. Lots of references to terrorism in that interview.

Edit: from a PR perspective, this does make a lot of sense for MS: use a board member/someone closely associated with the company to probe the public's position, then give the public enough assurance that they don't all leave for Apple, while letting the state see that MS's support of Apple in this case isn't firm.


Maybe Microsoft would like the precedent set in favor of Apple so they can use it against overly intrusive requests for back doors in their products in the future. They may have acquiesced but that doesn't mean they liked it.


Also, Gates consults with Microsoft occasionally and is a large shareholder, but that does not mean Gates speaks for Microsoft, or that Microsoft speaks for Gates. In fact, they disagree pretty regularly.


He's on the board of directors, which is a bit more involvement than "consulting occasionally." I remember reading that he spends 1-2 days a week working on Microsoft business. That was around the time when Ballmer was leaving and Satya Nadella was taking his place so it may have been a particularly busy time.


I think that was more for the CEO transition stage than a permanent thing but I could be wrong.


He's a member of their board of directors and for a time was even the head of it. The board has more control than even the CEO so it's safe to say Gates is speaking for the company.


Was Marc Andreessen speaking on behalf of Facebook when he made those remarks about India the other day? No. Neither was Bill Gates speaking for Microsoft.


While it's your freedom to hold an opinion that is contrary to factual reality, I'm not sure how useful it is. What does it help you to misapprehend that Bill Gates is a spokesman for Microsoft?


More power, less control


As a founder and a member of the BoD, he's likely on par with Nadella in overall influence over Microsoft's strategy.

Don't downplay Gates' influence.


I think they're only backing it now because of the backlash against Gates. They sure took their sweet time, either way.


>They sure took their sweet time, either way.

Did they? Do we need to know every company's opinion on every controversial issue related to their business immediately?


Large enterprises are not startups; they don't get to have the CEO publish a blog post within an hour of news like this. PR departments are careful to craft messages and narratives, shareholder value needs to be protected, and legal bases need to be covered.



