Apple to Close iPhone Security Hole That Police Use to Crack Devices (nytimes.com)
297 points by aaronbrethorst on June 13, 2018 | 232 comments



It's frustrating that you never see the pro-unlocking crowd actually look past their own noses. Like back when you could bypass a Linux login screen by pressing backspace 28 times. Obviously that's a problem, but could you imagine if police departments all over the country got pissed when that bug was fixed and started complaining that they couldn't use the exploit themselves? As if they're the only ones smart enough to use the exploit and nobody else would ever do such a thing for malicious reasons. It's so shortsighted.


I just looked this up and to save others the time, this was an integer underflow bug in the Grub2 bootloader's password protection[0], not Linux itself. Cool example, though.

[0] http://hmarco.org/bugs/CVE-2015-8370-Grub2-authentication-by...
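For the curious, the bug class is an unchecked decrement of an unsigned length counter. Here's a minimal sketch in Python, emulating C's unsigned wraparound (Python ints don't wrap, so the modulo is explicit; the real code is C, and the exact memory-corruption path is in the writeup above):

    # Sketch of the CVE-2015-8370 bug class, assuming a 64-bit unsigned counter.
    WORD = 2**64

    def handle_backspace_buggy(cur_len: int) -> int:
        # No bounds check: in C, 0 - 1 on an unsigned counter wraps to 2**64 - 1.
        return (cur_len - 1) % WORD

    def handle_backspace_fixed(cur_len: int) -> int:
        # The fix: only decrement when there is something to erase.
        return cur_len - 1 if cur_len > 0 else 0

    cur_len = 0
    for _ in range(28):                  # 28 backspaces at an empty prompt
        cur_len = handle_backspace_buggy(cur_len)
    print(hex(cur_len))                  # huge value, later used to index the buffer
    assert handle_backspace_fixed(0) == 0    # patched behavior: stays put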


C'mon. If it's part of a Linux distribution it's part of "Linux".


That's misleading even if you're not a follower of Stallman. Lots of operating systems use Grub that aren't Linux-based.


There's a substantial difference between GRUB password protection, a feature which I've never seen anyone use in practice, and the ubiquitous login prompt.

The OP specifically said the login prompt was defeated by backspacing alone.


Reminds me of the bug in Solaris where you could get root by essentially entering a username and mashing the keyboard a bit:

https://groups.google.com/forum/#!msg/muc.lists.bugtraq/5zYU...


Not really. You can use Syslinux in place of GRUB (in fact, I was when this bug was first appearing in the news). I also think it's bad practice to generalize things to that point. There is of course the obvious issue of Linux just referring to the kernel, making it hard to talk about just the kernel if you get into the bad habit of referring to many things by that same name. More importantly, though, if there's a bug in GNOME or your file manager, it feels very wrong for someone to say it's a problem with "Linux". I think this distinction matters more due to the modularity of GNU/Linux and all the choices of programs for basic tasks.


Tell that to Richard Stallman, who will go ape if you call GNU tools “Linux”


[flagged]


The “prick” isn’t necessary: you could just as easily have said that he can be pedantic and left it at that.


Exactly. Always apply the DRY principle.


He's an absolute visionary. Can you imagine how much you'd be paying to access a computer without him?


Last time I checked, you can still force single-user mode at bootup, and reset the root password.


Only if you don't use full-disk encryption; without it, the entire filesystem, including passwd/shadow, is editable.

In fact, the same principle can be used to reset and extract Windows user passwords. Something I did many times as an IT support technician.


chntpw has saved my bacon so many times I can't count


Sure. But how prevalent is FDE?


FDE is the norm for desktop/laptop users; we're well beyond that in the GNU/Linux world.

The exception these days is leveraging secure boot and the TPM to ensure the kernel and initrd being booted and asking for your dmcrypt password can be trusted. Making that the standard is our next challenge.


Maybe it's just my experience, but I haven't seen FDE be the norm within any of the companies I've worked for, and I've been everywhere from startups to a Fortune 15...


I had an encryption requirement at both Oracle (for Windows) and Canonical (for Linux). For Windows they used some endpoint protection suite; for Canonical the functionality is built into the Ubuntu installer.

Also, since I own a Mac as my primary laptop, I've just always used FileVault there, and it helps me sleep at night. It means I'm not concerned if it gets stolen; my derpy photos won't be in someone else's hands. I don't care so much about the hardware.


For Ubuntu, are you referring to LUKS or /home/user encryption?


If you want protection from someone booting up your computer in single/rescue mode to read/modify files you'd have to encrypt the entire disk (using LUKS, for example).


Doesn't matter, both of them are offered by the OS installer and are easy to set up.


FYI: only full disk encryption is available now in 18.04:

"The installer no longer offers the encrypted home option using ecryptfs-utils. It is recommended to use full-disk encryption instead for this release." https://wiki.ubuntu.com/BionicBeaver/ReleaseNotes#Other_base...


Are you even talking about GNU/Linux? Most of the comments in this thread have been about MacOS and Windows, and I made no claim regarding those.

FDE with LUKS/dmcrypt has been an out-of-box installer-supported mode in all the major distros for a long time now.


> FDE is the norm for desktop/laptop users; we're well beyond that in the GNU/Linux world.

Huh? The norm? For what sorts of users?

I mean, that's best practice, sure. But hardly the norm. Even, I bet, among HN users.



It's certainly not "on by default" for all Windows users. Twice in the last month I've broken into machines for clients whose users forgot their passwords, both times using nothing more than a USB stick (then replacing utilman.exe with cmd.exe).

Both boxes had fairly recent hardware and were running Windows 10 Pro.


So maybe those two users opted out of FDE, or had company requirements that told them to.


They didn't opt out of FDE and there are no company policies/requirements either.

I have three laptops on my desk right now, one HP Probook and two different Lenovo Ideapads. All three are running Windows 10 Enterprise (2x LTSB 2016, 1x 1803).

NONE of them have FDE enabled or have ever asked me about it.

Given the bold claim "[FDE is] on by default for (...) Windows users" - without mention of caveats re: login account types, domain memberships, or hardware requirements - it seems the counterexamples just keep coming.


It's only on by default in Windows if your computer has a TPM, is InstantGo compatible, and is signed in with a Microsoft account. I don't think this combination is very prevalent.


It's the norm for GNU/Linux users who run it on the bare metal of their laptops, in my experience. I haven't seen a laptop boot a linux distro without a LUKS passphrase prompt in years.

I can't say the same for Windows and Mac users.

Last two employers recommended FDE but made no strict requirements because apparently it was non-trivial for non-Linux users.

It's a no-brainer feature for any sufficiently fast portable device. If the installer supports it why would you disable it? I haven't had to worry about the data on my laptops should they be stolen for what must be over a decade now, GNU/Linux has supported it that long.


OK, I'm impressed. But I think that we've been talking past each other. When you say it's the norm, you mean among your peers, right? Technical people. Start up, enterprise, etc.

Me, I've been talking about PC users, generally. Not just Linux, and not just technical people. Sadly enough, I doubt that FDE is very common, let alone the norm.

And even worse, for many it'd likely be a curse. Inexperienced people don't do well at keeping track of complex passphrases, keys and so on. I've seen that over the years in forums where people plead for help to access TrueCrypt volumes. After losing passphrases, accidentally formatting, and so on.


The context of this thread was Linux and being able to circumvent the boot process via something like init=/bin/sh in the boot parameters. Someone pointed out that strategy doesn't really work with FDE, which was challenged as being uncommon.

I am only speaking to the prevalence of FDE among Linux users; it's not uncommon.

You're correct in that these are technical people. Most people running a GNU/Linux distro on bare-metal in their possession are at least somewhat technical, wouldn't you agree?


I can't speak for others, but I always enable it on any installation I perform on my devices. Ubuntu is nice in that it makes it trivial to enable during the installation process.


In old Windows versions you could access user controls through the help menu and make yourself a new user to log in.


An issue which wouldn't exist if Apple provided access for people with warrants.


The problem is, how do you build a back door that can only be used by people with valid (and just) warrants?

You really can't go down your line of argument without answering that.


> The problem is, how do you build a back door that can only be used by people with valid (and just) warrants?

That's an interesting technical problem. A lot of people dismiss it as impossible out of hand, but they are underestimating what you can do with modern cryptography.

You make it a distributed back door that requires several independent third parties to cooperate to use it.

You choose the third parties so that no particular warrant seeker will be able to get enough power or influence over enough of the third parties to force them to enable the back door unless they think the use is valid and just. (You can actually design it so that the warrant seeker does not know who the third parties are).

Some aspects of this may be too complicated to be practical, but it is not impossible.


No. Having worked on crypto, I feel confident saying that the smart people, especially those who do understand crypto, know and state that it cannot be done.

All solutions fundamentally boil down to someone having the keys required to decrypt all your content. This is ignoring the other technical costs (now your service has to record all ephemeral keys as well).

It doesn’t matter /who/ has that decryption key, the requirement is that access to the key is guarded only by policy.

You also resort to “with a warrant”, but an NSL is effectively a warrant, and I’m fairly sure the DPRK has “warrants”. Who gets to request those keys? The position of the US government is that it has the right to access data from people in other countries, so can it provide a warrant to access that data? Can Germany ask for the data of someone in the US? What if someone travelled through another country; can that country now access those keys?

Note these are all questions regarding /legitimate/ access. I’m completely ignoring police officers looking up people they stopped and stealing nudes, or people stealing the information and using it to steal money, stalk, etc.

I am sick and tired of people who say “modern cryptography is amazing, so it must be able to fulfill my unicorn dreams”. Either stop claiming nonsense or prove that it is possible by designing a system that does what you say must be possible.


Briefly, use a secret sharing protocol to distribute the device key to 500 lawyers around the world from countries other than the country the device holder resides in, with 400 key shares needed to reconstruct the device key. A requester can get the device key if he can convince 400 of the 500 lawyers that his request is legitimate according to whatever the criteria was that the lawyers agreed to when they agreed to become key share holders.


Which lawyers? Do those lawyers have the competence to handle securing such a key? What happens when those keys are compromised? What counts as a valid legal order? The US already has warrant exclusions for “emergencies”; how careful will these lawyers be when they’re getting thousands of requests a day? What happens when they get an NSL?

What happens when you’re investigating a criminal act sponsored by one of the governments you supposedly trust? What if they compel the lawyers in their countries to not provide the keys?

I generally dislike “what if” but given the problem space you do actually need to explain how your system works, and how it resolves these problems.

Remember there are plenty of countries with terrible human rights records, but they also still have regular crimes happen.


Your argument fell apart when you said "secret". What do you do to prevent widespread abuse when the secret gets out?


I'm sure that's a reference to Shamir's Secret Sharing Scheme,[0] which is hardly a secret.

0) http://point-at-infinity.org/ssss/
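The math is compact enough to sketch. A toy k-of-n split in Python (3-of-5 here rather than the 400-of-500 proposed above; purely illustrative, anything real should use a vetted library):

    # Toy Shamir k-of-n secret sharing over a prime field. Illustrative only.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, comfortably larger than the secret

    def split(secret, n, k):
        # Random polynomial of degree k-1 with the secret as constant term.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def combine(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    shares = split(123456789, n=5, k=3)
    assert combine(shares[:3]) == 123456789   # any 3 of the 5 shares suffice
    assert combine(shares[2:5]) == 123456789

Fewer than k shares reveal nothing about the secret, which is what makes the "400 of 500 lawyers" threshold at least mathematically coherent, whatever its practical problems.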


per-device key.


Who holds the per-device key? If someone holds all the per-device keys (or parts of them), you've painted a bullseye on their face. In all honesty, I cannot trust the US government with secrets like this because I don't see us being willing to secure them at the same level that we secure our nukes. And dear lord, even that level of security likely won't be enough to fend off internal and external threats.


Lawyers? Gross. You didn't address what you do when the key is leaked, and then worse you propose to give decision power to scummy people. Not to mention that some of your worldwide lawyers will be sharia "lawyers" and such.

Then when people point out the problems, you don't answer. You make a lot of assertions that are flat out wrong.


What law would those lawyers apply and who would select them?


> It doesn’t matter /who/ has that decryption key

Of course it matters. Apple already has nearly total power to modify their devices, and are clearly acting with caution. American courts are not going to give a warrant so an officer can steal nudes. There is no a priori reason that your phone should require a greater kind of warrant than your home or your bank account, and wanting zero access at all only makes sense if you think law enforcement has negative utility.

Personally I'd do something more like this: print one code on the inside of each device, so that access is strictly harder than physical access; store one code internally, as with your other mission-critical keys; and distribute one key each to choice governments, inside a physical module, so as to prevent clumsy storage and reduce the chance of cloning (since singly-owned keys can just be revoked if stolen), perhaps with some protocol requiring multiple members to agree, if you are particularly paranoid. That way you can only get access if Apple wills it (either given a warrant or otherwise coerced), the court wills it (either a genuine case or simple corruption), and you had physical access to the device. Apple could not then backdate court keys, so China could only coerce them into providing access to future phones, exactly as it is already in a position to do; though of course OS updates mean they already effectively have a backdoor.
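A sketch of the key-splitting part of that idea (the names and the KDF are mine for illustration, not anything Apple actually does): derive the unlock key from all three secrets, so a single party, or a single stolen key, is useless on its own.

    # Hypothetical three-party unlock-key derivation. Conceptual sketch only.
    import hashlib, hmac, os

    def derive_unlock_key(device_code, vendor_key, court_key):
        # HKDF-style extract-then-expand; omitting or forging any one input
        # yields a completely different (useless) key.
        prk = hmac.new(b"unlock-v1", device_code + vendor_key + court_key,
                       hashlib.sha256).digest()
        return hmac.new(prk, b"fde-wrapping-key", hashlib.sha256).digest()

    device_code = os.urandom(32)  # printed inside this one device at manufacture
    vendor_key  = os.urandom(32)  # held by the vendor like other critical keys
    court_key   = os.urandom(32)  # sealed in a court-held hardware module

    k1 = derive_unlock_key(device_code, vendor_key, court_key)
    k2 = derive_unlock_key(device_code, vendor_key, os.urandom(32))
    assert k1 != k2   # a forged court key gets you nothing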


Except of course you need the device passcode to install an update. So no, Apple can't just force an update.


But... why? I am not a criminal and no one but me should ever be able to access my device.


Because murderers, kidnappers, and murdered victims also use phones. The article includes commentary from an Indiana State Police task force which claims to have used information from warrants on phones to protect children in some manner.


So? Murderers and kidnappers will benefit from being able to access their victims' phones; after all, the phone was already weakened to support such access.

Also: murderers and kidnappers don’t have to use broken cryptography. So again the only people who are harmed are people who aren’t criminals.

Also the statement from the Indiana police is extremely vague, and doesn’t make any details available about what they were doing.


Knowing how our government works they would just get tired of leaving each other voicemails and put their private keys on a network share or something.


>several independent third parties to cooperate

Your foolishly conceived system doesn't have any way to ensure that the several parties are not corrupt or evil. You would need to solve that problem first. Assuming it is solved does not work.

In the meantime, devices need to be designed to protect the human rights of users.


The same way a storage facility can give keys for specific units to people who present valid warrants to search those units.



You centralize ownership, the same way it has been done since the dawn of the internet. Even Intel can securely update your microcode.


The problem is the only ones you can really trust in this case are Apple, just like only Intel has the keys for microcode. Give out software to every police department instead and there'll be a hack out in the public the same week.


Yep, and private keys are never compromised.

Aside from the many, many known cases where they are: http://legacydirs.umiacs.umd.edu/~tdumitra/papers/CCS-2017.p...


As best as I can tell, if you don't trust OS or microcode updates, you can't trust your phone at all unless you airgap it or somesuch.


Very true. Everyone should practice TNO: Trust No One. As far as I can see it's the only way to really operate in the world of computer security that we live in.


Except, since the dawn of the internet, this has never really worked without issues.


I'm willing to bet that Apple can handle a single secure key in low-bandwidth interactions with law enforcement. This is not rocket science; it's not shared, it's not exposed to the web, it's not interacting with complex components. It's just a key.


The technical aspects aren't the problem. The political and social aspects are.

What does Apple do if they have this capability and China puts pressure on them to decrypt some information on Chinese "dissidents"? China has incredible leverage over Apple, seeing as how all their devices are built there. Would Apple dare resist these requests from Chinese national law enforcement agencies?

Better to not have this ability at all, and thus never be put into this precarious situation.


Interesting that you looked to China for an example of a bad outcome. Ordinary police here in the States already misuse the privileged access they've been given, such as to supposedly private personal details, to stalk romantic interests, personal enemies, business competitors, etc.[1]

1-https://apnews.com/699236946e3140659fff8a2362e16f43/ap-acros...


> Better to not have this ability at all, and thus never be put into this precarious situation.

Apple has this ability now, by installing an OS that allows brute-forcing the PIN.


Apple can't do this passively/on demand, as upgrading the OS or Secure Enclave requires the user passcode. Of course a new OS/Secure Enclave firmware could remove this restriction, but generally these sorts of requests are reactive.


> The technical aspects aren't the problem. The political and social aspects are.

With this I agree completely.


Just to add: looking at the current political state surrounding the Russian collusion investigation, it's not a stretch to think that Trump would use this against his political enemies or the "fake news media". And look at what his supporters have done with the private text messages of FBI agents who had an affair. Trump isn't the first of his kind, nor will s/he be the last.


Why would they ever want to?

Imagine the conversation if Veedrac was running the show at Apple.

Saudi Police: "We need the key to this user's phone because they are suspected of having a video of their own gay sex and we want to prove it and if they are guilty of gay sex, we will give the user 60 lashes and then execute them and their partner by chopping off their heads with a sword."

Veedrac: "Oh, OK, yes here you go, here is a single secure key on a one time basis so you can get the user's video and kill them."

Do you really think this is a great scenario, that Apple wants to actively enable by spending money and effort on their designs to make it happen?


Well I don't think I've ever been caricatured as homophobic before, that's one for the books.


The comment wasn’t saying that you are homophobic. Just that you were being foolishly naive in advocating a system that would enable evil regimes including, for example, homophobic ones, to more effectively practice some of their most egregious human rights violating behaviors.


And did you actually think about how such a risk might be handled, instead of (as I suspect you did) immediately jumping to strawman and slander the other side? It took me roughly 30 seconds to think of an approach that prevents overreach from foreign powers, I strongly suspect Apple can afford spending 5 minutes.


Foreign powers aren’t the only problem. Leaked keys, domestic powers, small town corrupt sheriffs, corrupt US Presidents, are also problems. You think you have managed in 30 seconds to overturn the thinking of the entire credible security community.

Let’s hear your approach then. Don’t be coy.


I have given a suggestion elsewhere in this thread. This will be my last post here, though.


The suggestion where others already poked holes in it and pointed out the obvious flaws for you... ok then Veedrac.


If only it were that simple, and Apple were the only ones to trust. History shows us that even the best intentioned key sharing is vulnerable.


Except that countries like Russia, China et al can demand that Apple hand over those codes. Much better that Apple is not in the business of being the gatekeeper for hundreds of millions of users.


They can already demand the keys to the kingdom.


It's a lot easier to say no if there are no keys to the kingdom.


Exactly.

And having more mechanisms by which they can access user data is not something that ever should be encouraged.


> if Apple provided access for people with warrants

Doesn't this face the TSA Master Key problem?


No, because Apple isn't incompetent, and is long-used to securely handling customer data.


The "TSA Problem" is not that they master-key manufacturer is incompetent...quite the contrary.


This is incorrect. I've personally experienced Apple leaking private data. They do well, but they aren't infallible.



> access was later revealed to have been gained via targeted phishing attacks


https://www.ibtimes.co.uk/icloud-accounts-risk-brute-force-a...

"Last September, Apple said it had made changes to iCloud security and introduced a measure to stop software from making multiple automated guesses. While this is the case when trying to log in on a computer, unlimited guesses can be made using an iOS device, which is what this software pretends to be when accessing iCloud from a computer running it."


That could have just been someone nicking Harvey Weinstein's phone. I don't buy the "we were all hacked at the same time by a scary unknown exploit" story.


Apple is not the owner of the devices, they just manufacture them.


Well then who is the owner? I certainly don't feel like I own my iPhone, I can't even run my own kernel on it!


Facts over feelings. I can’t (or should I say the bank can’t) install a kernel on my house... don’t they own it? Basing ownership on the ability to install a kernel seems like quite a few of my friends own nothing.



What sort of warrant process would Apple need to allow for with say, the Chinese government wanting to pull information out of a foreign national's device who is visiting the country? Or for that matter, is not visiting the country?

The problem is political, not technical. Apple can't be a fair intermediary for all the various governments and police departments of the world, so they use technology to cede control.


> and all the kids we can’t put into a position of safety

"For the children!" They can't think of a good reason to have access to all these phones so they blantantly use an idiom so tired that it's practically a joke.


I read that too, but it's quoted from a guy whose job is to protect children. The full quote:

> “If we go back to the situation where we again don’t have access, now we know directly all the evidence we’ve lost and all the kids we can’t put into a position of safety,” said Chuck Cohen, who leads an Indiana State Police task force on internet crimes against children.


>> all the evidence we’ve lost and all the kids we can’t put into a position of safety,

There are two types of child abuse imagery: the new and original material that points to a kid currently being abused, one who can be rescued, and the enormous mass of old material that forensic investigators have seen literally thousands of times before. Actual new abuse material that could lead to the rescue of a child is thankfully very rare. While these phones could lead to arrests, the likelihood of them leading to the rescue of a child is negligible.

Once upon a time the bulk of images on phones were originals taken by the phone. Now "phones" are really just internet machines and the images they are looking for are essentially browsing history and stuff saved from online sources.


> While these phones could lead to arrests, the likelihood of them leading to the rescue of a child is negligible

I think you're ignoring some aspects of how the US criminal justice system works. Arresting a child abuser is one thing; they have to be tried in court and found guilty by a jury/judge. Criminal cases especially have high requirements to prove guilt, since they are very serious charges. So even if unlocking the phones may not help save actual children from abuse, I think what the guy means is that you can get more convictions for these child abusers who have been arrested and prevent future children from being abused.

Note that I'm not stating an opinion about the ethics of reducing security of phones, only pointing out what the person meant when he said that doing so prevents child abuse.


Then he would not have described moving children to places of safety. He would have said something more like preventing future abuse. Talk about physically moving children is rather dramatic and specific.


Well, while there is an enormous mass of old material, your claim that new material is very rare strikes me as odd.

The technology to create digital images and videos has gotten cheaper. Securely transmitting and storing media, and securely communicating about it, has become more accessible too.

I'm not advocating for or against iphone unlocks (or the generic abstract backdoor), but I think it is dangerous to think that serious crimes of this nature are extremely rare, or somehow far less common than they were in the past.

On the flipside, if you are right - and identifying material that saves children from this kind of crime is indeed a rare event - then isn't a possible reason the ready encryption of such devices? Food for thought.


Child porn is "the new crack". It is everywhere. From celebs to teenagers to game devs, all sorts are busted every day. But when kids are rescued it generally makes the national news.


Remember, in general the police aren't tech savvy, and not very bright in many cases. They don't know that a tired trope like this merely identifies themselves as idiots.


Or rather, they know that a large number of people don't see it as a trope and will be genuinely moved by it. In this bubble, not really, but a lot of people aren't so well educated or skeptical about motives.


Well, to be fair to this specific instance, their department handles internet crimes involving children, so their job basically is "for the children!"


I think the fact the person using that idiom works in the Indiana State Police task force on internet crimes against children kinda makes it relevant.


I highly recommend this 2016 paper by Stephanie K. Pell: "You Can’t Always Get What You Want: How Will Law Enforcement Get What it Needs in a Post-CALEA, Cybersecurity-Centric Encryption Era?".[0] She’s a former prosecutor from Florida, who now teaches at West Point.

She agrees with security experts that maintaining such lawful access, against pervasive "strong" encryption, would require the introduction of vulnerabilities, such as backdoors or key escrow. Which would expose users to malicious adversaries. She argues, basically, that law enforcement has become lazy.

She also raises the possibility of lawful hacking for smartphones, "infecting them with malware capable of capturing voice communications and keystrokes before they are encrypted." That brings to mind the FBI’s use of network investigative techniques. And of course, all those NSA tools.

0) https://scholarship.law.unc.edu/cgi/viewcontent.cgi?article=...


Doesn’t it seem strange that they’re not patching the actual exploit, only mitigating it? Do they have no idea what the actual bug is? Is the usb interface fundamentally insecure?


Yes! I'm very curious whether they've managed to figure out the actual vulnerability. It's possible that they have, but they're concerned about similar bugs leading to the same situation, so they're just shutting down the whole approach.


This approach mitigates the class of vulnerability, neutering the effect of this one and any similar future vulnerabilities.

This approach makes sense, since they do not know what this specific vulnerability is.


> since they do not know what this specific vulnerability is

How do you know this? I'd be shocked if they don't have one or more of these devices themselves and have it completely figured out.


No way would the makers of the device sell one to Apple - they probably have strict measures in place to only sell to police departments, with contracts in place to prevent re-selling.

Apple would have to buy one on the grey market, which they may be unprepared to do


> Apple would have to buy one on the grey market

No, they would have to pay someone to do "research" for them and figure out the vulnerability. They would pay that person enough to buy one on the grey market and figure out how it works, keeping their hands clean.


* after one hour of inactivity.

The vulnerability may still be present in the allowed timeframe. This could just lead the investigators to carry around portable cracking devices to use at the earliest moment they can. A kind of technology arms race?


> “They are blatantly protecting criminal activity, and only under the guise of privacy for their clients,”[Hillar Moore, Baton Rouge District Attorney] said.

I understand a Law Enforcement point of view of having access to private data in order to prosecute criminals. I disagree with that point of view, but I would never say that point of view is a guise to implement a surveillance state.


That quote jumped out at me too.

Like, does he actually believe Apple is using "privacy for their clients" as an excuse to accomplish their true goal of protecting criminal activity?


It sure looks that way. 'If you've got nothing to hide you've got nothing to fear' must be gospel for these people.


At what point does a state become a "surveillance state" in your opinion?


There isn't a single point for "surveillance state".


This article is written in a terribly slanted style, bringing up constant adversarial comparisons between Apple and law enforcement. There is an objective point of view in the article, but it bundles in a lot of quotes and references that are highly slanted.

I am not someone who assumes all NYT articles are slanted, but this one is bad.


Why do we only hear stories like this about Apple and the iPhone? How secure is Android? Is Google taking the same approach to protecting their users?


>Why do we only hear stories like this about Apple and the iPhone? How secure is Android?

When the story broke in 2016, only about 10% of Android phones were encrypted vs about 95% of iPhones.

https://arstechnica.com/gadgets/2016/03/why-are-so-few-andro...

At that time, Apple had included dedicated hardware in their SoC to handle device encryption for several generations, but enabling a software implementation of device encryption on Android caused performance penalties.

>there's a very significant performance penalty that comes with enabling FDE, with a 62.9% drop in random read performance, a 50.5% drop in random write performance, and a staggering 80.7% drop in sequential read performance.

https://www.anandtech.com/show/8725/encryption-and-storage-p...


Google is definitely starting to take security more seriously with the Pixel: https://www.blog.google/products/android-enterprise/how-pixe...


I’d argue they, as a company, always took security pretty seriously, but never privacy.

FDE arrived pretty late and performed poorly on early Android, among other issues.


I would say Google has always taken the security on the Pixel very seriously. It's also the reason why the Pixel phone was unhackable at Pwn2Own 2017 while the Chinese teams had their way with the iPhone.


You hear about Android security problems all the time. There just isn't a single open (now fixed) flaw like this on Android to write about. There are surely specific hacks for individual devices, and those probably get covered, but not as "Android" things.


Apple is moving into the AR space, where privacy matters, and they're working that image into their brand.

The structure of Apple's revenue sources makes it a surefire bet for them in a way it isn't for Google.


What about AR makes privacy matter? Apple has kept the same policy on privacy since way before it meddled in AR (and realistically almost no one uses AR, it’s photos/messages that people want private).


Using AR would mean everything you do is recorded to some extent, even though every effort is made to secure it; for instance, to make queries on what you're seeing.

If everything you do in your daily life is recorded that is a significant privacy issue. More so than your private photos/text messages and GPS coordinates alone.

Apple is working on AR glasses, and is significantly promoting AR SDKs with their newest iOS releases. It's very easy to see why they take this privacy stance today.


> The Indiana State Police said it unlocked 96 iPhones for various cases this year, each time with a warrant, using a $15,000 device it bought in March from a company called Grayshift

And what were the results? How many people did those 96 iPhones allow Indiana to bring charges against? In how many of those cases did Indiana prevail? And in how many of those was the evidence on the phone necessary?


These are the wrong questions to ask. There's no doubt that total surveillance would result in more crimes being solved, and more criminals being successfully prosecuted. It's not a question of whether the technique is effective enough that it should be allowed.

The question is can the government be trusted with a backdoor into our personal devices that "only they" can use? Should the people trust their government to only use that access lawfully, and can the people trust their government to protect that access from unlawful outsider access?

Since we've seen nothing but incontrovertible evidence, throughout history and to this day, that government cannot be trusted with this level of access to our personal devices (lives), then I can only hope that Apple and companies like it will fight to provide us with secure devices, and that our courts will protect our right to strong encryption to protect our personal data.


Actually, for practical purposes, your question is the wrong one. A lot of people think the more ideologically pure argument is the one to make, but it's usually the one with the least impact and least likely to bring about your desired situation.

"Right" and "Wrong" are thrown out the window as soon as law enforcement makes the case that there is simply no time for the constitution. They're saving little tiny babies! but "OK, how many little tiny babies are you actually saving as opposed to blatant overreaches into the lives of private citizens" is the more effective point. The numbers will look bad, they can't help themselves once they get a whiff of some new power.

In this case, arguing for the _idea_ of liberty (incorrectly) labels you a pedant more concerned with red tape than being a hero.


>they can't help themselves once they get a whiff of some new power

Semantics, but I don't think it is necessarily the "power", so much as it is a new vector to get the evidence they feel that they need to perform their job with less effort. It is a lazy method that circumvents the laws that stand in their way for good reasons.


> I don't think it is necessarily the "power", so much as it is a new vector to get the evidence they feel that they need to perform their job with less effort

I am inclined to agree with you, but I have no evidence. Hence the questions. HN is well aware of the risks. But I've never seen any work done into the potential benefits. If the benefits are slim, then the discussion is moot. If there are cases that can be solved with phone data only, and if those cases are horrible or frequent enough, then there is a valid debate at hand.


No laws are being circumvented if the police have a warrant. The problem (from a law enforcement perspective) is that technology now makes it easy for people to communicate and store data in a way that the police can’t monitor even with a warrant. This wasn’t an issue when the police could obtain warrants to monitor telephone lines and break into safes.


>"Right" and "Wrong" are thrown out the window as soon as law enforcement makes the case that there is simply no time for the constitution. They're saving little tiny babies! but "OK, how many little tiny babies are you actually saving as opposed to blatant overreaches into the lives of private citizens" is the more effective point.

This. Freedom isn't free.

"How many kids have to die before you give up your..."

"All of them you twat!"


> There's no doubt that total surveillance would result in more crimes being solved

I guess I'm doubting that. In any case, we're weighing costs and benefits in a limited context: access to a phone's data with a court order in hand. Framing an attempt to understand the benefits as "the wrong questions to ask" is counterproductive.


But he's saying that even if the answer to your concern is satisfying, there are other factors which outweigh that benefit and should be considered.


> there are other factors which outweigh that benefit

How can you weigh a cost against a benefit when you have no idea how big the benefit is?


I can't say in the particular case of surveillance.

One generalized answer to your question is:

"if the cost is unethical" then paying it really isn't on the table for ethical people.

So the cost of doing something unethical is infinite, assuming that you need to remain ethical, and any benefit is necessarily smaller than the cost of losing your ethical status.

You might not agree that ethics are pragmatically important, but the point here is that it's possible to do a utilitarian calculus even if you don't fully understand the possible upsides.


Can I ask how you're doubting this:

> Total surveillance would result in more crimes being solved

With perfect and total information how would fewer crimes be solved?


When you’re trying to find a needle in a haystack adding more hay isn’t the solution.


I expect the rationale is that once governments have total access to phones, criminals will stop using them, and governments will still have the access.


Total surveillance doesn't guarantee "perfect and total information".


Or simply, in free societies sometimes the bad guys get away with things. Or maybe they're not the bad guys and the state has deemed them "bad guys".

And what's really pathetic is that societies don't have discussions about unintended consequences of laws, and that prosecutors don't have checks-n-balances on their power/authority.


> Or maybe they're not the bad guys and the state has deemed them "bad guys".

Fun conversation topic. A large proportion of people cannot comprehend such a concept.


Even if it _were_ possible to only have the “government” have access to our personal devices and information, why would anyone want them to have that access? There are things I want to share only with my family, and ideally, I should have the ability to do that comfortably in this age of IoT. We won’t solve crime by allowing the government to spy on us; at best, it will cause the crime to be diverted or changed in form.


I see this line of thinking a lot in tech circles, but you really have to divorce this issue from the digital world. Imagine the exact same argument is about a safe instead of a phone. The government would simply request a warrant and then work on brute forcing their way into the safe.

If you have the same objections when using a safe, the answer to the problem has nothing to do with technology, because you believe there is a fundamental flaw in the US criminal justice system. You aren't going to be able to consistently defeat the government by repeatedly trying to outpace them technologically. You have to instead change the laws that govern their actions.

If you think the rules should be different for a safe and a phone, you need to be able to explain why digital evidence should be treated differently than physical evidence.


The government is allowed to crack open a safe with a warrant. The government is allowed to crack open an iPhone with a warrant.

What does either of those things have to do with decrypting things found inside either the safe or the iPhone?

If the FBI found coded papers inside a safe, they could try to decrypt those papers, but couldn't compel the owner to assist them.

Security flaws are fair game for law enforcement. Secure encryption without exploitable weaknesses will probably defeat them even in the presence of a court order.


I consider the analogy to be: does a manufacturer have a legal obligation not to make a safe that can't be broken into by the government?


Compare and contrast the different levels of effort required to gain access to a safe vs. a person's iPhone prior to this update.

Authorities must first gain access to the premises with a warrant and then get a safecracker in.

Vs.: authorities (at an airport or just about anywhere) take the phone off the person, there is a layer of obfuscation as to whether a warrant has been obtained or is needed, and data is collected.

If you look at the history of digital data (i.e. Echelon etc.) there is a clear trend of vast data hoovering of any and all resources. This is not possible with physical data, as it leaves behind a more concrete trail when authorities go after it, and causes them to (generally) be more selective about what they are doing.


Absolutely. Breaking a physical safe and breaking encryption don't differ in the _kind_ of hoop you're jumping, just the height you have to clear. But the encryption hoop is (almost) infinitely high, and that leads people to think about it as if it were another sort of thing entirely.


> The question is can the government be trusted with a backdoor into our personal devices that "only they" can use?

I think a better question is what happens when we go to a paperless society? Because it's coming, and search warrants will be useless in such a society.

And while some of your rhetoric about the abuses of government spying on citizens is justified (e.g. COINTELPRO, Watergate, etc.), search warrants as a whole do far more good than harm.

And I'd hate to see this power for prosecutors disappear just because drug dealers started using iPhones to keep their illegal activities hidden from police.


I think a better question is what happens when we go to a paperless society? Because it's coming, and search warrants will be useless in such a society.

Were this true, crimes would have been practically unsolvable before about the mid 1990s. Back then nobody carried pocket computers/communicators (encrypted or otherwise) and criminals also largely failed to make paper records of their crimes.

You can't smuggle guns via SMS. You can't vandalize shop windows over the Internet. You can't kill people with a digital photograph. Most crimes, and especially the violent crimes that scare people the most, require physical actions that leave physical traces.

The major exception would seem to be certain white collar crimes: insider trading, trade secret theft, perhaps tax evasion and other financial shenanigans. That's where I can see secure electronic devices making law enforcement significantly harder. But when LEOs speak against encryption in public they usually seem to invoke killers and kidnappers, as if victims' bodies were now regularly hidden inside encrypted iPhones instead of car trunks and shallow graves. Maybe LEOs believe their own rhetoric or maybe they just realize that "think of the victims of insider trading!" isn't the sort of terrifying scenario that will get the public on their side.


> You can't vandalize shop windows over the Internet. You can't massacre people at a concert [snip]. You can't smuggle weapons over a border [snip].

I could commit these crimes over the internet by using others to perform them. So the people you catch are not the prime cause of the crimes. This was not feasible before the internet, and digital monitoring would be the only way to catch me.


Even in the 20th century, heads of organized crime and terrorist groups generally weren't the ones actually smashing the windows of businesses that didn't pay protection money or blowing themselves up in crowded markets. If lower level criminals were guided by phone call, then law enforcement could get call records after the fact but not the actual contents of the past calls. If all communication was face-to-face, they wouldn't get even that. It seems about the same to me today. Phone service providers still have call records. PGP encryption on email bodies doesn't keep recipients secret. Apple can supply analogous iMessage metadata in response to a warrant.

Perhaps law enforcement is angry at the passing of a brief golden age where most Americans carried a cell phone and that phone's security was almost always terrible. I'd put that era roughly between 2000, when American cell phone ownership rates rose past 50%, and 2009, when Apple introduced iOS full disk encryption and began regularly improving other data-at-rest protections.


I don't mean to be a jerk here by constantly raising counter examples because I think you have some good points. But apps like Signal prevent all that data from being collected. It just stays on the phone, locked by the secure enclave.

> Perhaps law enforcement is angry...

I wonder if what they're truly feeling is worried. Would that change the way you thought about them?

I get that whole thing with "COINTELPRO", and Watergate where there were abuses. But assuming everyone is of the honest type here, wouldn't you want them to have more tools?


I wonder if what they're truly feeling is worried. Would that change the way you thought about them?

I want to see the evidence that forms the basis for their worry, if they're feeling worried. Sincere feeling is better than pretending, but sincere feeling isn't enough.

The most compelling evidence would show:

- A statistically significant decline in solving violent crimes or serious property crimes now over the 1990s.

- A statistically significant increase in solving these serious crimes among police departments with access to phone-unlocking (e.g. GrayKey devices) vs. otherwise-comparable police departments without access to such devices.

If nothing regarding smartphones shows a statistically significant correlation to serious crime solution rate, then everyone will just go with their gut feelings. The gut feeling of police is, apparently, that they need to be able to look at the contents of phones. My gut feeling is that they don't. It's an impasse.

If serious crimes are solved at a higher rate in departments with phone-unlocking capabilities, but the rate in the departments without unlocking tech is still no worse than in the 1990s, I still would resist weakening phone protections. It would make me think that maybe things have become harder since 2008, when phones were widespread and insecure, but that secure phones aren't a notable hazard to public safety.

If it turns out that departments with phone-unlocking tech are doing significantly better than departments without it, and that departments without it are now solving crimes of violence and serious property crimes at a lower rate than in the late 20th century, I'd have to have a long, hard think about my personal stance on this issue. There would still be gnarly technical issues even if I decided that less secure phones were a net benefit to society.

But I have changed my stance on security issues before. For example, I changed my mind about video surveillance of public streets. I find it more reassuring than dystopian. I'm more worried about hit-and-run drivers than about the cameras being used by a future dictatorship against the resistance. I've yet to be convinced that monitoring should extend to private spaces like my domicile or phone.


It’s not about honesty here, it’s about honesty in law enforcement/government.


> Most crimes, and especially the violent crimes that scare people the most, require physical actions that leave physical traces.

To borrow a page from your argument book, if physical evidence alone were enough, we wouldn't have so many unsolved homicides. Look at the KC unsolved murder list for 2017:

http://kcmo.gov/police/homicide-3/2017-unsolved-homicides/#....

> You can't smuggle guns via SMS.

Technically, you can send the plans for a plastic gun over the internet now, which just needs to be printed on the other side.

You can also send a virus, if the other side has a gene sequencer.

> You can't vandalize shop windows over the Internet.

You can vandalize digital signage over the internet. As well as steal credit cards from the store's credit card readers.

> You can't kill people with a digital photograph.

Well... technically it's an animated GIF. But yeah, you can.

http://fortune.com/2017/03/22/twitter-epilepsy-gif/

> Maybe LEOs believe their own rhetoric or maybe they just realize that "think of the victims of inside trading!" isn't the sort of terrifying scenario that will get the public on their side.

Honestly, I think law enforcement believes that they need more tools. I don't think they want unsolved murders any more than the citizens do. So telling law enforcement officers, "you can pry open my iPhone from my cold dead hands" tends to ring hollow, especially if they really are of the honest kind and really do want to put criminals away as well as respecting your privacy.


>Honestly, I think law enforcement believes that they need more tools. I don't think they want unsolved murders any more than the citizens do.

The US Constitution makes it clear that the rights of the people to be secure in their persons and possessions is paramount, but for a narrowly defined list of exceptions. That crimes may go unsolved because the state is kept on too short a leash, or because modern technology has made their job more difficult, is a feature, not a bug. A government with perfect knowledge about its citizens behavior and correspondence, or one that can perfectly enforce its laws, is indistinguishable from tyranny.

>So telling law enforcement officers, "you can pry open my iPhone from my cold dead hands" tends to ring hollow, especially if they really are of the honest kind and really do want to put criminals away as well as respecting your privacy.

They can pry my iPhone from my cold dead hands, or my live warm hands will hand it over if they have a warrant, but it will remain encrypted either way, for no other reason than I have the right to do so, and they have no right to demand otherwise.


Is the murder clearance rate worse than it was in the late 20th century? Also worse than before iOS implemented full disk encryption? Your hyperbolic original claim was that search warrants will be useless in a paperless society. Had you made a mild and reasonable claim like "the contents of a phone, if available in unencrypted form, can help to solve crimes" I wouldn't have bothered to reply.

As for the rest, I can't tell if you are just trying to demonstrate that you can think creatively, or if you actually think that iOS disk encryption is an important element of a scheme to get away with murdering someone with a pathogenic virus recreated from sequence data. (Or to get away with 3D printing guns under illegal circumstances, or deliberately inducing an epileptic fit.)


> Is the murder clearance rate worse than it was in the late 20th century?

Why do you insist on this artificial restriction on whether or not police officers should have access to the data on an iPhone? It seems to me this would help the clearance rate if anything. It's another tool in the arsenal.

> As for the rest, I can't tell if you are just trying to demonstrate that you can think creatively...

The internet allows new styles of crimes, perhaps crimes you or I have never really thought about yet. Or maybe I've been paying attention to crimes on the internet lately. And for the record, I'm not the first to think about resequencing viruses. See point #5 here:

https://www.vox.com/2014/5/22/5739380/why-were-never-really-...

And to answer the rest of your comment, your rhetoric reads stronger than it truly is.


It's a high bar to show that encrypted devices enable a new era of unsolvable crimes. Isn't that what you were originally warning about, with the line about search warrants being useless in a paperless society? Such bold claims call for at least a bit of empirical evidence, like statistically significant solve rate declines vs. the immediate pre-mobile era.


I am amazed that you completely throw out the window the wisdom of preventive measures and simply wait for crimes to physically happen, then respond to them. It's a no-brainer that most of the extreme and mass violent crimes have been facilitated and coordinated digitally.


There were no iPhones in 2001 and 9/11 still happened. All the coordination was done by insecure communications and face to face.

Having access to more worthless data only makes it harder to find the information amongst the sea of data.


Even if Apple could unlock iPhone storage given the phone and a valid warrant, law enforcement would still need to get the warrant and take the phone from the owner. Unlocking individual devices is not broadly useful as a preventive measure; it's something that could be done only after you have probable cause and don't mind alerting the target that law enforcement is interested in him.


> Back then nobody carried pocket computers/communicators (encrypted or otherwise) and criminals also largely failed to make paper records of their crimes.

But they did use fixed-line telephones and talk in person. These communications can be monitored with a warrant for the use of a telephone intercept or listening device.


> If we go back to the situation where we again don’t have access, now we know directly all the evidence we’ve lost and all the kids we can’t put into a position of safety.

And equally important, how many children were "put into a position of safety" by cracking into these phones?


I have nothing to hide. I don't intend to do illegal activity with my phone. I also do not want a government entity to be able to access my phone or device simply because they can. I am also very skeptical of any government entity that uses "because child molesters" as valid reason to shame a company for respecting privacy.


The most surprising part for me is that iPhones have been relatively easily hackable given access to the data port. That doesn't seem in line with the high security advertised. What about the inaccessible HSM and all that other jazz?


The company didn't release the details of how their exploit works, but it is believed to be an automated brute-force mechanism, so it's actually attempting to bypass security by trying passcodes over and over, not breaking encryption. This change is another method to slow down brute-force attempts.


The GreyKey exploit most likely involves resetting the iPhone's state in between passcode attempts, avoiding the exponential timeout that you would face. It's a flaw, to be sure, but not particularly "easy".

A bit more here: https://news.ycombinator.com/item?id=16829478
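Some back-of-the-envelope arithmetic shows why resetting that counter is the whole ballgame. All rates and delays below are illustrative assumptions, not measured GrayKey or iOS figures:

    # Worst-case exhaustive search time for numeric passcodes.
    ATTEMPT_COST_S = 0.1   # assumed raw attempt rate: 10 passcodes/second

    def worst_case_hours(digits, delay_per_attempt_s):
        return 10**digits * (ATTEMPT_COST_S + delay_per_attempt_s) / 3600

    # With the retry-delay counter reset between attempts:
    print(worst_case_hours(4, 0))    # ~0.3 hours for a 4-digit passcode
    print(worst_case_hours(6, 0))    # ~28 hours for a 6-digit passcode
    # If even a modest 1-minute average delay per attempt were enforced:
    print(worst_case_hours(4, 60))   # ~167 hours
    print(worst_case_hours(6, 60))   # ~16,700 hours, i.e. roughly 2 years

The same arithmetic is why a 6-digit passcode, or better an alphanumeric one, buys so much more than 4 digits once any rate limiting actually holds.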


iPhones attempt to authenticate the port right now. A challenge system is in place if you connect an iPhone to a computer and want to authenticate it. The difference is that challenge is completely turned off with USB restricted mode. In other words, it's completely impossible for the challenge to start because the phone doesn't even let you talk to it while locked.
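A toy model of the before/after behavior being described, conceptual only (the one-hour figure is the reported restricted-mode timeout, not Apple's actual code):

    # Locked-phone USB behavior, before vs. after USB restricted mode.
    from dataclasses import dataclass

    @dataclass
    class Phone:
        unlocked: bool = False
        hours_locked: float = 0.0

    def usb_connect(phone):
        if phone.unlocked:
            return "data session established"
        if phone.hours_locked >= 1.0:
            # Restricted mode: data lines are dead, charge-only.
            return "no response"
        # Old behavior while locked: the pairing challenge still ran,
        # which is exactly the surface tools like GrayKey needed.
        return "pairing challenge offered"

    print(usb_connect(Phone(hours_locked=0.5)))  # exploit window still open
    print(usb_connect(Phone(hours_locked=2.0)))  # nothing left to attack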


It took a while... I wonder if Grayshift have their next hole already lined up so that business can continue as usual...


If this fix disables the data port when the phone is locked, presumably all future zero-days will be blocked.


Naw, the next 0day might rely on Bluetooth being enabled even while the phone is locked, for example. Or even the cell radio being enabled. There's always some vector for attack.

(Even if it's none of these, the next exploit might be "we can decap the secure enclave and read/manipulate data on it with an electron beam")


Isn't one of the main sources of data the iCloud backups?


Of course. From TFA:

"The encryption on smartphones only applies to data stored solely on the phone. Companies like Apple and Google regularly give law enforcement officials access to the data that consumers back up on their servers, such as via Apple’s iCloud service. Apple said that since 2013, it has responded to more than 55,000 requests from the United States government seeking information about more than 208,000 devices, accounts or financial identifiers."


When Apple announced iMessage in the Cloud at last year's WWDC the headline security feature was that the data would be stored on iCloud encrypted with a key that your devices would share with each other, but that Apple did not possess.

https://motherboard.vice.com/en_us/article/d3zdqy/apple-is-t...
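
The general pattern here (client-side, or end-to-end, encryption) is simple: the key lives only on the devices, and the server stores ciphertext it cannot read. A minimal sketch using Python's `cryptography` package, as an illustration of the shape of the idea rather than Apple's actual implementation:

    # The server (a stand-in for iCloud) sees only ciphertext; device_key
    # is shared between the user's devices and never uploaded.
    from cryptography.fernet import Fernet

    device_key = Fernet.generate_key()   # lives only on your devices
    cipher = Fernet(device_key)

    blob = cipher.encrypt(b"iMessage history...")  # what the server stores
    # A subpoena served on the server yields `blob`, which is useless
    # without device_key. Only a device holding the key can do this:
    plaintext = cipher.decrypt(blob)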


That is an insane number. Jesus.

Why don't they encrypt the data with my Apple ID password? Such a big security risk.

It seems like hacking the iCloud servers is a much more obvious way for an attacker to get data.


This has come up before, and the current thinking is that Apple won't do this because people are idiots: they forget their passwords and then ask Apple to regain access to their iCloud data. Obviously, if the data is encrypted with the password, Apple would have no way to restore that access.

I don't see any reason why encrypting it couldn't be a toggle switch, off by default, though.
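
For concreteness, "encrypt it with my Apple ID password" would mean deriving the key from the password with a KDF, roughly along these lines (a hypothetical sketch, not Apple's design), which is exactly why a forgotten password would make the data unrecoverable:

    # Hypothetical sketch: derive the backup key from the account password.
    # If the user forgets the password, nobody (Apple included) can rebuild
    # the key, which is the recovery problem described above.
    import hashlib, os

    def backup_key(password: str, salt: bytes) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                   iterations=600_000)

    salt = os.urandom(16)             # stored alongside the ciphertext
    key = backup_key("hunter2", salt)
    # "Reset my password" can issue a new login credential, but it cannot
    # recreate `key`, so the old backups would stay sealed forever.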


When I was doing forensics for the police, 9 times out of 10, if we had an iPhone we couldn't get into, there'd be an unencrypted iTunes backup. Didn't even need to go to Apple for it; it's all local.

Wouldn't get everything from it; IIRC it's photos, bookmarks, contacts and documents, as well as some app storage (WhatsApp, notably).
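
For the curious: on recent iOS versions an unencrypted local backup includes a plain SQLite index, Manifest.db, mapping hashed filenames back to their original paths, which is roughly why these backups are such low-hanging fruit. A sketch of reading it on macOS (the <udid> folder name is a placeholder, and details vary by iOS version):

    # List files recorded in an unencrypted iTunes backup.
    # Assumes the iOS 10+ layout with a Manifest.db SQLite index;
    # an *encrypted* backup protects this database too.
    import sqlite3
    from pathlib import Path

    backup = Path("~/Library/Application Support/MobileSync/Backup/<udid>").expanduser()
    db = sqlite3.connect(str(backup / "Manifest.db"))
    for file_id, domain, rel_path in db.execute(
            "SELECT fileID, domain, relativePath FROM Files LIMIT 20"):
        # The content itself sits at <backup>/<file_id[:2]>/<file_id>
        print(domain, rel_path)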


Surely the window on this is rapidly closing. I haven't synced my phone to my computer in years. Can't imagine I'm alone here...


Given that iTunes stopped being the only way to back up your phone when iOS 5 arrived in 2011, the chance of finding someone with an iTunes backup is slim.


Plenty of people are cheap and don't pay for iCloud storage which means if you want any backup at all it's local.


I have four devices on my account and I can back them all up with the 5GB free account. It doesn’t back up your apps - just your data


That's great for you, but most people have enough photos and videos that 5GB isn't enough for a single device, let alone four. For a similar reason this is why many people complained that Apple continued to ship 32GB phones for so long, it's just too small for most people.


I use Google Photos and iCloud photo syncing. Pictures and Videos are automatically downloaded to my Windows computer.

https://support.apple.com/en-us/ht205323


Didn't realize app storage would be in the iTunes backup, but I guess it makes total sense!


A smart criminal would disable cloud backup of a phone used in criminal activity.


It's not as simple as a single toggle. Even when you disable as much as possible, Apple still receives telemetry data. And disabling as much as possible is no easy task: there are lots of individual switches and screens you need to navigate to get even close.


Location services and iCloud backup would cover most of it.


A smart person wouldn't be a criminal in the first place.


There have been plenty of smart people that were criminals throughout history.


Not all laws are just.


The more I think about this issue, the less interested I am in the extreme of either side. The government shouldn't have unfettered access to our devices, but I can't think of any other product in the history of the planet that gave people the ability to hide information so completely that the government could never look at it. To those who argue that our phones are an extension of our minds: that is both a bad thing and a fetishization of our phones. Finally, couldn't I argue that my diary or journal is an extension of my mind (certainly more so than a phone)? Yet diaries and journals can be, and have been, subpoenaed.


But the information being hidden is largely information that never previously existed. Specifically, a complete log of where you went, what you read, who you chatted to, possibly the contents of those chats, etc. It is a previously unimagined level of intrusion, which is perhaps justifiable once someone becomes a suspect, but not extending backwards in time for potentially an entire life.


The most ancient and trusted security mechanism was a secret whispered into a willing ear in an isolated place. No government has ever been able to intrude on that.


Phrasing it as "the extreme of either side" sends a message that you place both extremes in the same moral bucket.

Some people may argue that extreme personal protection is good and extreme personal vulnerability is bad, which makes these "extremes" quite opposite: one highly desirable and the other not.

Plus, who decided that subpoenas are a good idea, or that law-enforcement traditions are how things ought to be? The Inquisition and slavery were also a thing, until they were not.


> any other product in the history of the planet that gave people the ability to hide information so completely that the government could never look at it.

Any volume-level encryption?


Or anyone who's come up with a half-decent code that's stored in only one or two brains. There are various medieval alchemical manuscripts that have never been fully deciphered.


> an hour after the phone is locked [...] In order to transfer data to or from the iPhone using the port, a person would first need to enter the phone’s password.

An hour seems like a long time?

On Android (at least, on my device) the USB port is always charging-only. For data transfer you must always unlock the phone and accept a notification for MTP/PTP mode.


You don't have to hack into an Android phone if you are the police. The data is available to you in multiple places. (Including the phone.)

That's why this semi-adversarial relationship exists principally between Apple and Law Enforcement. Not necessarily Google or Samsung and Law Enforcement.


> On Android (at least, on my device) the USB port is always charging-only. For data transfer you must always unlock the phone and accept a notification for MTP/PTP mode.

Well, this is obviously an exploit, not a feature. You already have to unlock and specifically trust a computer when you plug your iPhone into one.

The situation on Android isn't as cut and dried as you make it; you can, for example, plug USB headphones in and they will work without unlocking the device.


For enough money, pretty much anything can be backdoored, if it hasn't been already. The FBI, CIA, NSA, etc. have a huge trove of 0-days for this purpose. It's like "I thought TOR made me anonymous": ha, you thought wrong.


>In order to transfer data to or from the iPhone using the port, a person would first need to enter the phone’s password. (Phones could still be charged without a password.)

So how long before the NSA has it cracked with power signal analysis?


It doesn't work like that. This attack requires the user to enter the password WHILE the NSA is monitoring the power.


If we could ban parallel construction, then any NSA exploit would be rendered worthless for domestic criminal prosecutions.


There's a ban against law enforcement planting evidence on innocent detainees. That ban hasn't really stopped police departments across the nation from doing that very thing.

https://www.nola.com/news/index.ssf/2008/09/city_settles_law...

https://www.nytimes.com/2018/02/06/us/baltimore-police-corru...

Etc etc etc

I'm fairly certain a ban on parallel construction would not do you much good.


Perhaps, then, it's high time we encrypt all personal computers by default as well.


Ah yes. Windows 10 has built-in drive encryption. It should be enabled by default.

(Little do they know, it sends the recovery key to Microsoft.)


If guns had the same safety mechanisms as the iPhone, maybe we'd see fewer shootings. You know, register a gun to an owner, not allow anyone but the owner to fire the gun, etc. (yes, I realize ownership is already registered, but you can circumvent registration through various means)


It's also simply not practical to apply these mechanisms to guns without making them less safe or reliable.

A gun must always work when needed for protection - it's not like software where it's ok to be "down", rebooting, or having battery troubles some of the time.


A better hardware/software equivalent for a gun would be something like an AED or an insulin pump.


> making them less safe

That's a very spun definition for "safe". Any individual gun is far, far, FAR more likely to be used to commit a crime than deter one.


That depends on how you define "used". And furthermore, while it may be true for guns on average, it's not true for any individual gun. You are confusing individuals for averages.


Must it? I'd far rather a gun not fire than fire.


What if that gun is in the hand of a police officer protecting you or your loved ones?


What if that gun was in the hands of a loved one who was playing with it, thinking it was a toy? Or if it was in the hands of someone with malicious intent toward my loved ones?


I don't think people use guns for self defense.

http://www.latimes.com/opinion/op-ed/la-oe-0804-hemenway-def...


Whether or not owning a gun makes you statistically safer is another matter entirely from whether or not guns can be used for self defense. There is no shortage of real world examples of guns being used for self defense. So objectively they are used for self defense. They're also used for murder, hunting, target practice, looking nice on a shelf, etc.

Incidentally, that article includes suicide. Leaving aside the issue of whether somebody should be permitted to make such a decision about their own life, statistics don't apply to individuals in the way you are implying. While you could determine that gun ownership increases the rate of successful suicide in a population, you cannot say it makes that particular man over there, named Bob, more likely to kill himself, because you don't know anything about Bob. Bob may be the sort of man who never once fleetingly contemplates suicide in his entire life. If Bob were such a man, owning a gun would not make him particularly likely to commit suicide.

Furthermore, whether or not Bob trusts his wife not to murder him with his gun is a matter completely removed from whether Bob ever uses his gun in self defense, or whether he owns it for self defense but never uses it that way (the latter being orders of magnitude more common).


The article I posted appears to disagree that guns are used for self-defence. Perhaps there is an illusion of safety, or there are unreported cases that discredit the article (which I don't doubt), but when burglars rob homes specifically because they contain guns, you have to wonder whether gun ownership provides real security.

I feel like we're getting off on a tangent regarding Apple's decision to further strengthen the privacy of a device. Happy to discuss in another forum the practicality of gun ownership.


> "The article I posted appears to disagree that guns are used for self-defence."

It doesn't say that, but perhaps it appears that way to you. You read it that way because what you seem to have is a fundamentally dehumanizing ideology that reduces individuals to averages. Something I've found to be characteristic of those with extremist political beliefs.


But anyone can use any iPhone to call 911; it doesn't block that function. In fact, any old cell phone can likely call out to 911 if it has a signal, regardless of whether it has a paid cell plan. My point being that in an emergency situation a cell phone works. If it didn't, that would be considered unacceptable. Anything less than that for a firearm is also unacceptable.


Most things don’t have emergency overrides. There isn’t a special button you can press in a car that will let you drive it without the keys in an emergency. If you’re stuck outside a locked house and a bear is about to eat you, you’d better hope somebody is home and willing to let you in.

Phones are the only thing I can think of that do support this, probably because they can draw a clean line between “emergency” and “non-emergency” use.


Similar to another comment, guns are not used for self defense.


What do people use for self defense? Karate alone?


Not sure if sarcasm, but you can use mace, an air horn, or just yell (surprisingly effective for women confronted by an assailant). Take a look at studies of gun ownership and physical safety and you'll see that gun owners don't use their firearms against aggressors as much as on themselves or their relatives. In other words, it's safer not to own a gun than to own one.


Computer owners don’t use their computers for app development as much as for playing games.

If I want to develop apps, should I get a computer?


Gun ownership is not registered, except in a few states.


I always see headlines about this for iOS. What about Android? Is it crackable?


Yes, easily.


Is that the one they use to remotely exploit iPhones? Or is that a backdoor?



