
To patch the vulnerability, they need to be aware of the vulnerability. An individual who discovered such a vulnerability would likely be more inclined to go into business for themselves (like the hackers who helped the FBI crack the San Bernardino iPhone, or the Israeli firm Cellebrite) than to hand it over to Apple for a one-time fee, although I imagine Apple would pay pretty well for it.

IIRC these sorts of vulnerabilities don't totally bypass the phone's lock mechanism; rather, they disable the delay between passcode attempts and allow brute-forcing the PIN via software.
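For a sense of scale, here's a back-of-the-envelope estimate in Python. The ~80 ms floor per guess is roughly what Apple's iOS Security Guide cites for the hardware key derivation; the rest (exhausting the whole keyspace, no delays at all) is my assumption:

    # Worst-case time to brute-force a numeric passcode, assuming the
    # software retry delays are bypassed and only the ~80 ms hardware
    # key-derivation cost per guess remains.
    PER_ATTEMPT_S = 0.08

    for digits in (4, 6, 8):
        worst_case_s = (10 ** digits) * PER_ATTEMPT_S
        print(f"{digits}-digit PIN: up to {worst_case_s / 3600:.1f} hours")

Under those assumptions a 4-digit PIN falls in under 15 minutes worst case, and 6 digits takes roughly a day, which lines up with the turnaround times reported for these unlock boxes.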




One-time fee? Apple doesn't pay for such things. That's possibly why there is a black market for iOS exploits.



The security bounty program (announced at Black Hat 2016) was created to deal with these kinds of scenarios. Apple pays depending on the severity of the bug and the affected subsystem.

Of course, now that this mechanism exists, I'm just waiting for Apple to sue Grayshift (maker of GrayKey) and Cellebrite out of existence, confiscate all the devices, and charge the founders with aiding industrial espionage or with overreach in the name of pursuing terrorism.

(I'd also like to see the same thing happen with the NRA, but alas that doesn't seem to be in the cards for the current circus in Washington)

The difference between more legitimate researchers and these guys is that these guys will work with anybody who cuts a check. Real R&D outfits have more scruples than that.


As much as I dislike this Israeli firm, how have they done anything illegal? Hacking a device in your physical possession should not be illegal. Your comment about the NRA makes me think you're just being hyperbolic here?


Cellebrite is actually the better of those out there: while they do sell their data acquisition terminals to law enforcement in bulk, their "unlock services" are performed in person by their staff with a court order for each case (including multiple court orders in some jurisdictions when different datasets on the phone are protected separately by law).


It makes me extremely nervous that a third party can even create this capability. If I had a say, I would tell these guys that this level of access to low-level code just isn't possible for third parties.

As odd as it might be, I trust Apple more because they don't want my data and aren't enabling methods for other people to acquire it.

On some level, this back-and-forth on encryption is an endless cat-and-mouse charade, but the fundamental assumption behind cryptographic security holds absolutely.

"You can't outlaw math"


What are you talking about? You want to make it so that a company like Apple can just draw arbitrary bounds, say "no messing around beyond this point," and have that be internationally and legally enforced?

We got that with the DMCA: DRM modules, phone unlocking, and console rooting.


Companies can write nigh-any clause into their EULA or T&C, and people generally have little recourse. There still seems to be enough legal wiggle room because they control the platform. Some places (the EU, for instance) push back on this, but I don't think it's in any way settled at this point.

They've done this in the past in subtle ways: cautioning developers against using private APIs, which they reserve the right to change at any time, thus breaking applications. For a practical example, Google "Apple kext signing certificate". It's not simply a matter of paying $99 and off you go; the barrier to entry is higher.

There have also been not-so-subtle warnings (see Charlie Miller's blacklisting) that even a proof of concept for a bug is not allowed, because it could get out in the open and cause widespread damage.

> We got that with the DMCA: DRM modules, phone unlocking, and console rooting.

Record labels had little choice and needed to ditch these restrictions in order to have a viable business. TV studios, cellular providers, and console makers fight to this day to preserve these limits as a means of competitive differentiation.

I'm not saying I agree with it, but that is still largely the reality we have to deal with.


Encryption is useless if there is a way to brute-force passwords or acquire the private keys used to decrypt a device. In most ways, yes, physical access is 'game over' for security, so I could see Cook & co. make the case that these devices are so dangerous in that regard that they shouldn't be able to be used by anyone.
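To make the brute-force point concrete, here's a minimal sketch of why each guess carries a fixed time cost. It uses plain PBKDF2, and the iteration count is an arbitrary assumption on my part; a real iPhone additionally entangles the passcode with a device-unique key inside the Secure Enclave, so the derivation can't be farmed out to faster hardware:

    import hashlib, os, time

    # Stretch a short passcode into an AES-sized key. The work factor
    # (iteration count) is what makes each brute-force guess expensive.
    salt = os.urandom(16)       # stand-in for a per-device salt
    iterations = 1_000_000      # arbitrary; tuned so one derivation is slow

    start = time.perf_counter()
    key = hashlib.pbkdf2_hmac("sha256", b"123456", salt, iterations, dklen=32)
    print(f"one guess cost {time.perf_counter() - start:.2f} s")

If a tool like GrayKey can skip the delay logic but not the derivation itself, that per-guess cost is the only thing standing between a short PIN and the data.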

Tl;dr: it's covering your butt, so Apple can say to the FBI, 'We support your efforts and want to help you, but we literally cannot, because it would require dismantling our own security architecture.' No engineer would agree to that; they would quit in protest.

The NRA crack is more about them being the GOP's puppeteer. Under a competent (and probably Democratic) administration, being labeled a domestic terror organization would kill their funding in 0.02s.


> I could see Cook & co. make the case that these devices are so dangerous in that regard that they shouldn't be able to be used by anyone.

OK, but that is not a legal strategy. You're not providing any basis other than that Apple should have some magical power to prevent people from touching devices they legally have access to.

OK, and if you got Planned Parenthood or the Humane Society listed as a domestic terror organization, it would hurt their funding too. What's your point?


How is the NRA vulnerable to these machinations?



