
There are cases where I'm all for bashing Google when they don't give the company they're targeting enough time to patch something (recently, this seems mostly directed at Microsoft). This isn't one of those cases.

They disclosed the vulnerability to Apple, gave them an adequate amount of time to patch it, and seem to have waited until Apple had a patch ready; users are better for it.

So in this case and others similar to it, kudos to Google.




>There are cases that I'm all for bashing Google when they don't give the company they're targeting enough time to patch something

While I understand the common ethos of our current culture supports this, has there been any analysis of whether giving what amounts to a second chance to fix security issues leads to less prioritization of security in the first place? I could definitely see a business deciding to lower its security expenditure: if an issue is found, it gets a grace window to fix it before the world hears about it. The disclosure would still be damaging, but far less so, since the PR machine could point out that the issue was patched before it was announced to the world.

There must be some agreement limiting the grace period, since researchers will go public once a reasonable time frame to fix the issue has passed, and they won't be judged negatively if others agree reasonable time was given. So if we won't judge someone for giving only 6 months instead of 3 years, what about someone who gives only 2 weeks instead of 6 months? How do we decide which of two time frames is better?


I imagine they share a proof of concept 100% of the time, and if that is the case, I'd say it varies: target a window, say 2 months. At that point, the vendor shows its progress on the bug to Google (or whoever reported it). If at the 2-month mark it is obvious the bug was low priority and not really looked at, the vendor failed, in which case I would say disclose away (bonus points if the researchers provide something to mitigate it, if possible, though the onus is not really on them either way). If they can tell the vendor is making progress and genuinely attempting a fix, then I'd say an extension would be fair.

In the Microsoft case that vaguely comes to mind, I believe the issue required a fair bit of work because it was pretty low level in Windows. I want security patches on my system ASAP, but I also don't want someone to release something that breaks my OS's functionality or renders my files (or my ability to open files) fubared. If memory serves, Microsoft was making progress on it, but the fix went past the time period Project Zero set, Project Zero was unwilling to give an extension, and as far as was reported the bug didn't seem to be exploited in the wild. So then you have something unpatched that is disclosed by Google. That doesn't help users all that much.

That all assumes the bug isn't verifiably being exploited in the wild. When it is, that changes things: users need to be made aware as soon as possible, and if mitigation means "turning off" a feature as a stopgap, give that info to them.


If only Google would hold themselves accountable to the same standard. Android is a gigantic security mess, all caused and enabled by Google.


No it isn't? Android has a bug bounty program ( https://www.google.com/about/appsecurity/android-rewards/ ) and regularly has strong showings at Pwn2Own. Android's security for the past couple of years has been superb.


Android as an abstract project, yes. Android as it's actually used by users is not that superb.

Google is slowly trying to fix it, but the average Android device in the wild is way behind the average iOS device, and that will be the case for many years to come.


> Android, as what's actually used by users, it's not that superb.

It is, though. The Android that's most commonly used by users is the one from Samsung, who also issues monthly security patches for a large range of devices: https://security.samsungmobile.com/workScope.smsb

LG ( https://lgsecurity.lge.com/security_updates.html ) does as well, and so do at least Motorola & Nokia.

> average Android device is way behind average iOS device in the wild, and that will be the case for many years to come.

[citation needed]

The average iOS device just got hit by 2 zero-days in the wild. And jailbreaking, which literally relies on privilege escalation exploits, is a long and well established practice on iOS. There's a constant, continuous stream of those on iOS. There doesn't seem to be many (any?) on Android for a while now.


>There doesn't seem to be many (any?) on Android for a while now.

To be fair, there are a variety of reasons why this is the case that have nothing to do with security. An Android jailbreak is less valuable for a few reasons, among them that you can often purchase Android devices with root privileges; the same isn't possible for an iPhone.


It's one thing to release a security patch. It's a different thing to get it installed on user devices. If a user never has an opportunity to install the patch, that patch might as well not exist from that user's standpoint.


There are millions of unpatched Android devices out there, probably forming a massive botnet by now. When you read about it in the news sometime in the future, remember this post. You read it here first.




