
No one remotely disabled anything. There's a certificate deployed with Firefox. The certificate Firefox used to check addons was only valid till yesterday. So, when the browser started next time it couldn't validate the addons and disabled them. That all happened locally.
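That all-local failure mode can be sketched as follows. This is a simplified illustration, not Firefox's actual verification code; the field names and the 2019-05-04 expiry date of the intermediate signing certificate are used here only for flavor:

```python
from datetime import datetime, timezone

def cert_is_valid(not_before, not_after, now=None):
    """Return True iff `now` falls inside the certificate's validity window."""
    now = now or datetime.now(timezone.utc)
    return not_before <= now <= not_after

# Illustrative validity window for the intermediate add-on signing cert.
not_before = datetime(2017, 5, 4, tzinfo=timezone.utc)
not_after = datetime(2019, 5, 4, tzinfo=timezone.utc)

# The day before expiry: fine. The day after: the check fails purely from
# the local clock -- no server ever told the browser to disable anything.
print(cert_is_valid(not_before, not_after, datetime(2019, 5, 3, tzinfo=timezone.utc)))  # True
print(cert_is_valid(not_before, not_after, datetime(2019, 5, 5, tzinfo=timezone.utc)))  # False
```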


You could say it was remotely disabled by design. What other piece of software randomly just breaks because of the calendar date? I can boot up almost any 20 year old piece of Windows software and it'll work fine, it might not make sense in the current world but it won't go "2019? Fuck off!"


> I can boot up almost any 20 year old piece of Windows software and it'll work fine, it might not make sense in the current world but it won't go "2019? Fuck off!"

Is that really true? Would it connect to 802.11m WiFi router? Would you consider it secure enough to open your banking website on it? The bar is not just booting up the machine. The bar is whether the machine is usable (secure).


> Would it connect to 802.11m WiFi router?

Sure. It's using OS networking APIs. Or running in a virtual machine.

> Would you consider it secure enough to open your banking website on it?

If I'm running 20 year old software, it's probably to interact with a legacy system. There are still businesses that run on like 486's with Windows 3.1. This is more common than you think!

> The bar is whether the machine is usable (secure).

The bar is whatever I WANT it to be, it's my machine, and it's pretentious of a software developer to assume they know what I'm using the software for and what my best interests are. For all they know I'm using the software in a museum, 20 years from now, about this era of computing.


> For all they know I'm using the software in a museum, 20 years from now, about this era of computing.

And then you'll simulate a time appropriate for the device/software: a date before 2038 to avoid the Unix time overflow, or before 2000, or before any other time-specific bug.
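The 2038 limit mentioned above comes from systems that store Unix time in a signed 32-bit integer. A quick check of where that counter tops out:

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit time_t can hold.
overflow = 2**31 - 1
print(datetime.fromtimestamp(overflow, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00 -- one second later, a 32-bit counter wraps negative.
```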

Or how often did you have to "fix the internet" for one of your relatives because their damn CMOS battery died? Yeah, time seems to be quite relevant for trust.

And I don't even like Mozilla enforcing signatures for add-ons that strongly, but people can go overboard.


My point wasn't that I want to run a museum, but that intentionally turning software into a time bomb is silly and adds very little security value. At least in those other cases it happens accidentally.


> The bar is whatever I WANT it to be, it's my machine, and it's pretentious of a software developer to assume they know what I'm using the software for and what my best interests are. For all they know I'm using the software in a museum, 20 years from now, about this era of computing.

I think that's a reasonable point of view. However, for such users, it's best not to use software that's largely developed for the masses, who just expect the software to work. It might be best to just check out the source code and build your own binary. Sorry for being rude :(


And the reason you can install 20 year old windows software without caring about code signing certs is that 20 years ago nobody bothered to sign code.


Not every piece of code needs to be signed. Should my ancient copy of Doom 2 stop working because it's not with the times? Or a level editor for it? Or an old Turbo C compiler?

Some software lives a LONG time and it's fine, and it's up to the user whether that software is still useful to them or not.

Seriously, how many posts do we see on Hacker News like "We rebuilt this ancient machine from the 1970s to learn about it"? People care about computing history. Not everyone, but there's no reason to force your software to break because the calendar rolls over. Remember Y2K? Things often live a LONG time. People are still actively writing Fortran and COBOL. The short-sightedness of this is amazing, as is the condescending "we know what's best for you" security argument.


Typically software signatures are not just pass/fail, but used to give audited entitlements for API.

So a secure system would let you run Doom, but could forbid:

- Access to the filesystem outside the application domain due to potential for exfiltrating or destroying user data

- Likewise, access to global system data may be limited

- Access to the network due to (raw TCP/UDP) traffic not having been audited for security, and the network connectivity being usable for exfiltration

- Access to run full-screen due to the ability to perform user phishing attacks by presenting fake UI.

- The ability to disable system-registered keyboard sequences (on windows, such as the windows key or sticky keys)

- Access to mouse events outside its window

- Access to key scan data, although this likely will be emulated

- Access to change display color modes to e.g. 256 color indexed, although this will likely be emulated as well
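The entitlement model described above can be sketched as a capability check: the signature audit grants a set of capabilities, and anything not granted is denied. This is a hypothetical toy, not any real OS's sandbox API; the capability names are made up for illustration:

```python
# Hypothetical capability names -- illustrative only, not a real sandbox API.
ALL_CAPS = frozenset({
    "fs_outside_app_dir",   # filesystem access beyond the app's own domain
    "raw_network",          # unaudited TCP/UDP traffic
    "fullscreen",           # real fullscreen (phishing-UI risk)
    "global_hotkeys",       # disabling system key sequences
    "global_mouse",         # mouse events outside the window
    "raw_key_scan",         # raw scan codes (would likely be emulated)
    "display_mode",         # changing color modes (also likely emulated)
})

def allowed(requested, granted):
    """Permit the request only if every capability was granted by the audit."""
    return requested <= granted

# A signed-but-lightly-audited game gets emulated variants, nothing raw.
doom_grants = {"fullscreen_emulated", "display_mode_emulated"}

print(allowed({"raw_network"}, doom_grants))          # False: never audited for network use
print(allowed({"fullscreen_emulated"}, doom_grants))  # True: the emulated path is fine
```

The point of the sketch: the signature isn't a simple pass/fail gate, it's the input that decides which entitlements the system hands out.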


Right but that's why we have sandboxes and virtual machines. There's no need to use a calendar date to enforce security.



