Update Regarding Add-Ons in Firefox (blog.mozilla.org)
455 points by akyuu on May 4, 2019 | hide | past | favorite | 466 comments



I have a bunch of privacy-enhancing addons installed, which have now all been disabled. If I hadn't read HN this morning, I wouldn't even have known why. Until now, I had no idea that it was even possible to remotely disable my addons.

And now Mozilla are saying that the "fix" is to allow them to install & run "studies" on my machine? What are they smoking? I'm having a hard time trusting a company that randomly & remotely disabled all my addons, regardless of the cause.


This is not entirely accurate. Nothing was done remotely to disable the add-ons. It happened locally. A certificate that's on your machine as part of the Firefox install expired. When that happened, add-ons that were signed via a cert chain that included the expired one started appearing to be invalidly signed. And that's why it requires an update to completely fix. That part is remote, because they need to push a new valid certificate to you to replace the old one.
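To make the mechanism concrete, here's a toy sketch of what a local chain-validity check looks like (the data layout and function names are invented for illustration; Firefox's actual implementation differs):

```python
from datetime import datetime

def chain_valid(chain, now):
    """An add-on signature only counts as valid if every certificate
    in its chain is currently inside its validity window."""
    return all(c["not_before"] <= now <= c["not_after"] for c in chain)

# Hypothetical chain: the intermediate cert expires on 2019-05-04.
intermediate = {"not_before": datetime(2017, 5, 4), "not_after": datetime(2019, 5, 4)}
addon_cert = {"not_before": datetime(2018, 1, 1), "not_after": datetime(2020, 1, 1)}
chain = [intermediate, addon_cert]

# The day before the intermediate expires, the add-on verifies fine...
print(chain_valid(chain, datetime(2019, 5, 3)))  # True
# ...a day later the same purely local check fails; nothing was pushed.
print(chain_valid(chain, datetime(2019, 5, 5)))  # False
```

Note that nothing remote is involved: the only inputs are cert dates shipped with the browser and the system clock.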

I do think that the UX should ideally be a bit more graceful; one of my add-ons is Multi-account Containers and its being disabled suddenly caused the window I was actively browsing in to just close, among other side effects.

But that kind of UX polish for what should be an exceptional case is obviously not going to be super-high priority, unfortunately.


[flagged]


> And to the downvoters: doesn't this entire fiasco ENTIRELY PROVE MY POINT?

No.

All it proves is that certificates expire (which is a Good Thing (tm)). If you depend on online certificates to verify content, something like this can theoretically happen.


So you think Mozilla is enjoying this right now? And that this is going to help the perception and market share of Firefox?

Hypothetically, let's say they took the opposite approach, and only checked the certificate date on installation. What would have happened? There would have been a brief period of time where people couldn't install extensions, it would have been fixed in a few hours, and this story would probably have like 20 upvotes and fallen off the front page in like 10 minutes, if it ever got there in the first place.

Now let's briefly look at what's actually taking place: a bunch of people's browsers broke. It broke in scary ways for some people who were using extensions for privacy; i.e., their security might have been compromised by this decision by Mozilla. Mozilla is probably going to lose users over this. Their reputation is damaged. Not only that, but now people are evaluating other decisions that Mozilla has made separately from this in an unfavorable light (Studies and Normandy, specifically). The computing industry loses out on this too: we're all better off for Chrome having a viable open source competitor. We should want Mozilla to do well, whether you use their browser or not.

Which outcome do you think Mozilla engineers would prefer today?


> Not only that, but now people are evaluating other decisions that Mozilla has made separately from this in an unfavorable light (Studies and Normandy, specifically).

Sounds like a good thing. Probably sounds like a good thing to some of the engineers at Mozilla.

> The computing industry loses out on this too: we're all better off for Chrome having a viable open source competitor.

I want a free competitor, not an open source one. And this is the most prominent example of that somewhat subtle distinction: Firefox is open source, but with all these backdoors it's no longer free, neither in spirit nor in practice.

P.S. On the other hand, it's probably indeed better to have at least two serious competing browsers, even if they are both non-free.


I do agree; although, it would have been nice for this discussion to come up without things breaking like this.


You are putting words in my mouth, please stop that.

Rather than disabling this feature, which has historically worked well against hostile attacks such as MITM and malware, this problem could have been avoided by a simple cron script that runs daily and checks for expired certificates used within the infrastructure (both internally and externally).
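The kind of daily check meant here could be as small as this sketch. The inventory of certs and dates is hypothetical; a real job would extract notAfter from the actual infrastructure (e.g. with `openssl x509 -enddate`):

```python
from datetime import datetime, timedelta

WARN_WINDOW = timedelta(days=30)

# Hypothetical (name, notAfter) inventory of every certificate the
# signing infrastructure depends on, internal and external.
INVENTORY = [
    ("addon-intermediate", datetime(2019, 5, 4)),
    ("update-signing", datetime(2021, 1, 1)),
]

def expiring(inventory, now):
    """Return the names of certs that expire within WARN_WINDOW
    (or have already expired)."""
    return [name for name, not_after in inventory
            if not_after - now <= WARN_WINDOW]

if __name__ == "__main__":
    for name in expiring(INVENTORY, datetime.now()):
        print(f"WARNING: certificate '{name}' expires within 30 days")
```

Run it from cron once a day and wire the output into whatever alerting exists; the hard part is keeping the inventory complete, not the check itself.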

The main competitor you mention, Google Chrome, is a terrible privacy hazard. This situation does not change that.


I'm not putting words in your mouth; you literally said this is a "Good Thing (tm)" because of hypothetical security reasons. Whereas I'm literally saying that this actually broke real privacy extensions, broke people's software, and badly damaged their reputation.

As to your argument about security: doing the check on install instead of all the time, which I suggest is preferable, still protects against MITM and malware. The only argument for expiring software based on a calendar is that it might be a security risk if it's out of date. But first off, that is highly dependent on what the extension actually does. And clearly, turning off privacy extensions without warning based on an arbitrary signing certificate is also a security risk.
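The difference between the two policies being argued about can be shown in a few lines (a toy model with invented field names, not either browser's real code):

```python
from datetime import datetime

def cert_valid(cert, at):
    return cert["not_before"] <= at <= cert["not_after"]

def enabled_check_on_install(addon, now):
    # Policy A: the signature was checked once, when the add-on was
    # installed; later cert expiry doesn't disable it.
    return cert_valid(addon["cert"], addon["installed_at"])

def enabled_check_on_startup(addon, now):
    # Policy B: the signature is re-checked against the current date on
    # every startup, so cert expiry disables already-installed add-ons.
    return cert_valid(addon["cert"], now)

addon = {
    "cert": {"not_before": datetime(2017, 5, 4), "not_after": datetime(2019, 5, 4)},
    "installed_at": datetime(2018, 6, 1),
}
day_after_expiry = datetime(2019, 5, 5)
print(enabled_check_on_install(addon, day_after_expiry))  # True
print(enabled_check_on_startup(addon, day_after_expiry))  # False
```

Policy A still rejects a tampered or unsigned add-on at install time; what it gives up is the ability to retroactively distrust something already installed.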


Things might change after install. It's a Good Thing (tm) to periodically recheck, or to check revocations.

> > I'm not putting words in your mouth [..]

Yes, you did:

> "So you think [..]"


[flagged]


What about the first part I wrote? Do you find CRL a useful feature in CAs/WoTs?


Just because a cert expires is not a valid reason to disable functionality with no override available to the user. As you may already know, you can override when visiting a website with an expired cert (once or forever). Yet nobody at Moz seemed to think it a good idea to allow that for extensions. Great.


Fair point; hopefully they'll learn from this mistake and ensure the user can adapt more easily. I expect them to, since I hold Mozilla in high regard.


You obviously never used Windows, so you’ve never encountered them restarting just because of an update.


I have and it still infuriates me :-)

I also did a bunch of sketchy registry hacks to fix that. (I keep up to date, but I don't need Microsoft restarting my system at random hours.)


No, I totally agree with that (I just edited to expand on this point). It was a bad user experience.


Well, I disagree that it's a UX issue. The problem is they shouldn't be expiring local software at all. If it was trustworthy at the time of install, why should the calendar date matter?

If users don't want to stay up to date, that may be unwise, but that's their call.


Because if the cert is (for example) actively revoked, that probably means you should stop trusting things signed by it in the past.


But revocation is different from expiration.

If you're worried about a cert being compromised long after expiration and used to back-date a signature, you can show a warning, add a second signature with a local key at time of installation, use a blockchain to prove age, have a timestamping service generate its own signature at time of generation, etc.
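The "second signature with a local key at time of installation" idea can be sketched with an HMAC standing in for a real signature (everything here is hypothetical illustration, not an existing Firefox mechanism):

```python
import hashlib
import hmac
import os

# Machine-local secret, created once and kept in the profile.
LOCAL_KEY = os.urandom(32)

def stamp_at_install(package: bytes) -> bytes:
    """Called at install time, after the vendor signature verified:
    MAC the package bytes with the local key."""
    return hmac.new(LOCAL_KEY, package, hashlib.sha256).digest()

def still_trusted(package: bytes, stamp: bytes) -> bool:
    """Later, even if the vendor cert has expired, the local stamp
    still proves these are the bytes verified at install time."""
    return hmac.compare_digest(stamp_at_install(package), stamp)

pkg = b"addon-xpi-bytes"
stamp = stamp_at_install(pkg)
print(still_trusted(pkg, stamp))                # True
print(still_trusted(b"tampered-bytes", stamp))  # False
```

The obvious caveat (raised elsewhere in this thread) is that anything running with the same privileges can read the local key and forge stamps.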


Sure, my point was mostly that there are reasonable cases in which one might want to evaluate the signing of an addon after the fact.


That might make sense for TLS certificates, where the server can change their content on a whim. A signed and installed addon has been trusted possibly for years; the need for sudden revocation is not dire enough compared to the price you have to pay for it (this failure, an always-online requirement).


Revocation should be handled differently than expiration. They are two entirely separate things.


Clearly, downstream distributors need to create a patch which causes their distributed Firefox builds to only check certificates on add-on installation (and to check revocations too, sure): it should never be possible for a browser to fail into an unsafe configuration.


Cert expiration is the only safe revocation. You cannot rely on revocation lists in many settings. Access to them might be maliciously blocked or, if they're kept locally, tampered with. The list could, for example, be replaced with an older one, which signing the list doesn't prevent unless the signature contains an expiration date, and then you're back to "oh, list expired, how do we fail?"

You cannot rely on a check at extension install. That would assume that all malicious extensions are installed via FF proper. Oracle's crapware bundling in the Java installer taught us that's not how things go. You cannot remember the trust flag when an extension is installed via FF, as a crapware installer could just set the trust flag, too; after all, that storage would be accessible, too. You cannot sign or encrypt that trust storage, as the key material would have to be kept locally and would be accessible to the crapware installer.
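The replay problem with locally kept lists comes down to a freshness check like this (a sketch; field names invented):

```python
from datetime import datetime, timedelta

def crl_usable(crl, now, grace=timedelta(days=7)):
    """A revocation list is only usable if its signature verified AND
    it isn't stale; rejecting stale lists is what stops an attacker
    from replaying an old, validly signed list."""
    # Assume crl["signature_ok"] was established cryptographically.
    return crl["signature_ok"] and now <= crl["next_update"] + grace

now = datetime(2019, 5, 5)
fresh = {"signature_ok": True, "next_update": datetime(2019, 5, 10)}
stale = {"signature_ok": True, "next_update": datetime(2018, 1, 1)}
print(crl_usable(fresh, now))  # True
print(crl_usable(stale, now))  # False: replayed old list rejected
```

Which illustrates the point above: the anti-replay defence reintroduces an expiry-style failure mode ("list too old, do we fail open or fail closed?").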


Well, frankly, I don’t really want a revocation list, and I don’t really want signed extensions in the first place. It’s my browser, and it’s not Mozilla’s business to decide what I install on my browser.

And I most definitely don’t want my browser to fail into an unsafe configuration.


What's the difference to the user between "to check revocations too, sure" and what happened?


revocation != expiry


checking revocations lists != certificate expiry


Revocation lists are often tied to certificate expiry, purging entries that are no longer valid due to expiry.


No, revocation lists are orthogonal, often used to invalidate a certificate before it expires.


At the technical level of cert handling, yes. To the user? Not so sure.


I enjoy a nice cup of outrage in the morning just like the next guy, but this one is really weak and lacks that fresh taste of evil conspiracy that I really crave.

You use a browser that has remote update capability, which allows them to install and run new software on your machine all the time. There is a whole separate area of the Preferences, labeled "Privacy" in large print, with a section that clearly identifies the Studies feature and lets you turn it off. And you use a browser that lets you install privacy-enhancing add-ons in the first place, and in fact which invented the whole concept of add-ons. When the browser discovered that it couldn't verify the add-on integrity with a valid cert, it did what it's supposed to do: it disabled them to protect you from someone backdooring those add-ons.

Someone at Mozilla fucked up, and they're trying in good faith to fix it. I don't know what else people are expecting them to do, putting on sackcloth and ashes won't resolve the problem.


Here's the thing, though: yes, we most certainly are giving them a lot of trust by allowing them to install software on our machines. Which means outrage when they screw up is totally justified, because they broke that trust.

Here's a metaphor: Let's say you let someone seemingly trustworthy watch your kid. (In this metaphor you have a kid). And they let your kid get a broken arm through gross negligence (let's say they passed out drinking beer), and then someone said "well, obviously, you should have never trusted that person, after all, they can do anything with your kid while you're gone, so why are you outraged?" You probably would still be pretty outraged right? You would certainly question your decision to trust them, but at the end of the day you have to trust someone, you'd be a complete shut-in if you could never hire a baby-sitter.


Studies is the same shit they used to push the Mr Robot crap a few years ago.


Your addons have not been remotely disabled. They were marked as trustworthy by a certificate that expired and thus are no longer considered trustworthy. The effect is similar, the mechanism is different. You could also enable loading of unsigned extensions, that would “fix” the issue, too.


>You could also enable loading of unsigned extensions, that would “fix” the issue, too.

Which is impossible unless you're either running Linux or running Nightly or Developer Edition. That setting is willfully ignored in normal Mac/Windows/Android builds most people are on.


They were effectively remotely disabled: there was a hidden dead-man's handle that's been triggered in order to effect the result. It's logically equivalent from an end user's perspective -- an external agency caused my add-ons to be disabled without my authorisation.

"A certificate chain has expired, do you want to disable all add-ons?"

How hard is that?


> there was a hidden dead-man's handle that's been triggered

The add-ons were signed by a certificate with an expiration date, which means that the add-ons are trusted until that certificate expires, not that they're trusted in perpetuity. It's not a hidden dead-man's handle; expiry is and has always been part of the process.

I think it's arguable that it shouldn't be part of the process, and having things like 20-year expiry satisfies the letter of the spec while being even worse than no expiry, but it's not a hidden dead-man's handle. It's how it was designed to work, and isn't considered optional.


> How hard is that?

If you think that’s trivial, I challenge you to go build it. It might seem warranted in hindsight, but thinking about all failure cases ahead of time is hard. If it weren’t, we’d not have bugs.


Cool, yeah, no one can complain unless they can personally do it all better. Do you think that's workable?

I don't think it's trivial. The critical element here appears to be "who gets the final say" and not "this is too hard to code".

They managed to disable the "Enable" button for add-ons, and considered this situation enough to provide a justification that (paraphrasing) "we do this when we don't want the add-on installed". That is harder to do: they've added extra tests, added complexity, and done all the consideration. They've just chosen to remove the final say from the user and give it to themselves.

It seems consistent with their recent behaviour.


No, you can complain. I specifically object to "How hard is that?", which is a class of complaint in itself. Stuff is often inherently hard, and when you don't know the internals of a project and don't work on it, you may have no concept of how hard it is. Don't pretend you do. Saying "How hard is that?" carries the notion that everyone working on that thing you don't know is sloppy, malignant or stupid.

“I wish it would do that.” is a much more charitable way to phrase your complaint.


Are you saying you don't think Mozilla has the capability? I'm not. My "how hard is that [for Mozilla]?" specifically means "I think they have the capability but chose differently; if I'm mistaken and there is a technological bar to this, then please correct me". That's why I didn't write "That's easy!", nor "How hard is that!", but used "how hard is that?" -- the implication being that it's not hard for them to do, so why did they choose to do it differently?

As it happens I've just had to flip "xpinstall.signatures.required" and it's working for me. So it seems "not at all hard [for them]" was the answer.

FWIW people have chipped in saying this specific issue was raised, so it's not that they hadn't conceived that such a situation could occur (indeed that's surely why the config above exists).


No one remotely disabled anything. There's a certificate deployed with Firefox. The certificate Firefox used to check addons was only valid till yesterday. So, when the browser started next time it couldn't validate the addons and disabled them. That all happened locally.


You could say it was remotely disabled by design. What other piece of software randomly just breaks because of the calendar date? I can boot up almost any 20 year old piece of Windows software and it'll work fine, it might not make sense in the current world but it won't go "2019? Fuck off!"


> I can boot up almost any 20 year old piece of Windows software and it'll work fine, it might not make sense in the current world but it won't go "2019? Fuck off!"

Is that really true? Would it connect to an 802.11n WiFi router? Would you consider it secure enough to open your banking website on it? The bar is not just booting up the machine. The bar is whether the machine is usable (secure).


> Would it connect to an 802.11n WiFi router?

Sure. It's using OS networking APIs. Or running in a virtual machine.

> Would you consider it secure enough to open your banking website on it?

If I'm running 20 year old software, it's probably to interact with a legacy system. There are still businesses that run on like 486's with Windows 3.1. This is more common than you think!

> The bar is whether the machine is usable (secure).

The bar is whatever I WANT it to be, it's my machine, and it's pretentious of a software developer to assume they know what I'm using the software for and what my best interests are. For all they know I'm using the software in a museum, 20 years from now, about this era of computing.


> For all they know I'm using the software in a museum, 20 years from now, about this era of computing.

And then you'll simulate a time appropriate for the device/software: a date before 2038 to avoid the Unix time overflow, or before 2000, or around whatever other time-specific bug applies.

Or how often did you have to "fix the internet" for one of your relatives because their damn CMOS battery died? Yeah, time seems to be quite relevant for trust.

And I don't even like Mozilla enforcing signatures for addons that strongly, but people can go overboard.


My point wasn't that I want to run a museum, but that intentionally turning software into a time bomb is silly and adds very little security value. At least in those other cases it happens accidentally.


> The bar is whatever I WANT it to be, it's my machine, and it's pretentious of a software developer to assume they know what I'm using the software for and what my best interests are. For all they know I'm using the software in a museum, 20 years from now, about this era of computing.

I think that's a reasonable point of view. However, for such users, it's best not to use software that's largely developed for the masses, who just expect the software to work. It might be best to just check out the source code and build your own binary. Sorry for being rude :(


And the reason you can install 20 year old windows software without caring about code signing certs is that 20 years ago nobody bothered to sign code.


Not every piece of code needs to be signed. Should my ancient copy of Doom 2 stop working because it's not with the times? Or a level editor for it? Or an old Turbo C compiler?

Some software lives a LONG time and it's fine, and it's up to the user whether that software is still useful to them or not.

Seriously how many posts do we see on hacker news about like "We rebuilt this ancient machine from the 1970s to learn about it." People care about computing history. Not everyone, but there's no reason to force your software to break because the calendar rolls over. Remember Y2K? Things often live a LONG time. People are still actively writing Fortran and COBOL. The short-sightedness of this is amazing, as is the condescending "we know what's best for you" security argument.


Typically software signatures are not just pass/fail, but are used to give audited entitlements for APIs.

So a secure system would let you run Doom, but could forbid:

- Access to the filesystem outside the application domain due to potential for exfiltrating or destroying user data

- Likewise, access to global system data may be limited

- Access to the network due to (raw TCP/UDP) traffic not having been audited for security, and the network connectivity being usable for exfiltration

- Access to run full-screen due to the ability to perform user phishing attacks by presenting fake UI.

- The ability to disable system-registered keyboard sequences (on windows, such as the windows key or sticky keys)

- Access to mouse events outside its window

- Access to key scan data, although this likely will be emulated

- Access to change display color modes to e.g. 256 color indexed, although this will likely be emulated as well


Right but that's why we have sandboxes and virtual machines. There's no need to use a calendar date to enforce security.


> And now Mozilla are saying that the "fix" is to allow them to install & run "studies" on my machine? What are they smoking?

Can you elaborate on your concern with "studies"? By installing a Firefox that updates automatically, the user is already giving up control of the software and letting Mozilla decide what's best. How is modifying software logic using studies different from modifying logic by updating the binary?


Studies are installed and their data sent to Mozilla without my knowledge or consent. Updates are installed with my consent. Pretty big difference.


Oh, I did not know that. Are you sure that enabling studies also means that a user's data is sent to Mozilla without consent?

I'm asking because I can imagine a benefit for Mozilla in developing a feature that enables studies without sending data. It could be used to fix a broken feature or broken logic (as in the case of the expired certificate here). So I'm not convinced that enabling studies always means your data gets uploaded without consent. Can you point me to a privacy whitepaper / source code to back up that statement?



They have not remotely disabled addons. The certificate expired, and Firefox did the correct thing when the add-ons' signatures could no longer be validated. Nobody triggered a switch to disable addons.


If you can disable all my addons by having a certificate expire, you can effectively remotely disable all my addons. And that's exactly what happened. The fact that this was (presumably?) not intentional is irrelevant. The switch may not be an actual switch, but it's there nevertheless. And it shouldn't be.


So you are saying there should be no way for an installed addon to fail an integrity check?


I think Tharkun is saying that there should be a (reasonably accessible) way for a user to choose to override the check's failure. Which is not that far-fetched as a proposal.

It's a classic and ongoing debate of "who knows best?" -- the vendor, or the end user?


Non sequitur. An installed addon could be signed with a certificate. Mozilla can push out a revocation for that certificate if it deems it appropriate. The revocation can pop up a modal telling users it wants to disable the addon. The user can click 'Disable addon' or 'Ignore (dangerous)'.

This doesn't need to be done through a dead man's switch (expiring certificate) that someone will forget to renew.


Yes, the choice should be given to the user, and the user's choice should trump all.


If you want full user choice about this, you can use Nightly, use a community-packaged version built with the option to disable the checks, or compile it yourself and enable the flag. It's not impossible to do that.


Since they can easily push code without much hassle using Studies, I think this is an elegant-ish solution to a problem that shouldn't even have happened (expired certs are something that's entirely avoidable). But errors happen.

Errare humanum est.


Not initiated by the user != remotely initiated. And agreed on the fix side; they should have posted how to do it yourself as well.


The problem is that Firefox does not have sufficient built-in privacy settings by default. Users shouldn't have to crawl the internet for lists of recommended addons, then have to trust such a variety of authors, to have basic privacy. Like I said elsewhere, I'm using Brave because of this.


I'm also using Brave now, though I did have a quick yearning to dust off Lynx…


At minimum they should add a test suite that runs with the clock set at least a month into the future, to catch these kinds of things.

There was a similar issue[0] a few years ago that was only caught a month in advance.

Even better would be to set things up to only verify on install instead of on every startup.

[0] https://bugzilla.mozilla.org/show_bug.cgi?id=1267318
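A test like the one suggested only needs the verifier to take its clock as a parameter (a sketch with invented names):

```python
from datetime import datetime, timedelta

def cert_valid(cert, now):
    return cert["not_before"] <= now <= cert["not_after"]

def certs_failing_next_month(certs, clock=datetime.now):
    """Re-run the validity check as if it were 30 days from now and
    report any cert that would break."""
    future = clock() + timedelta(days=30)
    return [c["name"] for c in certs if not cert_valid(c, future)]

certs = [{
    "name": "addon-intermediate",
    "not_before": datetime(2017, 5, 4),
    "not_after": datetime(2019, 5, 4),
}]

# Run "on" 2019-04-10: today's check would still pass, but the shifted
# check already flags the cert that caused this incident.
fake_today = lambda: datetime(2019, 4, 10)
print(certs_failing_next_month(certs, clock=fake_today))  # ['addon-intermediate']
```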


> Even better would be to set things up to only verify on install instead of on every startup.

That would defeat the purpose of verification: "Add-on signing in Firefox helps protect against browser hijackers and other malware by making it harder for them to be installed." [1]

And it's not just malware that was doing that. Microsoft force-installed the ".NET Framework Assistant" into Firefox on Windows, and you had to edit the registry to remove it. [2] If I recall correctly, AVG and Logitech were also among the list of offenders.

[1] https://support.mozilla.org/en-US/kb/add-on-signing-in-firef... [2] https://support.microsoft.com/en-us/help/963707/how-to-remov...


Those still would have their certificates checked on installation.

And honestly, I think it is security theater to attempt to defend against attackers on the same or higher privilege level. If Microsoft wants to force something down your throat on Windows, there's not much you can do.

The problem is that Mozilla turns the failures of others into their own problem and then tries to fix it themselves. That scope and responsibility creep leads to the fallout we're seeing now.


> Those still would have their certificates checked on installation.

How? These extensions were not being installed through the normal mechanism. The malicious extension installer will just set the flag that says "this extension has been verified".

> And honestly, I think it is security theater to attempt to defend against attackers on the same or higher privilege level.

I understand that, and Mozilla does too: "By baking the signing requirement into the executable these programs will either have to submit to our review process or take the blatant malware step of replacing or altering Firefox." [1]

[1] https://blog.mozilla.org/addons/2015/04/15/the-case-for-exte...


But that's the point. Either the installer does something malicious or it doesn't. If it does you lost the game. If it doesn't then a simple check is sufficient. Everything else is security theater which makes life worse for everyone.

Also, they could still run the verification and prompt the user instead of just forcing the decision.


I don't think that's necessarily true. After all, the policy is effective against undesirable-but-not-malicious extensions. Before signature verification I had extensions installed in Firefox that I didn't install; today I don't. [1]

And the clearly malicious action of modifying Firefox to disable signature verification can and should be flagged by anti-malware software, which runs at a higher privilege level.

[1] Putting aside for the moment the fact that most users now have no extensions installed due to the certificate expiration issue. No Firefox user, myself included, is happy about that.


> Before signature verification I had extensions installed in Firefox that I didn't install

... How?

I agree with the sibling poster; it sounds like you already lost.


I consider these to be mental acrobatics to find a position that justifies wresting away any control from the user. It is not Mozilla's responsibility to attempt to protect the user from the very slim line of "effectively malicious but still somehow principled" malware; picking the nearby line of verifying once would be far less problematic.

If the user does not want that crap on their machine they should remove the origin instead. We would not have the current situation if Mozilla had not assumed responsibility and control for problems outside their domain.

At least they could have made this opt-in by asking the user if they want an extra locked-down version of Firefox that might disable their add-ons if they are deemed malicious. Then the user could have made an informed choice.


On a typical Linux install, the Firefox binary is not writeable by a malicious extension installer that runs with user privileges. Thus baking the check into the binary fully protects the integrity.


Then overrides could also be made configurable as root.


Could you explain why verifying on every startup, instead of just on install, is necessary? The page you linked doesn't mention it.

Edit: Let me amend my question - why is it necessary for the certificates to expire? If a plugin is signed by Mozilla, why wouldn't it be trusted once it gets old?


How do you propose Firefox should tell whether the unsigned, shady add-on was installed by the user, or by some other dodgy app messing with Firefox's files whilst it wasn't running to make it look like the user installed the add-on?


> Let me amend my question - why is it necessary for the certificates to expire? If a plugin is signed by Mozilla, why wouldn't it be trusted once it gets old?

I asked essentially that question earlier, and received some good answers explaining why [1].

Briefly, if something is signed by an expired certificate, whether or not you can trust the signature depends on whether or not the signing took place while the certificate was not expired.

If all you have is the thing and the signature from the code signer, you can't tell for sure when it was signed. If a bad guy has obtained an old signing certificate and its keys, that bad guy can generate new signatures that claim to have been signed while the certificate was valid.

Some code signing systems, such as the one in Windows (and I think the one Apple uses) also use another certificate, from a timestamp service, to prevent this. The way a timestamp service works is you send them a hash of a document, and they generate a certificate signed by them that essentially says "We were shown this hash on this particular date/time".

When you include the timestamp certificate with the signature from the code signer, then when you come across code that was signed by an expired certificate but purports to have been signed while the certificate was still valid, you can check the timestamp certificate to see if that is true. If it is, you can still consider the code signing to be valid.

Forgetting to renew a signing certificate is still bad even if you do this, but not as bad. If Microsoft or Apple forget, it doesn't stop existing applications from working, so end users aren't immediately impacted. It does stop developers from shipping updates or new applications, so it would still be a big deal. I could see a bad guy noticing that a company always renews expiring certificates one month in advance, say, then noticing that a certificate expires in just a week, inferring that the renewal has slipped through the cracks, and timing the use of a critical zero-day exploit to fall in the window when updates are broken by the expired certificate.

[1] https://news.ycombinator.com/item?id=19824017
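The timestamp-service scheme described above, reduced to a toy model (HMACs stand in for real public-key signatures, and all names are invented):

```python
import hashlib
import hmac
from datetime import datetime

TSA_KEY = b"timestamp-authority-secret"  # stand-in for the TSA's signing key

def tsa_stamp(doc_hash: bytes, when: datetime) -> dict:
    """The timestamp service signs (hash, time): 'we saw this hash then'."""
    token = when.isoformat().encode() + doc_hash
    return {"when": when, "sig": hmac.new(TSA_KEY, token, hashlib.sha256).digest()}

def signature_acceptable(doc: bytes, stamp: dict, signer_not_after: datetime) -> bool:
    token = stamp["when"].isoformat().encode() + hashlib.sha256(doc).digest()
    expected = hmac.new(TSA_KEY, token, hashlib.sha256).digest()
    stamp_ok = hmac.compare_digest(expected, stamp["sig"])
    # Accept a now-expired signer cert only if a valid timestamp proves
    # the signing happened before expiry; a back-dated signature made
    # with a stolen old key can't produce such a stamp.
    return stamp_ok and stamp["when"] <= signer_not_after

doc = b"addon-package"
stamp = tsa_stamp(hashlib.sha256(doc).digest(), datetime(2019, 1, 1))
print(signature_acceptable(doc, stamp, datetime(2019, 5, 4)))  # True
print(signature_acceptable(doc, stamp, datetime(2018, 1, 1)))  # False
```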


I checked the hash functions for the .xpi: it's broken MD5 and broken SHA... which doesn't matter for a running instance if FF takes care to download it from their own servers over HTTPS, but in the attack model you describe it is retrieving the signatures from an untrusted disk.

https://news.ycombinator.com/item?id=19830228


There’s no guarantee that a (malicious or undesirable) extension was added via FF, so check on install could be circumvented by any of the crapware installers we all love.


> And it's not just malware that was doing that. Microsoft force-installed the ".NET Framework Assistant" into Firefox on Windows, and you had to edit the registry to remove it.

If it's not a malicious extension, verifying the signature doesn't prevent a forced install.


Verify-on-install doesn't protect you against an add-on whose author's signing certificate was stolen to push out an update, because by the time the stolen certificate is discovered, the update will have been installed on a bunch of users' systems.


IMHO it seems problematic that they can remotely push code changes, including replacement of a trusted certificate, and bypass package managers.

I don't expect software to (significantly?) change during runtime, outside of what was packaged, signed, distributed and installed as part of apt/yum/pacman/etc.

I understand (not that I like or agree with it) that some apps are just embedded web browsers and load everything externally, and that Firefox extensions are in the end just some JS/CSS/HTML loaded outside of the system's package manager. However, extensions have a limited API they can interact with, and you need to allow permissions for each extension. Having a Mozilla-owned extension that can modify core functionality seems a bit scary.


If browsers didn't auto-update, no-one would update them manually, meaning bad news for web developers wanting to take advantage of newer features.


> meaning bad news for web developers wanting to take advantage of newer features.

I think you spelled “mass compromise of unsuspecting users due to unpatched security holes” wrong.

Like it or not, browsers, as the primary networked application people use, are the prime target for exploiting users. They connect to unknown endpoints of questionable trustworthiness (unlike most other networked apps) and execute code loaded from there. They also handle people's secrets, such as credentials for home banking. We maybe shouldn't be at that point, but here we are, and browser vendors need to handle that responsibility. Quick auto updates are crucial for that. Expert users might dislike them, but let's face it, we're not the majority.


> Quick auto updates are crucial for that. Expert users might dislike them

I don't think anyone is really against quick security-related fixes being delivered with a degree of automation. What most power users dislike is mixing these updates with other ones (typically for commercial reasons).


What you want assumes, in the extreme case, having patches for every version that was ever released. How do you propose doing that with limited resources? Firefox offers an ESR release; you can use that if you want.


What? They produced the fix; that's not the problem. The problem is keeping the delivery mechanism separate from the telemetry/experiments delivery mechanism. Which it clearly was in the past, since FF has been pushing security updates forever. Why couldn't it be done this time? Is it a sign of things to come? If yes, that is very shady from a privacy perspective and unsound from an engineering perspective.


You can get the fix without telemetry. You just have to wait until the update goes through the update channels, as usual. Going via telemetry just speeds up the process. What you asked for is something different: security updates without feature updates for your chosen release. Forever.


I understand the appeal of that for developers, but it comes at the cost of users' agency and control over their own systems. I've been very annoyed by even simple UI changes in Firefox updates, as I simply didn't ask for or want any such change. Reading other comments here, it's clear I'm a dying breed of old and stubborn users who prefer full control and agency over their own systems. Making it easier for web developers to implement new features is absolutely not a tradeoff I'd make willingly at the cost of my system's consistency and reliability. Also, the reason I use Firefox is that of all the major browser vendors they seem the most aligned with those values, although this seems to be changing more and more every year.


Mozilla/Firefox have the Extended Support Release (ESR) for you.

https://www.mozilla.org/en-US/firefox/organizations/


The incentive structures of society (capitalism, if you're so inclined, but I don't think this is unique to capitalism) are incompatible with your wishes.


Good news for users is sometimes bad news for developers. Anyway, too often these "newer features" are just new ways to exploit people or shiny add-ons without much societal value.


Software developers optimize for overall utility, not paranoia.


I wonder if there is someone out there in the middle of the ocean with a browser-extension-based communication and navigation system which is dead in the water?

It sounds to me that the real headline here is that every copy of firefox out there was timebombed and we only noticed because someone forgot to elongate the fuse.


Could other code signing systems like macOS gatekeeper also be vulnerable to problems like this?

IMO this seems like just plain bad design. The Firefox addon certificate should never have had an expiry date. If they ever needed to revoke it, they could distribute an updated version of the browser with the previous intermediate explicitly marked as revoked.
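A sketch of that revocation-list alternative (the fingerprint below is just the SHA-256 of the string "test", used as a stand-in; a real list would hold hashes of actual DER-encoded intermediates and would be replaced by shipping a new browser build):

```python
import hashlib

# Baked into the binary at build time; updating it means shipping a new
# browser release, so no expiry clock or online check is involved.
REVOKED_INTERMEDIATE_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def intermediate_trusted(cert_der: bytes) -> bool:
    """Trust any intermediate whose SHA-256 fingerprint has not been
    explicitly revoked, with no notAfter date to forget about."""
    return hashlib.sha256(cert_der).hexdigest() not in REVOKED_INTERMEDIATE_FINGERPRINTS
```

The tradeoff is that users who never update keep trusting a compromised intermediate forever, which is presumably part of why expiring certificates are used instead.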


That is my biggest complaint. Only the Firefox Linux team included an about:config option to turn it off. Android, Windows, and Mac have no way to do so. It's still broken on my phone. Wtf were they thinking?


Try

xpinstall.signatures.required

Works on the Fennec version of Firefox.

Also, the IceCat version of Firefox wasn't affected, AFAIK.


As far as I know, that does not work on Mac and Windows stable, and not at all on Android.


The browser itself continued working fine. Are you aware of any life-depending extension? Leaving this particular issue aside, your hypothetical "browser extension for people in the middle of the ocean" was doomed from its inception if it was designed to run as a browser extension (though it opens the door for an interesting discussion about similar scenarios that are happening, like pilots relying on iPads).


> your hypothetical "browser extension for people in the middle of the ocean" was doomed from its inception if it was designed to run as a browser extension

Why? You haven't backed up that statement at all. Especially before they killed XUL it was easy to make a non-doomed app that runs as a browser extension, and it's still plenty possible.

No (non-demo) program should brick itself if it can't connect home.


There are _many_ applications that exist as browser extensions, including critical communications applications.

I don't personally know of any obviously life critical application done this way, mostly because I try to stay as far away from that sort of insanity.

If you don't think it's at least a plausible thing that could eventually happen you haven't been paying attention.

I personally got stuck stranded because of Signal's stupid built-in timebombing, when for communication I was relying on a device where untrusted third parties had no ability to shove in silent software updates.


No, but what about the people who use password managers, who were locked out of their bank accounts, credit cards, and Reddit?


That's a Debian free software principle, correct? The desert island rule?


I hate to say all these things because I use Firefox all the time, but...the communication around the add-ons issue has been poorly handled by Mozilla. I only learned of the problem by visiting HN. But what of the thousands of other users who don't visit HN?

If you visit the Mozilla homepage, there is nothing to acknowledge the problem (at least at the time of writing this message). Let's try the Support page. Where is it? Scroll down to the bottom of the lengthy Mozilla homepage to the page footer to find the link. (How many visitors will make it to the bottom?)

When you click through to the Support page, an easy-to-miss banner in tiny text appears at the top of the page that mentions the problem - screenshot here: https://imgur.com/a/TAHZSWa

Additionally, when the add-ons are disabled, Firefox misleadingly says: "These extensions do not meet current Firefox standards so they have been deactivated". This is probably a generic message, but it's also an example of when a generic message is misleading.

Finally, poorly-named settings like "Normandy" and "studies" that give no hint of their meaning only add to the confusion.


The way I see it, people might have gotten used to software breaking from time to time. Once software breaks, it is reasonable to expect it to get fixed in a couple of days when it is updated. At least, this was probably the experience for the majority of users who noticed the issue.


The sad reality is Mozilla has been losing mindshare to Chrome for a long time and this will rapidly accelerate it. People don't expect things to break. They expect things to work, and when things break they get angry.

I love Firefox. It's my daily driver. It will continue to be. But this is a huge fuck-up and they're probably going to pay big in usership because of it.


That's sadly not the first time Mozilla has failed to communicate appropriately about issues/changes that are pushed down to end users. They should reshuffle some of their marketing resources to work on proper non-promotional communication instead, so that current users at least know what they're dealing with.


I'm interested in the eventual writeup of what went wrong such that they missed this certificate expiring. That's a structural problem.

Also why it took 6 hrs to assign P1 to the bug


They closed the trees (stopped merging other code changes to prioritize this) for the bug <22 minutes after it was opened.

I would assume the delay in assigning P1 is really just a result of assigning P1 not being as high priority as fixing the damn problem.


If I understand the bug report comments correctly, they didn't close the trees to other code changes to prioritize fixing this, they did it because the cert expiry broke some important tests at the same time as it broke every end user's browser.


This is one of the things about this whole episode that I find baffling. Stuff like adjusting bug priorities and arranging for someone to tweet an announcement is the work of a good engineering manager. This is the right person to run interference and handle comms and deal with things outside of the critical path, like bugzilla updates.


The priority field just plays no practical role here since the bug was immediately escalated.


If people can see it externally, it definitely has an effect: "oh, moz aren't taking this browser-breaking bug seriously". Which is probably why the parent said it's a management issue rather than a directly technical one, per se.


Stuff like adjusting bug priorities and arranging for someone to tweet an announcement is the work of a good engineering manager

So we can come to the obvious conclusion about Mozilla, then? No good "engineering" managers? Miss one reprioritization and you're out! This is what sane people think?


At no stage did the parent post state there are no good engineering managers at Mozilla. They just said that adjusting bug priorities is the work of a good engineering manager. There's a world of difference.

If you want to complain about knee-jerk overreactions, I think you might want to look in the mirror first.


I have empathy for people in the middle of hair-on-fire incidents.


> Also why it took 6 hrs to assign P1 to the bug

Because people were staying up until the wee hours of the morning working on fixing it instead of toggling priorities in Bugzilla. This was treated as a five-alarm fire.


The normal practice would be to have someone operating as an incident communication manager who would be taking care of status/things like this.

Saying "we were too busy fixing to communicate" is actually a really bad sign, because it's not just about what you are communicating to the outside world, but also, for example, about making sure people that need to be brought in are getting consistent information.


This is an incredibly uncharitable misreading of what I said. We do have incident communication managers who were working around the clock, of course, communicating through the official channels.

The question is about why someone didn't set the priority setting in the internal Bugzilla sooner. There are a couple of reasons for this. First, that's not really what priority is for: priority is so that engineers on major projects like WebRender know what is most important to work on. It's not an effective tool for emergency responses. P1 doesn't summon on-call people. Second, everyone who was able to get it fixed was already on it. Bugzilla priorities don't make people who wouldn't otherwise be aware aware.

Unlike that of Mozilla, your issue tracker at Google for outages is private and internal-only. One of the many reasons it is private is so that people who don't know the organization's operational processes don't come in and start making incorrect assumptions based on what they find there. I think it's laudable that Mozilla works in the open so much, but comments like this one are some of the downsides of doing so.


"This is an incredibly uncharitable misreading of what I said"

I'm sorry if I offended you. Really. I don't really think it's that uncharitable, but hey, offending you wasn't my intent, and if you were offended, that is what matters.

For what it is worth: your response to the parent took them to task for even attempting to state something: "Because people were staying up until the wee hours of the morning working on fixing it instead of toggling priorities in Bugzilla. This was treated as a five-alarm fire."

This is a pretty rude response (which should never be happening regardless of what they wrote), and if you intended to convey that you had incident managers handling it and they didn't get around to it yet, you did not.

"Unlike that of Mozilla, your issue tracker at Google for outages is private and internal-only. "

I'm really unsure why you decided to add a completely irrelevant and unnecessary attack like this.

Honestly, it just makes me think less of you and makes me sad. You are usually a very sane and even-keeled person. I'm sure you were not having a good time, but I still don't think this was okay.

" One of the many reasons it is private is so that people who don't know the organization's operational processes don't come in and start making incorrect assumptions based on what they find there."

Which is incredibly ironic, since it is not in fact as private as you seem to believe. Your own assumption here is completely and totally incorrect.

Maybe before you decide to add completely irrelevant and unnecessary attacks, at least verify they are correct?

" I think it's laudable that Mozilla works in the open so much, but comments like this one are some of the downsides of doing so."

You seem to have taken my comment incredibly personally, and your response seems very far out of proportion.

If you want to have a discussion, you're gonna have to tone this down a few notches.

FWIW: Right now most of your comments in this story read as incredibly defensive (and I'm not just talking about this one).

I would stop and give it a rest. It is not projecting a good image.

Of course, that's just my perspective.

Really hope tomorrow is a better day for you.


I agree with you, it was more important to do the work than to signal.

However, I bet it’s likely they have procedures and policies for work that first involve signaling like for example the priority level.

I’d be willing to bet lots of things surrounding this issue weren’t handled in a by the book manner. So if you are always going to wing it, why have a book (or a public priority level system) at all?


First, because priority is for things like major feature work, so that engineers can find the bugs that are useful to work on. In this case, everyone in the team responsible was already spending 100% of their time addressing the issue.

Second, because we care about solving problems, not being bureaucrats.


Really? Because being the bottleneck (i.e. single point of failure) responsible for approving all addons is exactly what bureaucrats would want to do ;-)

The non-bureaucratic thing to do, as has been pointed out many times of course, would be to give users the power to override the cert signing check as an advanced option.


It is, in fact, an "advanced option", in about:config.


But that option only works in nightly builds, not Firefox release builds. If an option doesn't honour what it claims to, imho it might as well not be there.


You have the book because at one time you didn't have the book.

"The book" is something that needs to change and improve like anything else. Sometimes, that means you'll still need to wing it and add that thing to the book later.


>This was treated as a five-alarm fire.

I don't think it bothers me personally, but it's funny you said that. Presumably you mean a "no-alarm fire", because who has time to set off an alarm when there's a fire to fight?!


> One-alarm, two-alarm, three-alarm fires, etc., are categories of fires indicating the level of response by local authorities. The term multiple-alarm is a quick way of indicating that a fire is severe and is difficult to contain.

https://en.wikipedia.org/wiki/Multiple-alarm_fire

The 5th level, or a five-alarm fire, is the top level, where you're basically talking all hands on deck.


In the UK they actually sound/flash alarms at locations and in the fire station; do they not do that in the USA? They do in the movies.

Still seems like an ironic choice; it applies equally to the people saying it was DEFCON 1. I'm pretty sure the military actually has displays indicating the status, but again, that's based mainly on movies.


I'm also interested in the postmortem to explain the processes that failed to allow the certificate to expire, but let's not overdramatize the situation by nitpicking about filling in form fields on bugzilla. The fact that the tree was closed is equivalent to DEFCON-1, which is all the priority anyone needs to understand the severity of this bug.


> which is all the priority anyone needs to understand the severity of this bug.

Random user: What the fuck is a tree and why is the priority of this not higher yet?


> Random user: What the fuck is a tree and why is the priority of this not higher yet?

Not to be too glib, but any random user who is technically literate enough to know where to seek out Firefox's issue tracker and how to find the issue in question, and who has such a thorough understanding of issue trackers that they understand that such a thing as a priority field exists, is also going to be savvy enough to read the very first comment, and will be well aware of what it means, and will, one hopes, be rational enough to understand that the flurry of activity indicated by the issue in question is more important than a passing field in the bugzilla database.

If anyone expects Mozilla to take power users seriously, then we need to focus our criticism on the things that aren't just imagined trivialites. It makes me frustrated that the people who irrationally fly off the handle at the slightest perceived provocation are also the ones who implicitly encourage Mozilla to write off power users as more trouble than we're worth (and after ten years of watching these incessant whining non-comments on HN, I don't blame them anymore).


I'm also interested in why existing adds-ons are failing to run due to this problem. (There was a similar question in another thread about the issue here at HN.)

I understand why an add-on update or new installation would be prevented from succeeding by a certificate expiration. But why would a certificate expiration prevent an already-installed add-on from running? Any already-installed add-ons were previously validated at installation time and should (IMO) run as-is. It seems unnecessary to continuously check the status of an add-on's certificate if it has not been changed. Am I missing something?


When a certificate is no longer valid, the authority it represents expires too. Grandfathering trust in various places would make cert management even more difficult to get right, because there'd be no hard deadline when a certificate is no longer in force.


It's already a mess. I disabled the signing/recompiled Firefox, and all my extensions were still force disabled with no UI to enable them. So there's some memory/extension state there already.

I had to go through profile/extensions.json and set appDisabled to false to make my extensions enableable again.
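That manual fix can be scripted; a rough sketch (the "addons"/"appDisabled" field names are as observed in this comment, and extensions.json is an internal Firefox format that may differ across versions, so back up the profile first):

```python
import json

def reenable_all_addons(extensions_json: str) -> str:
    """Flip appDisabled back to false for every recorded addon so the
    extensions become enableable again in the UI."""
    data = json.loads(extensions_json)
    for addon in data.get("addons", []):
        addon["appDisabled"] = False
    return json.dumps(data, indent=2)
```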


But that represents how people consider trust when choosing addons. It's trusting the code and company at the time of install, not at an arbitrary later time. Sure, if the cert expires and there's an update then the user wants to know.


You cannot rely on “check at install time.” An extension could be installed by a crapware installer behind FF’s back. You can’t go and remember the trust state at install time either, because that memory would need to be kept locally and could be modified by a crapware installer. So the only solution that prevents circumventing the check is to check the signature when the extension is loaded.
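A sketch of the load-time check being argued for here (verify_signature and the cert dict are stand-ins for the real crypto; the point is that nothing persisted at install time is trusted):

```python
from datetime import datetime

def load_extension(xpi_bytes: bytes, signature: bytes, cert: dict,
                   now: datetime, verify_signature) -> str:
    """Re-verify the signature and certificate validity on every load,
    instead of trusting an install-time flag that a crapware installer
    could have planted or edited on disk."""
    if not (cert["not_before"] <= now <= cert["not_after"]):
        raise PermissionError("signing certificate is not currently valid")
    if not verify_signature(xpi_bytes, signature, cert["public_key"]):
        raise PermissionError("signature does not verify")
    return "loaded"
```

This is also exactly why an expired intermediate disabled add-ons that had been happily installed for years: the same check runs again on every load.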


The main trust check is at installation time, but it's possible for problems to be discovered later, and Mozilla needs to be able to do something about it. Certificate non-renewal is the only robust avenue of revocation.


There's already an extension banlist.

See profile/blocklist-addons.json

Not sure how not renewing a certificate and letting everything get disabled is useful. It's only useful if the cert's key leaks.


> Mozilla needs to be able to do something about it.

No, they do not need to. They decided that they want to. Remember that there was a time before certificate signing.

And now the decision to be able to protect those who install crapware is also harming those who never had those issues.


They should absolutely have asked for extra permission before implementing a system where they can choose to alter my browser install, in an unexpected way, at their behest, without seeking further authorisation. Not even a modal???

It's exceedingly poor ethics.


I believe it's set up this way specifically to allow revocation for addons that are initially approved but later found to be malicious.


If it is, without requesting user authorisation, then that's an illegal act under the UK Computer Misuse Act (and the USA's CFAA I think too) - modification of a computer without authorisation.


Except you agreed and authorized this when you installed the software. Take your position to the logical extreme: software can't make any changes without explicit, interactive approval; and you thought UAC was bad.

I look forward to joining your class-action lawsuit.


When the changes are unexpected, yes, further explicit authorisation is required. Just because you installed a photo-album app doesn't let the distributor delete all your photos, say.

Besides that, this sort of "but we hid something in the T&Cs so now we can shit on you" is the sort of thing I expect from over-commercialised companies, not from what was once a paragon of the FOSS community.

FWIW class-actions don't exist in UK.


How do you know if it has not been changed?


Signing just verifies the .xpi file. If the addon has managed to get enough permissions to modify its .xpi file, it can bypass the signing requirement in various ways.

Revocation is, I think, the actual reason.


Looking at the changeset [1], I'm curious why the explicit check for expiry (line 644/646) didn't work. Unfortunately the mentioned bug is rather light on details; presumably they were collaborating on IRC or something instead.

[1] GitHub mirror to not stress their infra: https://github.com/mozilla/gecko-dev/commit/1d1260c7615f1d9a...


I know Firefox isn't being malicious, but ugh, this seems like the worst possible PR move, optics-wise. "Hey, so, uh, we accidentally broke your browser, so you need to opt in to becoming a guinea pig. But don't worry! You probably were already opted in anyway and just didn't realize it! Also, it might take six hours to work."


So that's pretty unfair. 1) They state they are working on a fix for normal, release-channel users who don't want to run studies. 2) They tell you to temporarily run studies to get the fix within up to six hours (could be faster; they set an expectation). 3) You can explicitly install Nightly or 66.0.4 before it's pushed if you want a fix now.

Yes, it's unfortunate. I'd expect them to meet it head-on, push a tested fix in a timely manner, admit a mistake was made, explain publicly how/why, and apply the learning moving forward. Beyond that, what's your expectation?


Not saying that their current actions are wrong, just that the optics of it are terrible for them.

There was a chain of bad decisions that led them here, though: 1) thinking it's OK to disable software after it's installed (using cert expiration; I'm OK with it if the cert was revoked, but that's a totally different discussion), 2) taking more control of people's local software than many people are comfortable with, especially considering that their main market is tech-savvy people who tend to be more sensitive to this than most, and 3) making some of these things opt-out rather than opt-in, giving the perception that they may value data collection and control more than their users' privacy.


For what it's worth, the (initial) mechanism for disabling add-ons (your 1) has been present since before Firefox 1.0. It was designed to quickly deactivate any malicious add-on as soon as it was detected, before it had a chance to do too much damage. In my books, that's a good thing.

Here, the mechanism that kicked in was the protection against add-ons that could have been signed with stolen credentials, which would make them clearly malicious.

Of course, it turns out that the problem was an expired cert, so a bug/human error. But generally speaking, I think that 1 is good.


> It was designed to quickly deactivate any malicious add-on as soon as it was detected, before it had a chance to do too much damage. In my books, that's a good thing.

I hate this attitude from security people so much. If, for the sake of fighting malicious code, you are crippling the software's usability or my user experience, you are the malicious code.


I hate that attitude from entitled users so much. If you don't want security, you're welcome to have a malware-ridden system, but don't think that this means all users should have to put up with malware-ridden systems.


I wish that were true, but in fact I have no way to disable this and similar amazing security entrenchments. The monthly device-bricking Windows updates, for instance.

If I can't do anything with my hardened computer, I don't care if it is eaten alive by malware; it is useless anyway.

At work, as the guy who has to fight, on behalf of the sysadmins and the users, dozens of clueless security advisors who are hardening everything according to security best practices written by similarly clueless experts, I'm seriously astonished by the common backward thinking. If you are blocking access to all users' PDF files, for instance, you are the malware: you are causing disturbance to the business operation and annoying everyone.


This petulant antagonism ("You are the malware!", "No YOU ARE!") between users and security is contrary to everyone's interests.

Go sit in separate corners, both of you. Think really, really hard about how both of your jobs are critical to the long-term success of the business. Don't come back until you've meaningfully internalized that.


I'm having a hard time understanding what you are referring to. All I'm saying is that I had a perfectly working system and now it is no longer functioning properly. Why anyone would see this as more secure is beyond me.

Similarly, when "security best practices" lead to hundreds of my users losing SSO access to their BI system, or hundreds of printers rendered unusable because of a security update that requires administrative permissions to reinstall the same drivers that were working perfectly so far, I don't care for your security benefits. They are supposed to defend against the very damage that you are causing. You are already damaging the organization with thousands of working hours lost, and everyone is frustrated, as a bonus.


> If you don't want security, you're welcome to have a malware-ridden system

No, I am apparently not. Microsoft, Apple, and others insist on making it difficult. At least I found out today that I can install unsigned Firefox extensions once I switch to a special "unbranded" build. I'm glad Mozilla, at least, still offers that.

Here's the thing: I disable a lot of the security stuff you're not supposed to disable, when I can. I use a Jailbroken iPhone. My Mac has SIP and Gatekeeper turned off. Windows Defender is turned off on my gaming PC, and I lower Microsoft's driver signing requirements to the greatest extent allowed. I also ran an unpatched day-1 build of Windows 10 for around four years, with the autoupdate system forcibly neutered. (I now run LTSB, instead.)

I have never been bitten by a virus, ever†. I don't know if that's because of all the security measures I'm not able to turn off or because I've been lucky or something else. I suspect it's because I don't run dodgy software. Or maybe my life is a lie and all my devices have been infected for the past decade, and I never noticed.

In the meantime, I'm not seeing the upside to software forcing hardened security.

---

P.S. While I share their frustrations, I don't endorse the GP's attitude. I know that a lot of people really are doing difficult work with the best of intentions.

† Except for a handful of times when I was testing suspicious software in a disposable VM. That doesn't count for obvious reasons.


Just because you haven't been affected by a virus doesn't mean you never will be. For example, the patch for the zero-day exploited by WannaCry was sent over Windows Update a few months before WannaCry existed. I personally use Linux, so Windows Update doesn't exactly exist for me, but I still update my system whenever such updates are available. SIP and driver signing are both mechanisms to prevent the installation of rootkits. If you do get a virus, such mechanisms would prevent it from hiding itself or causing more damage to the system.

Not running dodgy software is indeed a very effective way to not get viruses, but that doesn't mean you shouldn't take more security precautions if they are available. What if, for example, malware exploits a zero-day in your browser and successfully installs itself without any interaction from the user? Windows Defender and related antivirus could detect malicious activity from such malware and remove it on windows and Gatekeeper/SIP/driver signing and related systems could severely limit its impact.

Mozilla probably introduced extension signing to prevent less technically inclined users from having adware installed into their browser. X.509 introduces certificate expiration, and X.509 is the most widely used mechanism for signing things. The only reason you have this problem is because the folks at Mozilla forgot to renew that intermediate certificate. While I agree that it should be possible for users to disable extension signing, as users should be able to do whatever they want on their own system, you shouldn't blame them for forgetting to renew a certificate associated with an additional security measure built to protect users from malware.


> The only reason you have this problem is because the folks at Mozilla forgot to renew that intermediate certificate. While I agree that it should be possible for users to disable extension signing, as users should be able to do whatever they want on their own system, you shouldn't blame them for forgetting to renew a certificate associated with an additional security measure built to protect users from malware.

Well, we agree on everything, in that case. :) I have no problem with sensible defaults that can be adjusted, nor do I take great issue with Mozilla's specific mistake in letting the certs expire.

Edit: Also, just noting that I only lessen security measures when I have a reason, not because I'm rebelling against security practices or something. I use a Jailbroken iPhone because I run Jailbroken software, and I disable driver signing because I use unsigned drivers. It's not that I don't see the risks, so much as I'm very unconvinced the risks are worth the downsides, for me.


This is quite funny, because for all you know you are 100% infected with a virus right now. How would you know that you are not?

Malware that does obvious bad things like cripple the machine or encrypt the drive is in the minority.

But it's actually not funny, it's dangerous. This behavior endangers others.

Likely your computers have been zombies in a botnet for many years - I'm always wondering how else these large botnets could exist at such scale.

This is why one day we all will not be allowed to use computers without a "driving license".


Sorry, but where is the difference between a secure system that allows central control and a malware-backdoored system?

All that differs is the promise of non-maliciousness, which often does not hold up, because money is corrosive to those little centralized empires of "all-can-fail-but-me".

Security is diversity, as in having a non-centrally-controllable ecosystem that is not a monoculture. Your updates are the danger; your urge for control is the forest fire.

Linux is not secure because it's updated often. As package-maintainer takeovers have shown, that is even an attack vector. It's secure because it's fragmented into a thousand small populations, which offer no financially interesting target for a large-scale takeover.


Linux's diversity also makes it difficult to package and distribute non-malicious software, so I'm not sure that's the poster child for how to do security without compromising usability.

(The situation is admittedly getting better with Flatpak.)


Say what now? Users are entitled to be in control of their own systems? Are you crazy?


It's just the IBM mainframe priesthood reasserting its dominance, lurching from the tomb of computing history to save us all from ourselves. Nothing to see here.


So a security feature that had 0 impact for decades is "crippling" the usability of a project due to one outage?


I use container tabs extensively.

Today, more than half of my open tabs disappeared in an instant, and were not even an option to re-open until either I waited around ("up to six hours...") or manually installed the workaround. All of my in-progress work in any of those tabs? Gone.

That absolutely qualifies as crippled usability. The mere fact of such a thing being possible is a usability defect. On what basis do I trust that my work is not going to disappear on me like that again?


This is absurdly overdramatic. One issue with a feature in decades and you're stating that you've lost all trust in the browser.

https://bugs.chromium.org/p/chromium/issues/detail?id=952287

Chrome has bugs too.

Firefox will continue to have bugs. All software will continue to have bugs. I'm so sorry that you lost some tabs in your browser but shit really does happen and acting like this is some violation due to overzealous security controls is inane.


I did not say "all trust". Please don't presume to inflate my explicitly stated position — especially while also minimizing the impact this incident had on me, and others. I did not merely "lose some tabs"; those, I could just re-open. I lost work. That data, effort, and time are gone.

If you think this clownshoery hasn't cost Firefox any trust, then you're at least as naïve as I am supposedly "absurdly overdramatic" and "inane".

Bugs are a thing, totally conceded. Sloppy certificate management is, too, but it's an entirely other class of thing. Deliberately conflating them is at least as disingenuous a debating tactic as pointing at Chrome, which is utterly irrelevant to this incident. That's straight-up "whataboutism".

Full stop, this was foreseeable. This was preventable.

EDIT: Phrasing.

EDIT 2: I won't respond further to the same kind of tone.


Alright, I apologize for the tone. It's unnecessary to make something like this into a heated discussion.

That said, the part I was referring to is:

> The mere fact of such a thing being possible is a usability defect. On what basis do I trust that my work is not going to disappear on me like that again?

The possibility of a bug happening is hardly a usability defect in my mind. Or if you want to call it one, it seems like a perfectly reasonable one - this was a defense born out of necessity when malicious extensions were more of a problem.

And I think that the "On what basis" question definitely implies a total lack of trust, but sure, maybe not. The basis is that this is a single instance of a failure over the course of the feature's lifetime, for a feature that has existed for absolutely ages.

I pointed to Chrome as an example of similar issues cropping up across codebases to show that these sorts of bugs do happen. I don't consider that whataboutism.

All bugs are foreseeable and preventable. Systems are complex. I think you're putting the issue in a very unfair light, even though it's very reasonable to be upset about time and effort that is lost because of the issue.


First, thank you for responding in a manner that invites a response, rather than demands refutation.

I understand your perspective, and appreciate your recognition of my own. That said, if you think I'm putting the situation in an unfair light, I think you're downplaying it at least as much.

In my eyes, this is no mere "bug"; it's an abject process failure. As a reply to another of my comments in this discussion suggests, this is more on the level of, "Oops, we forgot to renew our domain name...", than it is, "Gosh, we didn't validate the pointer returned by the frobnitz function, when the whoozle isn't initialized yet..."

Dealing with expiring certificates before they expire is covered in like the second week of Certificate Management 101, as it were. If it's necessary to stick an intermediate cert in there, then it's doubly so to keep it current.

> The basis is that this is a single instance of a failure over the course of the feature's lifetime, for a feature that has existed for absolutely ages.

The plural of "anecdote" isn't "data", but an existence proof is an existence proof. That the problem has gone from zero occurrences to one, no matter over what period, literally makes it infinitely more likely to recur, if you want to be that reductive...


Being a former ops guy the items you list resonate with me. On the one hand I do feel for the developers and hope they come up with a fix soon. On the other side, this is frustrating and there were some bad decisions made that a typical ops person would have pointed out and been ignored. The ignoring of ops guys until something breaks is something that has been consistent in my experience. Anyway for the sake of having an alternative to Chrome I hope they fix this yesterday.


> their main market is tech savvy people that tend to be more sensitive to this than most

Is that so, though? Firefox is still being used by millions of users, and I doubt those are only the tech savvy internet users.

(Then again, this mostly applies to Firefox users using add-ons, which probably has a higher share of technical users.)


One of their biggest 'selling points' is that they protect your privacy. It's really, really off brand for them to be distributing a critical bugfix through a telemetry collection channel.


it does feel like the normalization of deviance

it's also entirely predictable that a non-negligible fraction of users -after enabling studies and verifying everything works again- will ... simply go on with their lives and forget about disabling studies...

I also don't understand why the certificate graph is not exposed through a user interface, so that the user can add and remove certificates, or enable and disable certificates at their own discretion. This should have been obvious when the certified add-ons were introduced. Then all they would have to do is host the certificate file on their own domain and everyone could follow the simple steps in the GUI to replace the expired certificate...


I might be wrong, I don't have data, but, as far as I can tell most users either use the installed browser or chrome (or whatever their tech savvy friends/relatives install for them).


> using cert expiration -- I'm ok if the cert was revoked but that's a totally different discussion

CAs can delete certificates from their revocation lists after expiration, which means that you can't tell the difference between a certificate that was never revoked but merely expired and a revoked-and-then-expired certificate.


As an alternative perspective, I'm totally fine with FF disabling the extensions when the cert went invalid, and I'm also happy that it auto-updated itself to fix the issue. To me the optics are pretty good: a mistake happened and they were able to recover pretty fast, and my browser wasn't exploited by bad actors in the meantime.


> a mistake happened and they were able to recover pretty fast, and my browser wasn't exploited by bad actors in the meantime.

To me adding a new plugin signing cert through a side loaded plugin is pretty much the definition of exploited.

All this tells me is that their plugin signing solution is utterly worthless.

The only way this should have been fixed was through official update channels.


The worst part is taking control because they think they know better than the user, and then messing up like this.


What does optics mean in this context?


Public perception. For instance, one of the first comments on their post is this:

> Why not just post a link to the fix that can be installed WITHOUT enabling Studies? This sounds like a clever plan to get more people to share their data via Studies…

I definitely don't agree with that guy, and I doubt that's a majority opinion, but asking people to use a workaround that benefits them (Mozilla) after they broke things for a lot of people is bad publicity for sure. For what it's worth I think Mozilla is doing the right thing here, just it's not going to make them look great.


I'm not sure I care how unfair the characterization is. I heavily use container tabs — ahem, #usecontainers — and all of my open container tabs disappeared at once, with no indication of why or what to do about it, when this happened. I lost an absurd amount of work and state because of that. I only knew what caused it by inference, because I'd just previously read The Fine Article (which, btw, gave no indication that losing state like that was something I should expect, merely, "No active steps need to be taken to make add-ons work again"...)

I still prefer Firefox over all the other browsers, and will continue to use it, but the project has lost a lot of trust and goodwill over this.

The optics are indeed awful, and this was fully preventable. Firefox fucked up, full stop.


This is pretty much exactly my thought as well.

Based on the timing of initial tweets and blog posts on this fiasco, I'm pretty sure I was in the first 10%, if not first 1%, of people who experienced this. And I was in a plane at 36,000 feet trying to work on a cross country (U.S.) flight when suddenly about 130 tabs in 7 windows disappeared. Really, REALLY bad. Panic, frustration, confusion...

I was more than 50% sure that all was not lost forever, that it was some "glitch" (Extensions all showed the same bloody red status), but I was tweaked. I work in security (embedded systems, not computers/IT) so I have a very good understanding of certificates, TLS, PKI, etc. There are many ways things can get out of whack if the people in charge screw up.

Regardless, this is embarrassing, dare I say shameful (pretty much almost up there with "Ooooppsss... we just lost our domain - it expired and no one thought to renew it").

Come on, guys, get it together. Have a procedure, document it, practice it, stay in front of it.


> I lost an absurd amount of work and state

EDIT: after installing the fixed XPI, I have to sadly report that all data has gone. All my carefully-managed containerized life was wiped clean. Heads should roll.

Complete shambles. And the worst thing is, I suspect it's all a plot to have more people opt-in to the shitty telemetry. Otherwise, why not push an update through the usual channels? Had it been a security-related fix, would have they used "studies"? I bet not.


Locate your profile and look for session backups -- your data might still be there.


My expectation is that something like this doesn’t happen in the first place.

Given that it has happened, I expect them to provision a new certificate and push a fixed version within an hour or two to all release channels.

What I would emphatically ‘not’ expect, is a hack that might take up to 6 hours to be applied.


How do you get these studies or beta version?


It is too late to listen to reason. Many commenters have spent their Saturday morning pushing a narrative that appeals to emotion.


What narrative could there be other than "Goddamn it Cert Guy at Mozilla, you had one job?"


[flagged]


Switch to what? Chrome? Because you don't like having to re-opt-in to studies? That would be ludicrous given Google's privacy track record. Opera? They're owned by a Chinese investment firm now. Edge? MS's whole OS is based on data collection.


Typing this from a new Brave install. Just switched from Firefox after their handling of this.


Have they stopped whitelisting Facebook and Twitter in their "tracking blocker" yet?

The BS coming from their blog post surrounding this whitelist makes me distrust them completely: "Loading a script from an edge-cache does not track a user without third-party cookies or equivalent browser-local storage" (...) "Given that most users on the web share IP addresses with other users because of NAT, it is unlikely this can be used to reliably track users"

Not only it's quite possible to know if the user is behind CGNAT or not, meaning the tracking works just fine for millions of users, but carriers have been known to inject user IDs in the replies of users behind CGNAT.


Apparently not. Would there be a way via settings to just explicitly blacklist?


The handling, or the bug itself? Sound like the damage control is fine (although worrying that they have no way to distribute hotfixes more rapidly than this).

The bug in the first place, on the other hand, seems pretty negligent. Not that it's incomprehensible, just pretty stupid.

Anyhow, good luck with brave!


The bug in the first place. That we can't easily rollback this "upgrade" as well.


[flagged]


Lynx, Midori, Safari, IE6. Maybe Palemoon, Samsung browser is another possibility. Maxthon was popular for a bit. Maybe you're super 1337 and run in only curl or wget.


He has moved to Lynx.


CVE-2016-9179 was published about Lynx in November 2016. Lynx took more than 5 minutes to release an update, with the fix included [1] in 2.8.9dev.11 not reaching a production release until July 2018 — almost two years after the CVE was published.

With a response time like that, I don't see how Lynx will satisfy their "5 minute fix" need any more than Firefox did.

[1] https://lynx.invisible-island.net/current/CHANGES


I'm curious, do you switch at every fuck up?

Then it's only a matter of time until you come back to Firefox, or maybe you'll end up making your own web browser?


And operating system, and smartphone, and processor, and video card.


[flagged]


Or maybe they take offence at your overall tone? You're clearly not interested in being constructive.

This kind of toxicity is exactly why many of us have "uninstalled" Reddit and the like.

Please refrain from bringing that toxicity to HN.


“Many commenters have spent their Saturday morning pushing a narrative that appeals to emotion.”

“Toxicity.”

Goes both ways.


Could you please stop posting unsubstantive comments to Hacker News?


> It is too late to listen to reason. Many commenters have spent their Saturday morning pushing a narrative that appeals to emotion.

Your comment, my comment, that comment, none of them are substantive. Why isn't that comment removed yet?


It's a matter of degree. Your comments went an order of magnitude further over the line.


I am positive my first comment on this thread was not an order of magnitude over the line, if we can quantifiably measure such a thing, and it got censored anyway. "I just switched my browser. Bye bye Firefox."

I mean if you want me to be reflective it's really not going to work if we're refusing to admit that either both of these comments should have been censored or the one above should not have been censored. But I get it, life's not fair and the squeaky wheel gets the grease.

Firefox zealots get to railroad commenters on HN, noted, I certainly won't engage in these discussions again.


I can't consider only your first comment when you posted nine of them.


[flagged]


No, we’re not flagging you for your use of swear words; we’re flagging you for your lack of contribution to the conversation. You clearly are not interested in having one.


To what?


They only point out the opt-in instructions for the few people that voluntarily opt out of Shield studies and wish to get the fix sooner.

Most Firefox users have that checkbox enabled by default, and so most Firefox users received the fix within 0-6 hours of the blog post's publication.

HN readers often take special care to prevent Mozilla from updating Firefox, but that in no way represents the wider population of either all addons users or all Firefox users.


This has nothing to do with "studies" and should be pushed as a general fix. There is just no excuse. What if this had been a security incident?


Then it seems it’s a good thing they had a system in place that could deliver the quick fix in less than a day, on a Friday evening, when shipping a normal fix could have taken, and is taking, longer than that quick fix did.


> it’s a good thing they had a system in place that could deliver the quick fix

No, it's not. "Studies" is not a security-related mechanism and it didn't exist in the past, when fixes were rolled out very quickly anyway for security reasons. "Studies" should not be relied on to be a fix-delivery mechanism, because it just isn't.

This is not even about privacy, it's simple good engineering sense.


It seems like your objection to the Normandy system is that the UX surrounding it includes the word “Studies”. I am grateful they chose in this instance to prioritize repairing addons worldwide over the confusion that word has caused you and potentially others. I assume, having seen this and other such comments delivered with outrage rather than thankfulness today, that they will re-evaluate the UX surrounding the Normandy system to ensure that it more clearly designates non-study changes as such.


It's not about designation, it's about control. If Mozilla really cares about trust, they shouldn't mix their update delivery system, which should care for timely security-related material, with general telemetry, data-gathering, and experiments.

I use FF because I care about principles. Otherwise I might as well just let myself be exploited by Google, MS, Apple and friends.


Ah, you object to Normandy’s design in some manner. That’s being hashed out in today’s Normandy thread, and if you haven’t already read that link you’ll definitely want to:

https://news.ycombinator.com/item?id=19825830

(While of course you’re welcome to continue pursuing your issues with it here, your opinions will receive more views there.)


Why does it take longer to ship a general fix / update? They don’t do a full regression for the studies fix? Update mechanism doesn’t check for updates as often? I couldn’t find any information on this yet but would love to know.


“How long does it take to build, unit test, performance test, and QA check a new Firefox release on every supported release of macOS, Windows, and Linux platform?” is absolutely a question that outraged users are trying not to confront. You’re right to ask it, so don’t let the downvotes get you down.

Presumably the testing burden for a preference update using Normandy is smaller, as (and I’m guessing wildly here) fewer things can be altered with Normandy and therefore testing can be simplified to exclude, for made-up example, “the code-signed binary can be executed on all platforms”.


Sounds reasonable to me. Would love too see that information on the official Mozilla blog for the post-mortem. I personally think it is great to have a mechanism to push fixes quickly - whatever the name is. I just don't understand why this mechanism can't be the regular update mechanism.


On Debian the distributed ESR has that option greyed out anyways. Maybe we’ll have to wait for maintainers to push an update?


I am sure an update will be pushed quickly. I'm waiting too, as a testing user.

You can fix this temporarily via setting "xpinstall.signatures.required" to false. Toggle it back to true once update is released and you install it.
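Worth noting, if I recall the build policies correctly: release and beta builds of Firefox ignore this pref, while ESR, Developer Edition, Nightly, and unbranded builds honor it, which is why it works for Debian's ESR packages. To make the temporary change persist across restarts there, it can also go in a `user.js` file in the profile directory, roughly like so:

```javascript
// user.js in the Firefox profile directory.
// Caveat (my understanding, not official guidance): release and beta
// builds ignore this pref; ESR, Developer Edition, Nightly, and
// unbranded builds honor it.
user_pref("xpinstall.signatures.required", false); // revert to true once fixed
```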

Meanwhile, I'm hijacking this comment near the top of the tree to state this: the way the community treats Mozilla and Firefox is horribly, inexplicably, unacceptably unfair.

This is nothing compared to innumerable other fuckups in software history, and even recent ones like goto fail, heartbleed, or Chrome logging you into Sync w/o notice.

This is a mistake, an easily recoverable one, and is not intentional or malicious. Firefox is developed out in the open; all the processes are public. And people, with absurd entitlement and malice, go as far as to call things backdoors or malware. Meanwhile the alternative actually is backdoor-ridden malware.

Please don't be this ungrateful.


> This is a mistake, an easily recoverable one, and is not intentional or malicious.

While I agree with most of your comment, you're downplaying the severity here, especially since, IIUC, this situation also affected the Tor browser, disabling NoScript. If regimes like China were on the ball, and succeeded in escalating the resulting exposure into deanonymization, this debacle may end up having a death toll attached to it.


An article that mentions a timeframe of “the next few hours”, but doesn't have any timestamp besides a date without a timezone.


If Mozilla Studies are implemented the way other "push" update systems are, then probably your browser has an ID that it hashes to get a bucket ID that it builds into a URL to check for updates, plus a cron time offset for running those checks. Then, the experiments are rolled out by walking up the bucket ID list and gradually adding the addon to said buckets.

Usually, this mechanism is explained as being helpful to ensure a rollout of an experimental update can be rolled back if it's failing. That's not so much a concern in this case, I think. But this mechanism has another effect: it works as a solution to the thundering-herd problem. Every browser updating at once is bad, not just for Mozilla's servers, but for every piece of Internet infrastructure that those browsers (and their arbitrary set of addons) talk to when they update/restart. Within the time budget you have for running a rolling update, you ideally want as few machines updating concurrently as possible, just because you don't want to generate mysterious correlated traffic bursts that make NOCs paranoid.
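The bucketing mechanism described above can be sketched out in a few lines of Python. This is a guess at the general pattern, not Normandy's actual implementation; the names and bucket count here are made up:

```python
import hashlib

TOTAL_BUCKETS = 10_000  # hypothetical bucket count


def bucket_for(client_id: str) -> int:
    # Hash a stable per-install ID into a deterministic bucket.
    digest = hashlib.sha256(client_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % TOTAL_BUCKETS


def in_rollout(client_id: str, rollout_percent: float) -> bool:
    # The server widens rollout_percent over time. Each client's bucket
    # is fixed, so widening only ever adds clients, and dialing the
    # percentage back down shrinks the set again (the rollback case).
    return bucket_for(client_id) < TOTAL_BUCKETS * rollout_percent / 100.0
```

Because buckets are admitted in a fixed order as the percentage rises, the update front moves gradually instead of every client fetching at once, which is exactly the thundering-herd mitigation described above.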


We’ve always called this Canary Updates.

You go a bit in, wait to see if the canary survives, then go further.


You can hover the date to see the full timestamp, it's 2019-05-04 07:01:35 UTC-7.


It's an annoying UX antipattern, the only thing worse being not adding the on-hover text at all. Displaying date and time should be the default.

HN has this problem too.


Not available on all clients, notably mobile.


Actually, if I'm not mistaken, the complete timestamp is available when you hover over the date.


Poor communication with zero accountability. A few hours could be 2, it could be 12.

“I (name the individual accountable) will give you an update at 12:00 PT (name a time) as an update to this post (name the communication channel) with the current status and latest information on this issue (don’t promise time to resolution, just time to info).”

Simple, clear, concrete, and unambiguous. I had hoped that Firefox had better communication procedures in the event of global-impact P1 issues.


Granted I'm using Nightly (and previously disabled extension signing in about:config), but now all my themes are disabled, even the default one apparently, though that is what it is using. Cannot be re-enabled. When do I get my dark theme back?

Also...my default search engine is now Amazon.com?? WTF is going on.

EDIT: Also my only search engine. Heck of a job Mozilla.


Bezos and his strategy to name the company something starting with A so it always comes first finally pays off.


Now I understand why Google renamed itself Alphabet. One step ahead, literally.


Have you been able to restore your search providers? Really a kick in the nuts, this one..


Nope. Maybe I should restart Nightly again but...I just added Google back, and it was not as easy as I'd expect. The Firefox "add search engines" page is a mess and straight Google search was like 3 pages down.

Also I found another dark theme that is somewhat similar to what I had (but not official Mozilla) and installed that. All the default ones are still disabled.


Thanks.. Same here on every machine.. The hotfix update has been installed and add-on functionality is indeed recovered, but still I only have "amazon.com" as a pre-installed search provider and have been unable to restore the defaults without creating a completely new profile. Besides, I would be rather hesitant to install search providers from that list, imho.


For me (repository firefox on ArchLinux), the temporary fix was setting devtools.chrome.enabled = true in about:config, and running this small JavaScript snippet in Chrome DevTools (Ctrl+Shift+J):

https://wiki.archlinux.org/index.php/Firefox#Firefox_disable...

AFAIK, this will enable all the disabled add-ons until the next check, which is in 24 hours. This will hopefully be enough time for Mozilla to release a stable channel update, instead of the "Studies hack".

At least for me, fiddling with the Studies settings had no effect; the about:studies page remained empty regardless of what I did. I've also seen multiple reports from people who got the Studies hack working that the fix actually failed to address the issue properly.


On Linux you should also have the option of disabling the cert signature check. Go into about:config and set `xpinstall.signatures.required` to false. Just remember to set it back to true once this is fixed.


It appears to take several minutes for the studies to be downloaded and enabled. The process worked for me on two separate Windows 10 machines.


Very curious how the decision to use the Studies program happened. Why not just roll the version early and include the fix in the new version - isn't Firefox an evergreen browser now? Maybe there is extra bureaucracy to roll a new version, or the hotfixers didn't have permission to do it. Either of which I can understand, being that they made the fix late on a Friday night - so huge kudos to those who worked hard to get this fix started. Seems like a good case study for lessons learned here, I'm eagerly anticipating the postmortem and follow-ups.


Doing a full release takes a lot longer, so they presumably decided to use a faster method where possible. https://hacks.mozilla.org/2018/03/shipping-a-security-update...


Certificate expirations show up on outage post-mortems so frequently, yet little has been done about the underlying issue in how PKI works. Trust is time-limited by design, but there is oddly no spec or required workflow for the most common case: saying "yes, this is still trusted."

Could we have, for example, a publicly verifiable ledger for checking a cert chain, with a defined workflow for answering whether a cert is still trusted, and a requirement that the workflow be fully implemented? Seems quite doable, versus the current hit-and-miss auto-renew hacks, which vary by CA.

In other words, when do we fix the sport rather than the players here?
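Until the sport is fixed, the practical mitigation is monitoring expirations well in advance. A minimal sketch, assuming Python's stdlib `ssl` helper and the `notAfter` string format that `ssl.SSLSocket.getpeercert()` returns:

```python
import ssl
from datetime import datetime, timedelta, timezone


def days_until_expiry(not_after: str) -> float:
    # not_after uses the format from getpeercert()'s "notAfter" field,
    # e.g. "Jan  5 09:34:43 2018 GMT" (negative result = already expired).
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(not_after), tz=timezone.utc)
    return (expires - datetime.now(timezone.utc)) / timedelta(days=1)


def needs_renewal(not_after: str, threshold_days: float = 30) -> bool:
    # Page someone well before the deadline, not at it.
    return days_until_expiry(not_after) < threshold_days
```

Wired into a cron job over every intermediate and leaf cert a product ships, a check like this turns "the cert expired in production" into "a ticket three weeks out."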


Interesting. Sadly, I imagine many users will have studies disabled since the Mr. Robot incident. I've re-enabled it but there does not appear to be a way to force it to check for updates. Guess it will show up in the next 6 hours.


You can set app.normandy.first_run to true and then restart Firefox.


I've seen the same tip elsewhere and tried, but didn't work for me.

What did (seem to) help was setting app.normandy.run_interval_seconds to a small value (21). At least just a couple of seconds after I did, all my addons came back.

Edit: plugins -> addons


Remember to put it back to normal after!


Thanks, that worked!

Out of curiosity, how did you know that? What does this actually do?


I think it just triggers firefox to check for any new studies on the next launch

I stumbled upon this myself earlier


I just checked my preferences to make sure it's still disabled.


FWIW, they say you can disable studies as soon as the hotfix is loaded. Seems to work for me.


The most frustrating part of this for me is there is no (relatively) easy way to override this behavior. It's fine to disable the add-ons, but please allow me to "understand the risks" and continue against Firefox's recommendations.

The feeling of no control over my web browser was why I left Chrome in the first place.


On Android I get this:

>We rolled out a hotfix that re-enables affected add-ons. The fix will be automatically applied in the background within the next few hours. For more details, please check out the update at https://support.mozilla.org/en-US/kb/add-ons-failing-install...

Which is like "we did something we shouldn't have causing unauthorised changes to your computer, so we're going to make unauthorised changes to fix it".

Quite telling is that this is supposed to protect us from other developers. On the add-on screen "Enable" is greyed out, there's no "Enable even though Mozilla doesn't like it".

The UX is just like the "fuck you this computer isn't yours it belongs to Microsoft and we'll do what we like with it" that I thought I'd left in the past decades ago.

It's not your computer Mozilla, you fucked it up, you don't get to mess around with it without asking the owner.

My understanding is that this is literally illegal in the UK.

Mozilla barely had any trust left to burn IMO but they sure went all out.


It's worth reading the comments on the original HN article about Firefox extension signing becoming mandatory. Some of them are eerily prescient:

https://news.ycombinator.com/item?id=10038999


If Mozilla had barely any trust left, then the industry as a whole is truly fucked.


I could say that for reliable software like SQLite or curl.

But for Firefox? My expectations for browsers in general aren't anywhere near high enough to warrant raised eyebrows even in the face of monumental fuckups like the one we're seeing today. Especially because users tried to warn that this could happen and Mozilla stubbornly said NO: https://news.ycombinator.com/item?id=10038999


As I understand it, you agree to the terms of Studies as part of the ToS agreed to on installation. You can disable it later. And--while it was a ridiculous mistake--they didn't make any "unauthorized changes" to your computer. They just let a certificate expire and your computer, running the same code it always had, stopped trusting it.


Hiding behind the ToS is ridiculous. Nobody reads them, and a moral company should never assume that because something is in the ToS it actually has informed consent.


At what point is it your responsibility to verify that something you're using is keeping its end of the bargain? Ceding all responsibility doesn't seem like the answer either. While forced arbitration and other crappy things in contracts suck, a company protecting itself against explicitly stupid or bad behavior seems reasonable. While it's reasonable to expect a consumer to understand hot coffee is hot, it's unreasonable to say that a customer who has purchased coffee that gives them second-degree burns had reasonable understanding of the risks. I think the only real solution would be putting customers through plain-English video presentations of what rights are present or excluded for every piece of software. This would be terribly inconvenient, however it would make sure people only installed software they trusted and actually needed.


> Ceding all responsibility doesn't seem like the answer either.

Whether I cede my responsibility or not does not excuse a company from acting morally. You can't act immorally and say it's okay because your users have given informed consent when you know for a fact that they haven't. Studies show it would take an average user more than a year of reading to get through all the EULAs they agree to in a single year. It is literally impossible for an average computer user to read all the agreements they "agree" to.

> I think the only real solution...

I'm not sure that this problem needs a solution exactly. The status quo is fairly reasonable: companies are expected to act morally, and when they don't, they get called out on social media and people move on to other companies. The primary threat to this model working is when overly pedantic people on HN and other sites excuse companies' actions by saying things like "Well actually you _did_ agree to it by the EULA, so you shouldn't be surprised that they took your first born child, next time pay more attention to what you agree to!"


[flagged]


Define socialism and let me know how economic theory impacts civil liberties. The US has had socialist programs for a long time and hasn't devolved into madness. If I recall correctly, it's been the conservative Congress that has stripped protections by establishing "free speech zones"; they pushed through the Patriot Act, which stripped more civil liberties, they expanded the surveillance state, and they removed civil liberties from everyone within 100 miles of the border, including airports. But you're right, a business being told it can't discriminate against people when baking a cake is the absolute worst possible outcome. The cake maker had their artistic rights upheld by the courts as well, so I'm not sure what you're particularly upset about. Those other things I mentioned are still ongoing. I think perhaps you would benefit from adjusting your perspective and spending your valuable time where it actually could be impactful and useful to yourself and other citizens.


> My understanding is that this is literally illegal in the UK.

What about this is against UK law?


Can we take a moment and consider the side effects?

This is a once in a lifetime chance for Google & Co. to get a glimpse of all those sly fuckers hiding behind adblockers.

This effectively uncloaked a very specific subset of Internet users and exposed them to the very companies that they've been actively trying to avoid. Not just those who avoid Chrome, but those who take extra steps to explicitly evade the tracking.

Surely Mozilla, the privacy advocate, must understand the impact of this fuck up, and yet the offered "fix" doesn't even mention a one-click .xpi install, but rather asks to enable a mechanism that, if left enabled, will grant unnecessary control to Mozilla over people's installs.

This ain't right.


For me at least, it was pleasantly surprising that Google either has no inventory or just no clue who I am, as when ublock got disabled by this bug, YouTube started presenting me with ads for cars in mostly Japanese, with prices in Yen and "Singapore stock also available".

I guess because I watched anime videos?


It would be interesting to see what they show for me, but I'm avoiding surfing websites other than HN until it gets fixed (FF for Android).


You can disable the signature verification on Firefox for Android, using about:config and searching for the flag named 'xpinstall.signatures.required'.

Perhaps leave yourself a note to change it back once an update ships : )


FWIW I think it's pretty easy to test if someone is using an adblocker anyway. (I see sites do the "It looks like you're using an adblocker" thing all the time). I don't know if there's any realistic way to entirely hide that.


That's a bit different though- generally speaking those messages show up when the javascript tracker can't talk to the server it's communicating with. Even though it's "detecting" the adblock it isn't able to send information back from the client about it.


> Even though it's "detecting" the adblock it isn't able to send information back from the client about it.

Sure they can, they can just send back a resource request. It could even be for like an image with a query string attached with it, it doesn't have to be an ajax request necessarily.
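
As a sketch of what that looks like (function name and endpoint are my own invention, purely illustrative):

```javascript
// Hypothetical sketch of the technique described above: the detection
// result rides back to the server in an ordinary image request's query
// string, so no XHR/fetch is needed at all.
function beaconUrl(base, blocked) {
  return `${base}?adblock=${blocked ? 1 : 0}`;
}

// In a browser, firing the beacon is just:
//   new Image().src = beaconUrl("https://ads.example/beacon.gif", true);
```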


The ad blockers are smart enough to detect and block that though.


When disaster kicked in, my Firefox prominently displayed a yellow bar informing me that addons had been disabled due to [whatever].

So I wasn't exactly left unaware that Bad Stuff had happened. I could - and did - shut the thing down, apply a fix, and be back up running normally within a few minutes.

My particular brand of fix: https://wiki.archlinux.org/index.php/Firefox#Firefox_disable...


I can't help but consider the tinfoil hat aspects of this matter. I would like to learn more about the sequence of events that led to this snafu. Would an actor know that by making "an error" at a given point in time, there would be a deterministic window of time in the future in which Firefox users worldwide would be affected by the consequences?


As much as I love my tinfoil hat, I can't see any upside for mozilla in this? If it's to get people to opt in to "studies", they already have most people sending them telemetry, apparently, so this would just be to get stragglers? Seems like a really expensive way to do it.


You need to adjust the angle of your tinfoil hat. Oriented properly, you can find other candidates for "actor" besides "Mozilla".


If you have the right hat, with the right amount of tin, you can even start playing with words that rhyme with fuselage. A good exercise, especially if you like practicing corporate fiction.


I still don't understand why Mozilla is taking so long to simply post instructions for the .xpi install as you mention, hosted on their own domain?

it really does feel like "the great oops-certificate expiration foul play"


FTA:

> There are a number of work-arounds being discussed in the community. These are not recommended as they may conflict with fixes we are deploying...

Let's give them a little time to get this fixed..


Although it may not be as effective as uBlock and other tools, Firefox's incognito mode has some protection against trackers (Content Blocking set to Strict).


>Surely Mozilla, the privacy advocate


The post is still great despite that weakness.


Mozilla decided to make signing mandatory, then screwed up and now they're trying to fix it by making use of a feature that basically allows them to remotely execute code on all their users silently?

I checked Mozilla's main site again, and it still has this ironic statement in its description tag (it's been there for many years):

https://www.mozilla.org/en-US/firefox/new/

Firefox is created by a global non-profit dedicated to putting individuals in control online.

...I guess it's more dedicated to putting Mozilla in control now. Something about this whole incident brings up a point that just feels very wrong to me --- it's not a Google or Microsoft, but the fact that Mozilla also seems to have this large amount of control over its users is unsettling.


I'm not gonna bother with 'studies' or manual workaround - I'm just going to wait for an update.

In the meantime I'm enjoying trying out Vivaldi[1] - really reminds me of opera 3/4, that I loved.

1: https://vivaldi.com


Hmm, I'm not sure switching to a closed source browser is in any way an upgrade...

Especially when you could switch to the unbranded/nightly firefox builds, disable addon signature checking and continue using the only independent FOSS browser remaining.


Vivaldi is only slightly more closed source than Chrome.

Anyway, used to be Firefox had a lot more going for it than being open source. Now that's really the only thing left I can think of.


Will Vivaldi let me put the tabs on bottom where they belong?


What? Tabs belong on the side. In a tree. ;-)


Yes, Vivaldi supports tabs on your choice of any of the 4 sides of the window.


Instead of enabling studies just click on this link. It installs that specific "study" (hotfix) without installing anything else.

https://storage.googleapis.com/moz-fx-normandy-prod-addons/e...


Hold up there. Before people start clicking and installing random add-on links, how about linking to something official (either from a FF dev, or in a soure repository) that references this URL?


So, I just got this url from this HN comment:

https://news.ycombinator.com/item?id=19825921

I'm not clear if they rehosted the XPI or if that's the original mozilla url.

I'm not too worried about it either. The only reason anyone is clicking on this fine link is because firefox only lets you install addons signed by Mozilla. And since the typical signing process gives addons signed by the broken intermediary we can be pretty confident that this wasn't just signed by mozilla, but is the original study.

In general caution about installing software from random links is definitely a good idea though.

Edit: Looks to me like it's an original Mozilla URL, judging by GitHub comments on mozilla/normandy (I haven't found an official statement saying it is official, but I stopped searching): https://github.com/mozilla/normandy/pull/1697


>I'm not too worried about it either. The only reason anyone is clicking on this fine link is because firefox only lets you install addons signed by Mozilla.

    unzip *.xpi
    nano META-INF/manifest.mf

gives me

    Manifest-Version: 1.0

    Name: background.js
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: pcBRGwbuhPz06VrGWmAitQ==
    SHA1-Digest: szDd6YcB3bpF+NusZhEHhmMDi5U=

    Name: content.js
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: CGOATrflEiq+QEu1IZlFvQ==
    SHA1-Digest: ps2bMGGRQdb4E7VOakqQEhJ8M5c=

    Name: content.js.map
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: FY98a5hwQKH3g1fKcGK04A==
    SHA1-Digest: bAzZBP+YQ3EDWUXpqzKcTUw35Y0=

    Name: manifest.json
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: eEm4sDKemttFN7G7JeLo0g==
    SHA1-Digest: 5W8OY1mk3QjECHzHna00iNXo9mM=

    Name: experiments/skeleton/api.js
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: 0RBtD2TRmeE30v9+4TxXYA==
    SHA1-Digest: 2Uq9PO2H1iks/Cb7VAkfGrrD6hA=

    Name: experiments/skeleton/schema.json
    Digest-Algorithms: MD5 SHA1
    MD5-Digest: nSzuviuP+VtUvjE4IyIVhQ==
    SHA1-Digest: W311W+MXcHSsHIVFP15zxGUmQS8=

===

The hashes that certify the integrity of the files are under rigorous protection of ... MD5 and SHA1 (!)


What's your point?

There might be a very difficult preimage attack on MD5.

There's no evidence of a preimage attack on SHA1.

There is absolutely no way you're doing both at once.


The blog post linked by this HN post is the official URL for the patch. Any installation method not described there is unofficial DIY, no matter how Mozilla-signed any given version of the XPI is.


Sir, this is Hacker News.


Yes, we are all quite advanced enough to footgun ourselves with abandon :) For everyone else, the fix is magically healing their browser without any intervention at all, and some of my high-skilled tech friends haven't even noticed yet because they're weekending and this all resolved itself before they realized it. Never underestimate the burden that being an "expert" places on your future time spent.


That hadn't occurred to me, but the fact that this is occurring on a weekend probably mitigates the impact to organizations that operate Monday-through-Friday.

Sucks for the Mozillans who are scrambling right now though. Hope they get a long weekend to compensate.


I have to assume that’s why they focused on releasing a fix first and communicating second, because there was still hope to save everyone before the impact worsened. I hope they’re able to get at least a few hours of rest before Monday.


> Sucks for the Mozillans who are scrambling right now though.

I have little sympathy for the people who ruined my life - they push a fix and put their feet up, while I'm left to reconfigure all my addons and containers for days.


This is true. However it is signed by moz and looking at the source it seems safe enough (the cert is legit). It's just a normal wrapper with the following code added:

    // first inject the new cert
    try {
      let intermediate = "MIIHLTCCBRWgAwIBAgIDEAAIMA0GCSqGSIb3DQEBDAUAMH0xCzAJBgNVBAYTAlVTMRwwGgYDVQQKExNNb3ppbGxhIENvcnBvcmF0aW9uMS8wLQYDVQQLEyZNb3ppbGxhIEFNTyBQcm9kdWN0aW9uIFNpZ25pbmcgU2VydmljZTEfMB0GA1UEAxMWcm9vdC1jYS1wcm9kdWN0aW9uLWFtbzAeFw0xNTA0MDQwMDAwMDBaFw0yNTA0MDQwMDAwMDBaMIGnMQswCQYDVQQGEwJVUzEcMBoGA1UEChMTTW96aWxsYSBDb3Jwb3JhdGlvbjEvMC0GA1UECxMmTW96aWxsYSBBTU8gUHJvZHVjdGlvbiBTaWduaW5nIFNlcnZpY2UxJjAkBgNVBAMTHXNpZ25pbmdjYTEuYWRkb25zLm1vemlsbGEub3JnMSEwHwYJKoZIhvcNAQkBFhJmb3hzZWNAbW96aWxsYS5jb20wggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIKAoICAQC/qluiiI+wO6qGA4vH7cHvWvXpdju9JnvbwnrbYmxhtUpfS68LbdjGGtv7RP6F1XhHT4MU3v4GuMulH0E4Wfalm8evsb3tBJRMJPICJX5UCLi6VJ6J2vipXSWBf8xbcOB+PY5Kk6L+EZiWaepiM23CdaZjNOJCAB6wFHlGe+zUk87whpLa7GrtrHjTb8u9TSS+mwjhvgfP8ILZrWhzb5H/ybgmD7jYaJGIDY/WDmq1gVe03fShxD09Ml1P7H38o5kbFLnbbqpqC6n8SfUI31MiJAXAN2e6rAOM8EmocAY0EC5KUooXKRsYvHzhwwHkwIbbe6QpTUlIqvw1MPlQPs7Zu/MBnVmyGTSqJxtYoklr0MaEXnJNY3g3FDf1R0Opp2/BEY9Vh3Fc9Pq6qWIhGoMyWdueoSYa+GURqDbsuYnk7ZkysxK+yRoFJu4x3TUBmMKM14jQKLgxvuIzWVn6qg6cw7ye/DYNufc+DSPSTSakSsWJ9IPxiAU7xJ+GCMzaZ10Y3VGOybGLuPxDlSd6KALAoMcl9ghB2mvfB0N3wv6uWnbKuxihq/qDps+FjliNvr7C66mIVH+9rkyHIy6GgIUlwr7E88Qqw+SQeNeph6NIY85PL4p0Y8KivKP4J928tpp18wLuHNbIG+YaUk5WUDZ6/2621pi19UZQ8iiHxN/XKQIDAQABo4IBiTCCAYUwDAYDVR0TBAUwAwEB/zAOBgNVHQ8BAf8EBAMCAQYwFgYDVR0lAQH/BAwwCgYIKwYBBQUHAwMwHQYDVR0OBBYEFBY++xz/DCuT+JsV1y2jwuZ4YdztMIGoBgNVHSMEgaAwgZ2AFLO86lh0q+FueCqyq5wjHqhjLJe3oYGBpH8wfTELMAkGA1UEBhMCVVMxHDAaBgNVBAoTE01vemlsbGEgQ29ycG9yYXRpb24xLzAtBgNVBAsTJk1vemlsbGEgQU1PIFByb2R1Y3Rpb24gU2lnbmluZyBTZXJ2aWNlMR8wHQYDVQQDExZyb290LWNhLXByb2R1Y3Rpb24tYW1vggEBMDMGCWCGSAGG+EIBBAQmFiRodHRwOi8vYWRkb25zLm1vemlsbGEub3JnL2NhL2NybC5wZW0wTgYDVR0eBEcwRaFDMCCCHi5jb250ZW50LXNpZ25hdHVyZS5tb3ppbGxhLm9yZzAfgh1jb250ZW50LXNpZ25hdHVyZS5tb3ppbGxhLm9yZzANBgkqhkiG9w0BAQwFAAOCAgEAX1PNli/zErw3tK3S9Bv803RV4tHkrMa5xztxzlWja0VAUJKEQx7f1yM8vmcQJ9g5RE8WFc43IePwzbAoum5F4BTM7tqM//+e476F1YUgB7SnkDTVpBOnV5vRLz1Si4iJ/U0HUvMUvNJEweXvKg/DNbXuCreSvTEAawmRIxqNYoaigQD8x4hCzGcVtIi5Xk2aMCJW2K/6Jq
kN50pnLBNkPx6FeiYMJCP8z0FIz3fv53FHgu3oeDhi2u3VdONjK3aaFWTlKNiGeDU0/lr0suWfQLsNyphTMbYKyTqQYHxXYJno9PuNi7e1903PvM47fKB5bFmSLyzB1hB1YIVLj0/YqD4nz3lADDB91gMBB7vR2h5bRjFqLOxuOutNNcNRnv7UPqtVCtLF2jVb4/AmdJU78jpfDs+BgY/t2bnGBVFBuwqS2Kult/2kth4YMrL5DrURIM8oXWVQRBKxzr843yDmHo8+2rqxLnZcmWoe8yQ41srZ4IB+V3w2TIAd4gxZAB0Xa6KfnR4D8RgE5sgmgQoK7Y/hdvd9Ahu0WEZI8Eg+mDeCeojWcyjF+dt6c2oERiTmFTIFUoojEjJwLyIqHKt+eApEYpF7imaWcumFN1jR+iUjE4ZSUoVxGtZ/Jdnkf8VVQMhiBA+i7r5PsfrHq+lqTTGOg+GzYx7OmoeJAT0zo4c=";
      let certDB = Cc["@mozilla.org/security/x509certdb;1"].getService(Ci.nsIX509CertDB);
      certDB.addCertFromBase64(intermediate, ",,");
      console.log("new intermediate certificate added");
    } catch (e) {
      console.error("failed to add new intermediate certificate:", e);
    }

    // Second, force a re-verify of signatures
    try {
      XPIDatabase.verifySignatures();
      console.log("signatures re-verified");
    } catch (e) {
      console.error("failed to re-verify signatures:", e);
    }


Out of interest, what's special about this add-on that allows it to install intermediate certificates like this vs. an add-on that any random dev could write?


In the manifest it has a special "experiment_apis":

    "experiment_apis": {
        "skeleton": {
            "schema": "experiments/skeleton/schema.json",
            "parent": {
                "scopes": [
                    "addon_parent"
                ],
                "script": "experiments/skeleton/api.js",
                "paths": [
                    [
                        "experiments",
                        "skeleton"
                    ]
                ]
            }
        }
    }
Only Mozilla can use these on release versions of Firefox. If you want some more details then try here: https://firefox-source-docs.mozilla.org/toolkit/components/e...


Also how can I verify that intermediate certificate? Is that a base64 encoded string?


Where does it say specifically on that page you linked that only Mozilla can use these apis?


It's installed as a study addon, only Mozilla could install these.


> It's installed as a study addon, only Mozilla could install these.

And that gives it access to use `Cc`/Components.classes?


Not familiar with the specifics, just know that Mozilla uses this mechanism to sometimes ship experimental features to a select group of Firefox users that have it enabled and they used it today to issue this hotfix to anyone who has field studies enabled.


Why can't they just release that certificate for everyone to install on all affected non-recent and derivative builds (like Tor Browser)? Or is the internal certificate storage different from the one that is configurable in settings? Then just tell us which file to change, and the community (that everyone likes to mention so much) will come up with ways to patch it much faster than you think.


Thanks to this script, I think I just managed to apply the patch to an old Firefox 56 install, whereas the .xpi had no effect.


There are other people here who I think would really appreciate details if you still have them.



OMG. "Don't trust Mozilla to install something on your machine. Click this link instead!"

Has the "privacy" community finally jumped the shark?


To be perfectly clear I trust Mozilla... this link is a link to code signed by Mozilla that I personally didn't even bother to audit because it was signed by Mozilla.

I just don't want to enable shield studies, because it looks to me like they haven't disabled the other shield studies while distributing this fix, and I don't want to install the other shield studies.


I think what they're saying is "don't trust someone to push software to your machine that you can't see. Instead, download and study this binary!"


Yep. Or even if it's not being actively studied, it's being consciously obtained rather than pushed in the background. And I feel relatively safe because of the community's discussion here.


it's "install known-good software from mozilla"

vs

"whenever mozilla has crazy marketing or security ideas in the future, let them immediately and randomly install whatever, which maybe seems like a good idea for the mythical average user but is probably terrible for you"


Anybody have any hints for someone who tries to install this and gets a connection error?

EDIT: Thanks to HN User gpm for suggesting a possible fix for this [1]. Right-click, save-as the XPI to somewhere on your computer (or use curl, wget or whatever tool of you choice), and then run it within Firefox. That might work (it did in my case).

EDIT 2: Also, interestingly, the blog post does have an update saying "There are a number of work-arounds being discussed in the community. These are not recommended as they may conflict with fixes we are deploying.", so use at your own risk, I guess.

[1]: https://news.ycombinator.com/item?id=19828669


You would think they could have linked to that in their blog post, since people who have disabled "studies" have probably done so for a reason. Telemetry is bad enough; even without the "Mr. Robot" thing, there's no way I would let Mozilla randomly push changes to my browser just to see what happens.


What’s bad about telemetry?


It's just another database collecting unknown information about me ("anonymized" in some way that may be reversible), stored for an unknown length of time, and enabled by default. Just ask. Plenty of people will beta test software for a $20 gift certificate, or even for free, but they should be given a choice.


Mozilla does ask.


Do they? I seem to remember a great kerfuffle somewhat recently over telemetry being opt-out.

Studies _must_ be opt-out given the amount of users Mozilla says the fix covers, and they're basically a form of telemetry, in their intended use anyway.


How do I uninstall this? It doesn't show up anywhere after installation.


That's a good question. I'm not sure if there is a better way but I would just delete it from <profile>/extensions. You can find <profile> by going to about:support and looking for "Profile Directory" (6th from the top for me).



`about:studies` will show active studies and allow you to remove


I am not sure on others, but it does not show in "about:studies" on mine since I manually added it.


Same.



I uninstalled it and my add-ons are still working!


This works for Firefox on Android too.


Thank you. I can also verify this works on FF for Android.


Install the official beta. This solved the issue for me https://www.mozilla.org/en-US/firefox/beta/all/


Thanks. That was easy.


Please, don't tell people to do this. This is how computers get infected. People should know that clicking on a random link posted by an anonymous guy on a forum page is one of the worst things they can ever do.


I addressed above (will probably stay above, it's the top reply) about why I felt safe clicking this link myself and think others should too.

You're right that in general training them to listen to anonymous forum posts is less than ideal, but all in all I'd rather they have a working browser. As a side benefit they get to see posts like this that rightly point out you shouldn't trust strangers on the internet too much.


Great idea!

Install from a random web link to file on a "cloud" server.

What could possibly go wrong!


This one will be emotional, as this destroyed some of today's work.

F you Mozilla. I lost all my tabs opened in other containers. The containers don't work too, so I cannot reopen them.

This bug has been known for 3 years, and you did nothing to fix it. You get so much money, and what you do is basically provide pathetic software (Thunderbird) and a nice browser (which you just stopped from working), and you show me banners asking for more money.

You should be ashamed. 3 years. And no, I'm not going to listen to things like "this is open source, you are free to fix it". I will just go and switch to another browser. I need a browser which works, not one which suddenly decides that my stuff should be broken because all the developers and managers have been ignoring a critical issue for a couple of years.

:( I know, this will be flagged, and removed. I don't care, I just need to get all my tabs in containers back. I have never thought that a browser can just close my tabs because a certificate expired.


I lose all tabs occasionally. Browsers aren't perfect, it does happen after a weird crash, or something. It's exceedingly rare, like maybe twice a year.

With that said, I've always considered tabs to be volatile state. Browsers make their best effort to e.g. restore the previous session after a crash, but if you want non-volatile browser state, you should use bookmarks.


Except I specifically use an extension to ensure tab state isn't volatile (Tab Session Manager, backed up with export tabs urls & a custom script), since I don't want to bookmark 35 pages for a current project; I just want to save them to a named session and have everything available when I return to the project.

I agree that tabs are volatile, although I really wish they weren't. I'm having flashbacks to the quantum switch and having to change most of my extensions. I'd consider switching, but I really don't care for the chromium monoculture that's developing.


Eh, for me only that often if you include stuff caused by a dumb user (aka me). And even then I always got them back one way or another. Usually from %APPDATA%\Mozilla\Firefox\Profiles\[profile]\sessionstore-backups (thankfully there's usually a recent copy, since I'm on a test version) or last resort from backups.


It's not just tabs, but for many users the containers themselves are gone - as well as their cookies and other associated assumed-to-be-non-ephemeral state.


Yep. I just got hit by the bug. Enabled studies, got the fix, but all my configured containers are gone. NOT happy right now.


True. However, this time they are closed because the world changed (due to someone ignoring a bug and a certificate expiration date).

Btw, I haven't got any Fx crash for the last 3 or 4 years.

I hope an ignorant programmer/manager won't try removing my bookmarks because they lead to a page with an expired certificate.


Every browser I've used has managed to lose my open tabs; Chrome did it most frequently and with no obvious way to restore them. It hurts every time, but the only real answer, which you will not be happy to hear, is that you shouldn't depend on the browser saving your currently open tabs. That's just asking for trouble.


Ironically, I've had excellent luck with a Firefox extension (tab session manager). I've had crashes which lost tabs, normally from me being an idiot, but it's done an excellent job preserving sessions automatically.


Hey, just FYI, there are some good solutions in this thread to get your problem fixed ASAP.

This one should work. https://news.ycombinator.com/item?id=19827302


No. JUST A HUGE NOOOO.

My reply is here https://news.ycombinator.com/item?id=19828472


We’re literally in this mess because Firefox requires digital signatures on this kind of thing, please don’t make FUD posts.

This is much better than disabling the very same safe guard, signature checking, that prevents you from running arbitrary code in the first place.


How is this better than enabling studies, a setting that is already part of Firefox? Installing add-ons from random sources can be risky.

But anyways, I'm not sure why, but the addons on my main computer remained enabled... unlike my 2 other computers.


It's cryptographically signed by Mozilla. The signature is much more important than the source.


I clicked that link and it displayed a puzzle piece with a one-way/no-entry symbol (https://i.stack.imgur.com/eVpMr.png)... not sure how I can know that this was signed by Mozilla, a company that I trust less every year


The UI doesn't tell you, unfortunately. You would have to verify it out of band. But the browser already forces the verification, which you can verify by noticing all your add-ons are literally disabled because the signature checking is failing on them.

AFAIK the reason the UI is scary is primarily because it is from an era when add-ons were much more dangerous, and when they were not required to be centrally signed. Neither is true anymore afaik.


Firefox keeps session backups in the profile folder. You might be able to recover your open tabs with it. I've been in a similar situation with tab groups and managed to recover all of them, though I don't remember exactly how.


previous.jsonlz4 and upgrade.jsonlz4-datenumbers

They will almost certainly work to get tabs back when copied over the profile's sessionstore.jsonlz4

If something has gone really wrong then the tab URLs can be extracted.
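
For the "really wrong" case, here's a hedged sketch of the container format (the function name is mine; the actual decompression step needs an LZ4 library, which isn't in Node's stdlib):

```javascript
// Firefox's .jsonlz4 session files are "mozLz4" containers: an 8-byte
// magic ("mozLz40\0"), a 4-byte little-endian uncompressed size, then a
// raw LZ4 block holding the session JSON. This parses the header only;
// feed `payload` to an LZ4 block decompressor to recover the JSON and
// pull the tab URLs out of it.
function parseMozLz4(buf) {
  if (buf.slice(0, 8).toString("latin1") !== "mozLz40\0") {
    throw new Error("not a mozLz4 file");
  }
  return {
    uncompressedSize: buf.readUInt32LE(8),
    payload: buf.slice(12),
  };
}
```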


This is why you always have one of your employees keep their clock 30 days in the future.

Preferably someone who doesn't go to meetings and never installs updates.


Another workaround if you don't want to enable "studies" is to manually re-load the add-ons in Debug Mode. I don't know the full consequences of this, but Firefox seems to be behaving normally having done it.

Go to about:debugging from the address bar. Right at the top is a button to "Load Temporary Add-on", with a checkbox "Enable add-on debugging". (On a Mac, the add-ons are in ~/Library/Application Support/Firefox/Profiles/«ID».default/extensions (assuming that you have only a single profile).) They should stay enabled until Firefox relaunches.


From a UX point of view I don’t know why tools don’t have these two features:

1. “Warning, a critical method for verifying authenticity is set to expire in X days. Please visit <Y> to update now.”

2. “A critical verification certificate has expired; while you should immediately go to <Z> to obtain an update, you may defer authentication for up to 5 more days.”

...or in other words, why can’t tools cut us some slack on either side of a deadline? Security for most things is not going to fall apart just by giving people a little room to deal with issues on their own schedule.
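
A rough sketch of what that two-sided grace window could look like (the constants and names here are invented for illustration, not anything Firefox actually does):

```javascript
// Illustrative only: classify a certificate's expiry state with a
// warning window before notAfter (feature 1) and a deferral window
// after it (feature 2), instead of a hard cutoff.
const DAY_MS = 24 * 60 * 60 * 1000;
const WARN_DAYS = 30;  // warn this far ahead of expiry
const GRACE_DAYS = 5;  // allow deferral this long after expiry

function certStatus(notAfterMs, nowMs) {
  const remaining = notAfterMs - nowMs;
  if (remaining > WARN_DAYS * DAY_MS) return { state: "valid" };
  if (remaining > 0) {
    return { state: "expiring", daysLeft: Math.ceil(remaining / DAY_MS) };
  }
  if (-remaining <= GRACE_DAYS * DAY_MS) {
    return { state: "grace", daysLeft: GRACE_DAYS - Math.floor(-remaining / DAY_MS) };
  }
  return { state: "expired" };
}
```

The "expiring" and "grace" states map to the two warning messages above; only "expired" would actually disable anything.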


That assumes they thought this could ever happen, which Mozilla clearly, and wrongly, didn't expect. It's a shame, but eh! Mistakes were made. Precautions will hopefully be taken.


How long until heads roll?

There's something really wrong with the organization.

And I thought it was only their marketing/pr that was bad.

> We can't afford to lose Mozilla and Firefox.

https://news.ycombinator.com/item?id=18800360


Nowadays they seem to have made it a hobby to generate negative headlines at least once every quarter. I fear there will be no negative repercussions for the leadership.

Basically, the management set their own salaries, the entire work force gets a 40% yearly bonus, and they have no one from the outside to report to.

On top of all of this, the money flows regardless of what anyone is doing. (While there is a yearly loss of 10% of their users, the past deal with Verizon made them very rich, so they can go on like this for years.) Revenue has only been going up, despite a loss of absolute users. This explains why they continue to do bad things even though outside observers cannot understand it: during the last 5 years, losing users did not impact their financials in any meaningful way. While people were complaining and users were leaving the product, revenue was increasing.

They do take care of their employees with lots of benefits and other stuff, so as an employee you don't want to risk all that with speaking up against your superior.

Over the years they have created a company culture where there are an endless number of small teams doing irrelevant stuff, with absurd hierarchies, with some people doing no work at all. With 16 people in upper management, there's also fragmentation of decision making going on. It's all a bit headless.

Due to the complicated hierarchies in the company, everyone is content with doing just enough to not make life harder for anyone else - suggest changing things fundamentally and actually working on delivering a great product, and you will not get very far.


This is categorically untrue, and unhelpful.


Some of the things I wrote I cannot prove, that's right. I would love to revise my negative opinion in light of better evidence.


Sadly, this removed my settings for multi-account containers extension :(


Just a heads up here. I was able to restore it partially on Windows using https://www.shadowexplorer.com/downloads.html (which is, btw, a great tool!)

You'd be looking for a file C:\Users\YOUR_USER_NAME\AppData\Roaming\Mozilla\Firefox\Profiles\YOUR_PROFILE\containers.json

and also ...\YOUR_PROFILE\browser-extension-data\@testpilot-containers


Thanks, you just saved me a headache.


(disclosure: I am a Mozilla employee but not commenting in any official capacity)

"Give me control over what code I run on my computer" (meaning "provide a switch to disable the requirement that extensions be signed") keeps coming up over and over. And perhaps it hasn't been clearly stated but the problem is this: if there's a switch that a user can flip, the browser has to record the state of that switch somewhere (presumably on disk). If such a switch becomes available, we'll quickly be flooded with malware that flips that switch without users' consent. At that point, there's no way to tell the difference between savvy users making an informed choice to enable unsigned extensions and malware doing it behind their backs. The browser can do various things to obscure the way that setting is stored, but ultimately any method the browser uses to read and write the state of that switch is something that other software can easily mimic.

This is not a theoretical concern; a modern web browser is an irresistible target for all sorts of get-rich-quick scammers -- if you don't experience this day-to-day, it's due in no small part to the fact that browser vendors among others are constantly working to keep the bad guys at bay. But make no mistake: the bad guys are out there and they quickly find and exploit any opportunities that are available to them.

So as to the problem of how to let users disable signing but ensure that they have made a conscious decision to do so, there is a stark tradeoff here: giving the most savvy users that switch necessarily makes other users less safe. The solution that Firefox has opted for here is to handle this tradeoff differently on different channels. The release channel (aka the stable channel, or the thing you get by default when you download Firefox) is intended for a very wide audience, and so it handles this tradeoff by favoring safety for all users regardless of their level of technical knowledge. The developer edition and nightly channels are intended for more technically savvy users and they handle this tradeoff differently; specifically they do provide a switch for disabling extension signing.

If there are other (practical and effective) ways to solve this problem of determining true user intent, I (and I'm sure many many others) would be very interested in hearing about them. In the mean time, using the mass-market versus developer-focused channels as a signal for users' preferences on the risk-configurability continuum seems like a reasonable way to handle this.


My gripes with the switch paradigm are that it:

a) isn't transparent

b) doesn't empower the user

c) isn't easily modifiable

a), b) and c) are the exact opposites of what open source software is meant to stand for. Firefox is slowly losing its unique position of being an amazing open source browser in favor of what seems to me a negligible increase in user security. In my mind, Mozilla is wasting time on micromanaging user risk instead of actually innovating.

To put it this way, every time I go out biking, I can get hit by a car. It is a known and well understood risk, one that I have to consider whenever making a turn. However, riding a bike also provides chances to go faster, meet new people and so on. Should Firefox aim to reduce my risk of being hit by a car? No, because I get to choose the level of risk in my life, not Mozilla.


You can choose the level of risk. If you want to run unsigned extensions, use Developer Edition or Nightly.


Imho, making it available via about:config switch would be entirely sufficient. There are dozens of settings that already affect security, like ssl handling, safe browsing, firstparty isolation, and tracking protection.

But where is the evidence that malware has ever switched off safebrowsing for example?

Your entire case for extension signing and AMO store moderation rests on the premise that it actually helps keep extensions safe, but then you say nothing is safe.

There is only one gateway for malware to change the about:config settings in the first place, and that is through your signed extension process.

How safe should things be?

Edit: Maybe you could allow disabling the signing process via enterprise policies under the condition that the about:config settings are locked, which in my understanding would make it basically impossible for extensions to change anything. Would that help make it more secure?


So the studies update has hilariously enabled my essential legacy extensions while leaving my more modern WebExtensions still disabled. Way to go, Mozilla.


From ghacks (comment section): https://www.ghacks.net/2019/05/04/your-firefox-extensions-ar...

This should allow the extensions to work until the next check (Verified locally):

1) Shut down Firefox

2) Open extensions.json (located by about:profile -> Root Directory)

3) Replace all instances of “appDisabled”:false to “appDisabled”:true

4) Replace all instances of “signedState”:-1 to “signedState”:2

5) Save and close extensions.json

6) Start Firefox

7) Close Firefox

8) Open extensions.json

9) Replace all instances of “appDisabled”:true to “appDisabled”:false

10) Start Firefox

11) Disable and re-enable all extensions in about:addons
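The manual find-and-replace above can also be scripted. A minimal Python sketch (not official tooling -- this is just an assumption that editing the flags directly is equivalent to the string replacements; back up extensions.json first and keep Firefox fully closed while editing):

```python
import json
from pathlib import Path

def patch_extensions_json(path, app_disabled):
    """Rewrite the appDisabled / signedState flags in extensions.json.

    Mirrors the manual steps above: run once with app_disabled=True
    (steps 3-4), start and close Firefox, then run again with
    app_disabled=False (step 9).
    """
    data = json.loads(Path(path).read_text(encoding="utf-8"))
    for addon in data.get("addons", []):
        addon["appDisabled"] = app_disabled
        # step 4: mark previously-failed signature checks as valid
        if addon.get("signedState") == -1:
            addon["signedState"] = 2
    Path(path).write_text(json.dumps(data), encoding="utf-8")
```

The path is the Root Directory shown in about:profiles plus extensions.json.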


it works!


No Firefox Studies on Android. Maybe they'll release a whole new version.


You can click on https://storage.googleapis.com/moz-fx-normandy-prod-addons/e... to get the hotfix on Android.


I tested this earlier and it works great here.


The study pushed, but all my installed plugins are listed as "unsupported" and says "<plugin> could not be verified for use in Firefox and has been disabled."


It's incredibly strange that this blog post doesn't even link to the bug description https://bugzilla.mozilla.org/show_bug.cgi?id=1548973 Imagine someone stumbling upon this post and trying to find more technical information... Poor communication.


I'm trying to be angry at Mozilla for messing up, but I can't force myself to do it. They ended up being that adorkable kid that spilled the paint all over the carpet, and you just sigh and start cleaning up the mess.


This isn't even working reliably. The hotfix study is showing up completed, not active, on my Firefox at home.

No idea why, there's no information about how to reactivate it. No, re-installing it didn't help.


Well, I'm about to make a lot of people happy with this info. I was researching this today as I'm using FF 56.0.2 and found the solution on this discussion thread. Leave it to an end user to do the job the professionals either failed or refused to do. It worked for me on 3 different machines. Go to this link and follow the instructions detailed:

http://bit.ly/2DUiOLN


If this hadn't happened, I probably wouldn't have discovered Brave. Noticeably faster and the ad-blocking is built in. Thanks Mozilla


hmm, i don't seem to have been affected by this bug somehow (my extensions are all still working). i turn off as much phoning home as i can (including turning studies off) and block connections to *.services.mozilla.com

any idea why i might not be affected? it may help others who might want to retain control of their firefox browser (chromium-based browsers being non-sequiturs).


I turn on mostly all phone home functionality in Firefox, and also wasn't affected. Apparently the certificate check is only executed once every 24h, so I guess our checks only occurred after a fix was already pushed.


yes, but i also don't have the fix, since i turned off studies. i'm guessing the cert check was blocked. what url does the check attempt to connect to?


The cert check doesn't need to connect to any URL to fail. It's just that the cert only expired 20 hours ago, so this issue is currently affecting only the approximately 83% of users whose daily check time has already passed; over the next 4 hours that will go up to 100% (ignoring users who receive a fix before it breaks).
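The 83% figure follows directly from the once-per-24-hours check: assuming check times are spread uniformly through the day, the affected fraction is simply hours-since-expiry / 24. A quick sketch:

```python
def affected_fraction(hours_since_expiry):
    """Fraction of users whose once-per-24h signature check has already
    run since the cert expired, assuming check times are spread
    uniformly through the day."""
    return min(hours_since_expiry / 24.0, 1.0)

print(round(affected_fraction(20) * 100))  # ~83% at 20 hours in
print(round(affected_fraction(24) * 100))  # 100% after a full day
```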


ah, good point. i didn't think through beforehand how this should have worked.


i passed my `app.update.lastUpdateTime.xpi-signature-verification` time and still had no issues with addons failing, so i dug into it a bit.

turns out i had `xpinstall.signatures.required` set to `false`, which i'd done to get an older extension working in the past (and subsequently forgot to set it back to `true` later). so i don't think my install of firefox has been checking addon signatures for a while now, which is why my addons remained functional (with other security implications of course).

i didn't immediately find where the cert was stored (to check its expiration date) however.


I got hit by the bug just 10 minutes ago.


The check only runs once a day.


It was the same for me. Firefox 66.0.3 on Xubuntu 19.04. I also turn off a lot of phone-home behaviour. And I never had any problems with extensions getting disabled. I have no idea why I wasn't affected.

But frankly, "not being affected" isn't good enough for me. Even if I dodged this bullet I might not dodge the next. I'm looking for an alternative browser. Falkon has been interesting so far. It's a little bare-bones in many ways, but at least it's immune to any future Mozilla screwups.


One screwup is enough for you to jump ship? Or have there been others? (I'm genuinely curious, not trying to be aggressive.)

I personally don't think all this talk of "Bye-bye firefox" is quite fair (I'm referring to a number of comments on this page, not yours specifically). In my opinion (you may disagree), Mozilla is one of our best and strongest allies in the fight for a fair internet. Their values matter because they still have the user base to back them up. If all of us technical-minded folk jump ship to smaller, boutique browsers at every bug and gaffe, leading to a sort of browser balkanization, then Firefox loses its strength and those smaller browsers aren't impactful enough to be able to resist Chrome/Chromium.

But, maybe your choice of browser is purely utilitarian, and that's totally fine of course.


Oh, it's not just this screwup. I've had growing misgivings about the way Mozilla has handled Firefox's path for years now. Most of my grievances have to do with refusing to provide tools to enable users to kill off idiotic and/or evil decisions on the part of website developers. For example:

No ability to restrict websites' overrides of keyboard shortcuts as long as Javascript is enabled.

In a similar vein, the scourge of scrolljacking. Every time some web developer thinks that my scrolling down a few notches with my mouse wheel equals "that user wants to scroll precisely one whole page down in slow motion!", my blood pressure spikes.

The Mr. Robot thing. (I came very close to abandoning ship after THAT one.)

The unsettling creepy nature of Pocket, Snippets, and studies.

The stubborn refusal to put easy-to-use media autoplay controls in the normal preferences.

This whole "killing off almost everyone's extensions" debacle.

And now this Normandy thing that's just been publicized, which allows Mozilla to quietly override user preferences. Even if they have the best of intentions in its use, can they be trusted to competently and wisely wield that power?

I just don't trust Mozilla's intentions or competence anymore. So I'm jumping ship. And frankly, I'm starting to develop a real dislike of the web in general. I used to regard Mozilla as the group that provided a great way to access all the cool sites built by talented developers. More and more I'm starting to see Firefox as a necessary evil alternative to Chrome, and the average web developer as a soulless cog in the wheel of the "fuck your privacy, user, we've got advertising we need to ram down your throat and personal data to slurp up en masse!" advertising industry.


It's not one screw-up. I love the sync functionality, and Rust/Servo is great. But on the other side it feels like 'they' constantly take power-user functionality away while the shallow and shiny homepage claims: "More power to you". But maybe I'm just still sour because of the Eich ousting?


Patching security bugs using Shield is utterly stupid.

What would happen if they found a Zero Day - would they use the same method?


If anyone is running firefox-esr on debian (I am), there is a separate discussion happening here: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=928415



Funny, I only just noticed my extensions were gone because of advertisements on youtube.

I just assumed I somehow messed up my browser and started looking around the settings.

A banner displaying why they silently updated FF and disabled the addons would have been nice as well :)


Isn’t the issue that they’ve forgotten to renew an SSL certificate? Why not just renew it?


It’s not an SSL certificate. It’s a certificate that’s used by the browser to validate the signature of an installed extension. It’s baked into the release.


It seems my mistrust in Mozilla is not misplaced. Every time, I install Firefox, I open about:config and search for all http[s] / [s]ftp links and set them all to random strings. My installed extensions still work.


Huh, doing this re-enabled my add-ons but nothing is listed under about:studies.


Good to see Mozilla thinking out of the box and getting a fix out as quick as possible. It would take a lot of time to get an emergency build out of standard channels, because there are so many of them.


> Please note: The fix does not apply to Firefox ESR or Firefox for Android.

Damn it, Firefox, because it supports extensions, i.e. uBlock Origin, is the only usable browser on Android :(


I'm shocked and surprised to find out that Mozilla is using EXPIRING certificates for this. It requires them to continuously take action to prevent all add-ons from breaking, which will eventually fail (like it did).

Firefox has a pretty robust update system and everyone is used to frequent updates. Why don't they instead have a revocation system built into updates? That way they would have to take action to disable malicious addons, and the good ones could go on working forever.

Is there something about this idea that is so much worse than what happened today?


You are right. It doesn't make any sense to use certificates for this kind of stuff.

If an extension turns out to be malicious, you simply deactivate it in the store, and then proactively deactivate the existing installs. This is how Chrome is doing it.

But having a certificate does offer Mozilla the feeling of absolute control, which seems to be of primary importance for them nowadays.

This is probably the reason release and beta users are not even allowed to deactivate signing in the about:config settings.


I got the update and containers are working again but, it seems like it forgot my old container configuration and I had to set them all up again.


about:config

xpinstall.signatures.required => false

Yes, this will void your warranty.


So, I left Chrome for all the b's they were doing with/to the web. Now Mozilla is fcking it up, too. Which browser to choose now?


IE6.

No, I'm only half-serious. ;-)

Remember when the Web was mostly about sharing information, browsers didn't silently auto-update nor break in the process of doing so, organisations didn't add invasive "telemetry" to everything, and things would mostly stay working because the pace of change was generally much slower?

Now that the "keep pushing it forward and breaking things" trendchasers seem to have gotten their way, instead we have the constant churn of web development, increasingly bloated sites and JS annoyances, browsers becoming more complex and fragile than OSs, dumbed-down UIs and taking control away from the user --- yes, that includes Mozilla who got to where they are today for their "user freedom respecting" position, and the repulsively ignorant "newer is always better" mentality that's infected even search engines like Google.

Maybe I'm just being overly nostalgic, but incidents like these really put things into perspective.


I understand what you mean and I think I feel quite the same.

"The web" has just become so... "strange" in the way everything works and how we take care of it, or however you'd like to call it. Often things are broken with full intention, just to push some new shiny technology on us. And I'd really like it if it wasn't that way.


I don't like the fact that I can't run an extension that wasn't signed by Mozilla. This behaviour lacks freedom.


I am a bit salty that this reset my default search to Google

of course, just an option away. Still :<


I am also extremely pissed.

Still, I was searching for alternatives on Android. It's ridiculous that no other browser allows extensions, not even Chrome itself.

Firefox is way more advanced on this. That's one more reason that is very hard to say goodbye to them...


Would disabling automatic updates have prevented this entire issue?


Don't think so, from what I understand, the problem was the intermediate certificate expired, it would have expired regardless if there were no automatic updates.


Even on completely isolated distributions like Tor Browser or enterprise ESR installs. The only way you avoided this is if you were running Nightly or Developer or the normal one on Linux, and you disabled signature checks.


Yup. I'm on macOS and Nightly and still got hit with the issue (luckily the fix was already out).

Guess we'll see a post-mortem soon and get to know how this even came to be.


Am I the only one whose addons weren't affected at all?


Can anybody confirm that Mozilla is scrubbing replies on the linked page that mention about:config and toggling "xpinstall.signatures.required"? I find it suspicious that no replies there mention it.


I found 5 replies mentioning it, the earliest on page 3. It's more likely it wasn't mentioned as often as it only works for the minority of the install base:

"The Nightly and Developer Edition versions of Firefox have a preference to disable signature enforcement. There are also be special unbranded versions of Release and Beta that have this preference, so that add-on developers can work on their add-ons without having to sign every build. To disable signature checks, you will need to set the xpinstall.signatures.required preference to "false"."

https://wiki.mozilla.org/Add-ons/Extension_Signing


ESR release in Debian also has this option available. I think the option is generally available with Firefox installed from any Linux distro.


I asked this in the other thread but I guess there's too many comments there: Is there a project for Firefox that is analogous to Chromium for Chrome? I need a Firefox build with all the Mozilla shit ripped out. I don't trust the org that decided their certificate expiration was more important than giving users the choice to run what they want.


Librefox isn't exactly what you described, but it's close. It's a set of configs that disable a bunch of telemetry and other unauthorized mothership connectivity and settings pushing.

https://github.com/intika/Librefox

However, Librefox is only Firefox with some configuration changes. It is not a whole new build, and it wouldn't have protected you from this problem since the problematic addon cert checking is still there.

Note that this would have happened even if the browser never communicated back home - this problem was triggered via an unwitting time bomb of sorts, not because Mozilla actively took an action that inadvertently broke something.


It seems like Mozilla distributes a special version that allows this. https://wiki.mozilla.org/Add-ons/Extension_Signing

> The Nightly and Developer Edition versions of Firefox have a preference to disable signature enforcement. There are also be special unbranded versions of Release and Beta that have this preference, so that add-on developers can work on their add-ons without having to sign every build. To disable signature checks, you will need to set the xpinstall.signatures.required preference to "false".

Otherwise I guess there’s also IceWeasel if you’re on Linux?


The unbranded builds are useless for anything but internal addon testing on stable because they do not receive updates.

Custom firefox builds offered by some linux distros are a better choice, yes.


Waterfox?


Upvoting because this is a really good question, why the hell can't I get a warning and click a button to enable the thing anyway?


Is Firefox Developer Edition so bad?


Best chrome add there ever was.


Mozilla has been on a downward spiral over the last several years. They took something (i.e. Firefox) that wasn't broken and "fixed" it until it was, first by killing off XPCOM and then suffering through the misadventures of such bastard products as Firefox OS. The folks at Mozilla should really stick to what they're good at and focus on an all-around open source browser that people will actually WANT to use.


Mozilla killed XPCOM because it was actively preventing improving Firefox. In particular, Firefox could finally go multiprocess, and other improvements of the Quantum project are slowly being incorporated.


A lot of people don't view that (or other changes) as improvement. I personally hate multi-process browsers because they eat RAM like nothing else. In my browser of choice I normally run 200-500 active tabs and I stay under ~3 GB of ram usage. With a multi-process browser that'd be impossible.


I don't know if you've forgotten, but under that old use case (hundreds of tabs in one process), Firefox would bog down further and further as background tabs stole more and more main thread time, eventually only a restart of the browser would restore it to usability.

In 2019, I still use hundreds of tabs - and Firefox handles it with grace. RAM is there to be used, and this is the perfect use for it.


That doesn’t make money which kinda is required to work on Firefox.


I thought they made money by sending searches to Google. A browser that people want to use means more searches and so more money to keep working on the browser. Childish stuff like TV show references, and this certificate issue, means less users, less searches, less money.


They've been attempting to diversify their income for ages. A lot of people have issues with Firefox being substantially reliant on Google.


[flagged]


> Eschew flamebait. Don't introduce flamewar topics unless you have something genuinely new to say. Avoid unrelated controversies and generic tangents.

https://news.ycombinator.com/newsguidelines.html


Not this shit again.

It was a private email service. The complaint was that the private email service also happens to be used by Antifa members. Which is unsurprising.

This is like complaining that they shouldn't give money to the Tor project because it gets used by unsavory people, too.


Anti-fascist fascists?


[flagged]


First, take a deep breath and calm down. No one category of addon is more affected by this than any other, and Mozilla has no way of 'fixing ad blockers' before anything else. Once the signing certificate is replaced, all addons will work.

Simply turn on studies and allow it to install the new cert, and you'll be on your way.


Firefox has an interesting backdoor...


They also have a frontdoor: built-in automatic updates.


[flagged]


I think you're overstating it.

>It is astoundingly disingenuous to act like these things are comparable.

Why aren't they comparable? In both cases, it's Mozilla pushing code to the end user. There's a different process behind both but calling one a frontdoor and one a backdoor seems apt to me.

>and which can make such large scale errors as evicting all extensions

Normandy was not used to disable all extensions. It was caused by a certificate expiration error completely independently. Normandy is being used instead to work around the error until a more permanent fix can be issued.


> “Why aren't they comparable?”

Because regular auto-updates are easy to understand and turn on/off. Normandy is clearly extremely hard for many users to understand, enabled by default, and hard to disable.


>Your Firefox extensions are all disabled? That's a bug!

Is this supposed to affect everything installed? I'm running Firefox 56.0.2 and this only affected addons which I had already disabled, all the other addons are fine... still, am I forced to update to fix this bullshit?


I think Firefox needs to stop this add-on signing and review madness. The web is OPEN. It's not a walled-garden Apple App Store. Yes, extensions run arbitrary JavaScript code. So does any webpage you go to, and nobody from Mozilla reviewed all that JavaScript either. How are extensions any different? Chrome is doing just fine without all this non-sense process and policy.


Extensions have dramatically more access to powerful APIs to affect the browser. They can be used to perform a great variety of annoying, intrusive or downright malicious actions which a website is incapable of. Of course they have much more stringent policy.

Chrome implements mandatory addon signatures as well, and only Google can sign them.


I doubt Google has certificates that break things automatically when they run out. Rather, the better approach is to include a trusted timestamp with each signature, and not invalidate already-made signatures retroactively when the certificate expires.


All signing certificates expire. They must. It's a fundamental part of the security model, because otherwise a malicious actor could take an old, compromised cert and inject it into Firefox, allowing them to run malicious 'signed' add-ons. This attack would work the exact same way in Chrome, so Chrome will expire its certificates too.
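The trade-off the two comments above describe can be made concrete. This is a toy comparison of the two validation policies, not Firefox's or Chrome's actual implementation, and the dates are illustrative; real trusted timestamping (e.g. RFC 3161) also requires the timestamp itself to be countersigned by a trusted authority:

```python
from datetime import datetime

def valid_now(cert_not_after, now):
    """Policy A (what bit Firefox here): a signature is trusted only
    while the signing cert is unexpired, so every add-on signed by the
    cert breaks at once when the cert expires."""
    return now <= cert_not_after

def valid_at_signing(cert_not_after, signed_at, now):
    """Policy B (trusted timestamping): trust a countersigned record of
    *when* the signature was made; the add-on stays valid as long as the
    cert was unexpired at signing time, even after the cert expires."""
    return signed_at <= cert_not_after

expiry = datetime(2019, 5, 4)   # illustrative cert expiry
signed = datetime(2018, 11, 1)  # add-on signed while cert was valid
today = datetime(2019, 5, 5)    # checked one day after expiry

print(valid_now(expiry, today))                 # False: mass breakage
print(valid_at_signing(expiry, signed, today))  # True: add-on survives
```

Under policy B, a compromised old cert still can't be used to sign *new* malware with a post-expiry date, which is the attack the expiry is meant to prevent.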


Thanks for the explanation. Can you provide me with a link to learn more about how google manages extension certs? I am interested in learning how their system differs from Firefox, if at all.


> All signing certificates expire. They must. It's a fundamental part of the security model

Is it really necessary? Aren't there other possible ways of invalidating a certificate other than its date?


Mobile's still broken. Which is a browser Mozilla has apparently abandoned the userbase of in favor of focusing all their mobile effort on some silly pointless separate browser that caters to millennials more somehow, or some other nonsense, instead of just working on the browser they already have. Really fed up with Mozilla at this point and considering switching to a fork or something.


Servo was a silly pointless separate browser once


I'm not sure I would call it "pointless", they've merged and are merging a lot of the code they used servo to experiment with into firefox.

Of everything mozilla has done recently, Servo is one of the things I'm most positive about.


I think that was the point they were making.


I'm sure your parent comment was being sarcastic. The quotes were missing.


You're probably right, oops.


I still hold that the multithreaded performance benefit was nowhere near worth wiping away so many hours of developer time and ripping so many good extensions out of users' hands with no replacement for so much of the lost functionality.


As a regular user, it was worth it. Firefox was honestly pretty shit (at least on Windows) pre-e10s. Before, I had to kill Firefox every few days because CPU usage would climb for no reason. Since then, the only restarts I do are for updates.

As a browser, it works much better, and as an extension developer as well, I'm glad I can write one extension that works in most browsers now.

Yeah, it sucks they removed the level of customization they used to have, but overall the changes are welcome.

That said, this whole thing is a huge weakness to me: having some organization decide what I can put onto my own computer is a frustrating tradeoff, but now it's gone from a nuisance to a real fucking problem.

I'm not an imbecile. I can manage my own browser plugins. Give me a switch to install XPIs without having some authority sign them, or at least verify once on install and piss off after that. I don't like my devices phoning home every 30s to make sure I'm "safe."


xpinstall.signatures.required in about:config is the switch you are asking for. Though, it does not allow one time checks at install, but a choice of regular checks, or no checks at all.


That switch is ignored everywhere but dev/nightly builds and on Linux builds.


It's not just performance that was the issue. XPCOM gives you access to basically the entirety of the browser internals. There's no designed API, the implementation is the API. Which means that you can't change the browser internals, unless you're willing to break addons, or you perform an audit to figure out which plugins will be broken and you get them to update first.

Basically all refactors took months and months and months because of this. There was no way to address the accumulating technical debt.


> Mobile's still broken.

The hotfix xpi link mentioned elsewhere in this thread works on mobile.



