I think this whole thing is a bad idea. They won't allow a scary opt-out button because some software could turn it on, won't allow an about:config preference because it could be flipped, and won't allow a build flag in stable because malware could flip it and rebuild. By that logic, couldn't someone reskin and redistribute the alpha or the 'unbranded' build with malware bundled in? Firefox is so afraid of malware, yet it encourages users to store their passwords in plain text in the browser?
It really feels like a landgrab and not anything anyone has been asking for. Firefox has to compete with the Chrome web store, but you can't burn your house down to save a room. What's left if Firefox becomes some centralized market akin to iOS, Windows Store, Play etc. Forced AMO signing flies in the face of the decentralized web. "Malware happened" and "Chrome does it" aren't really answers.
I think getting backed into a corner has made Moz take risks with their ideology. Hopefully they give developers and users a lot more time to deal with the upcoming changes.
> It really feels like a landgrab and not anything anyone has been asking for.
Benevolent DRM. Mozilla wants to control the software, even though users want to be in control. Plugins are the reason people always give for using Firefox. Justifying control by citing non-technical users is telling: users are to be managed and controlled; users aren't "us" but the other.
I understand why they are going walled garden. Non-technical users cannot be trusted to control their browser. But Mozilla should know if they do this then I, a technical user, won't use their browser.
I am not an extension developer, but it's a rare month when I don't find myself popping open an .xpi to change the JS and HTML, for personal aesthetics or for bugs only I have. Temporarily loading extensions is not a solution. They say they'll be producing an "unbranded", non-walled-garden version for technical users, but no such build has been revealed yet. And even if they do make it, the unbranded version won't be in my OS's repos.
This walled garden, the addition of Adobe EME/DRM, the bundling of "Pocket", and the coming removal of XUL-based extensions are too much.
While I would agree that these changes are unfortunate for this specific use case, I do believe there are at least two workarounds for you:
1) Creating an AMO account and running the command-line "jpm sign" tool yourself (see the example after this list). This requires a bit of overhead for each new add-on you want to modify, but the actual signing of unlisted add-ons (which is entirely automated) has been fast and mostly painless in my experience.
2) Using Firefox Aurora/Developer Edition as your main browser and relying on its automatic update mechanism.
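For option 1, the invocation is short; something like this (run from the add-on's source directory; the flags are jpm's documented ones, and the credential values are placeholders you generate on AMO's API-key page):

    # uploads the package, waits for the automated review,
    # and downloads the signed .xpi next to your sources
    jpm sign --api-key $AMO_JWT_ISSUER --api-secret $AMO_JWT_SECRET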
(I do not work for Mozilla so this is speculation):
As far as I can tell, side-loaded extensions require a full, non-automated review (http://i.imgur.com/r070Grv.png), and it wouldn't surprise me if side-loaded extensions were the worst offenders.
Upon discovering a malicious extension, Mozilla could look through all extensions they've signed and blacklist (https://addons.mozilla.org/en-US/firefox/blocked/) every extension with a similar signature, much as some anti-malware databases work.
There is probably not all that much stopping you from writing a malicious extension that passes the AMO automated review (example: https://addons.mozilla.org/en-US/firefox/blocked/i1058), but the cost for malware writers is going to be significantly higher since it will be far easier for Mozilla to shut them down via their blacklists.
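To make the blacklist idea concrete, here is a minimal sketch of my own (not Mozilla's code; exact-digest matching stands in for the fuzzier "similar signature" matching that real anti-malware databases use):

    // TypeScript (Node): block .xpi packages whose digest is on a blocklist
    import { createHash } from "crypto";
    import { readFileSync } from "fs";

    // SHA-256 digests of known-bad packages (placeholder entry)
    const blocklist = new Set<string>([
      "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    ]);

    function isBlocked(xpiPath: string): boolean {
      const digest = createHash("sha256")
        .update(readFileSync(xpiPath))
        .digest("hex");
      return blocklist.has(digest);
    }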
I have used it more or less exclusively for a couple of months, and it works nicely everywhere except on some pages whose JS developers manage to show "your browser is not supported" to anything but Chrome.
Excuse my ignorance, but is this official? I use FF Dev edition as my primary browser and it is very stable for me.
Context, if it helps: I am a web dev most of the time, and I almost always have >100 tabs or so open. More than half of those have the devtools open all the time (each of which basically counts as another instance of Firefox). My machine is pretty modest (i5, dual-core, 8 GB RAM). I don't use many extensions, though: only a lightweight custom theme and Greasemonkey plus lots of custom userscripts.
Reason I use dev edition is because it has more devtools goodies and ES6 features. Nothing related to extension development.
Aurora is alpha, but FWIW, I've been using Nightly as my main browser for over 4 years now without ever losing data, so Aurora should be more than fine.
Yes. When Firefox first announced their plans for the walled garden back near FF37/38 I did as everyone said and tried Aurora.
It crashed on me twice in as many hours. I'm pretty hard on browsers since I typically have 200+ tabs loaded. That never happens with final release versions.
"Chrome did it" is not an excuse. I was pointing out that Chrome is even more restrictive. Google requires developers to distribute their add-ons through Google's store of approved add-ons.
That isn't true; you can install developer add-ons without going through Google (otherwise, how could you develop them?). You might get an annoying pop-up, though.
There are good technical reasons for requiring signed add-ons. Well, maybe not so much "good" but necessary because of other bad things in Firefox that prevent a less extreme requirement from being implemented.
But the signing requirement isn't what upsets anyone. It's that add-ons must be signed _only by Mozilla_. The whole mess could have been avoided from the start by saying that add-ons must be signed by a trusted certificate but the end-user gets to choose what certificates are trusted.
I don't think that would work very well. After all, the list of certificates would have to be a preference, so crapware could just stick its own certificate into the preferences file before installing its addon. Of course, baking the requirement into the binary isn't perfect either, since the crapware can just patch or replace it if it has sufficient access rights, but I'd say it 'feels' drastic in a way changing preferences doesn't (which matters if the crapware is trying to be semi-legitimate), and it requires somewhat more work to come up with a suitable binary and keep it up to date.
Easy: just provide a certificate that cannot be changed within FF (must be modified by changing a file in the installation directory). No preference needed.
This would allow developers to roll their own, and organizations to apply their own policy as well (which could mean that the privileges of the user himself aren't sufficient to change the cert).
There's really no reason why this couldn't be implemented properly. There's also no reason not to include a switch for it, really. This is all just smoke: most users have the privileges to change the FF binary, meaning that in most contexts malware does too (either through social manipulation or through exploits).
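To sketch what I mean (purely illustrative: the path, file name, and detached-signature layout are my own assumptions, not anything Mozilla has specified):

    // TypeScript (Node): trust anchor lives in the installation
    // directory, not in the profile or preferences
    import { createVerify } from "crypto";
    import { readFileSync } from "fs";
    import { join } from "path";

    const INSTALL_DIR = "/usr/lib/firefox"; // placeholder install path

    function addonIsTrusted(xpiPath: string, sigPath: string): boolean {
      // Swapping this key requires write access to the install dir,
      // which ordinary crapware running as the user shouldn't have.
      const trustAnchor = readFileSync(join(INSTALL_DIR, "addon-trust.pem"), "utf8");
      const verifier = createVerify("RSA-SHA256");
      verifier.update(readFileSync(xpiPath));
      return verifier.verify(trustAnchor, readFileSync(sigPath));
    }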
Because a third party app on Windows can edit the Firefox preferences file to set the opt-out preference and then install its unsigned malware/spying/ad-injecting extension and it'll be loaded the next time Firefox starts up with no warning to the user.
This is why Chrome hashes its settings files on Windows so that when any 3rd party app tries to mess with it, it wipes all extensions and extension settings and resets the homepage and search engine to the defaults. Unfortunately this also means that you can't move your Chrome settings to another PC as they'll all get reset. You have to sync to Google to ensure all your settings aren't wiped by a badware app or a corrupt byte in the Chrome settings file.
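For illustration, the general shape of such a check is simple; a minimal sketch assuming a per-install secret (Chrome's actual scheme derives its seed differently and is more involved):

    // TypeScript (Node): tamper-evident settings via an HMAC
    import { createHmac, timingSafeEqual } from "crypto";

    const INSTALL_SEED = "per-install-secret"; // assumption: unique per machine

    function macOf(settingsJson: string): Buffer {
      return createHmac("sha256", INSTALL_SEED).update(settingsJson).digest();
    }

    function loadSettings(raw: string, storedMacHex: string): object {
      const stored = Buffer.from(storedMacHex, "hex");
      const fresh = macOf(raw);
      if (stored.length !== fresh.length || !timingSafeEqual(stored, fresh)) {
        return {}; // MAC mismatch: treat as tampered and reset to defaults
      }
      return JSON.parse(raw);
    }

Because the seed is machine-specific, a settings file copied to another PC fails the check too, which is exactly why the settings don't survive a move.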
The security of the browser is conditional on the security of the platform in the first place, so this does not make sense. Anything that can edit the settings of the browser without its knowledge can interfere with the browser and other software in other bad ways.
It especially does not make sense for users who actually do have a reasonably secure platform, and these "security features" are then purely an annoyance.
Here's the thing, though. They work. It's far easier for a low- to mid-tier bundleware company to build a basic browser extension that inserts ads or takes over the homepage/search engine using basic off-the-shelf components than it is for them to install a system-level networking component that intercepts and changes browser networking calls without breaking things. Google Chrome took a dual-pronged approach: disallow all extensions except those in their online store, and disallow third-party changes to browser settings. So bundleware can't easily install into Chrome. It can easily install into Firefox, because you or anyone else on the PC can install whatever you want.
Just because this practice doesn't apply to or benefit you doesn't mean it doesn't apply to and benefit the majority of browser users. Remember, the majority of Firefox users don't even use extensions at all. Closing this hole would increase their security browser-wise.
Ironic note: Google Chrome, while attempting to block bundleware from interfering with its own operation, is one of the most widely distributed bundleware apps. Installers that use dark patterns (tricking users into not noticing they're installing a new default browser), from Oracle's Java to Adobe Flash to Avast to Antivir, are all used to install Chrome onto systems (hopefully) without the user noticing.
Of course, Google Chrome itself is probably less malicious than many of the other bundleware. I wonder how many would have supported Firefox doing the same thing in 2004.
It's synced to the Google servers, but the hashes are within the settings file itself. I'm unsure if the code is public, though, as it is not part of Chromium.
The downside is that an unaware user who prefers not to sync their stuff to Google, and who fastidiously backs up all their user settings in Windows, is going to lose everything they have stored in Chrome except their bookmarks and passwords if their computer dies or they try to move to a new one.
Can somebody please explain to me the purpose of this change? I read the blog post explaining the decision[0]. The crux of the argument was:
> "many tens of millions of users have non-hosted add-ons that were installed without their informed consent"
Why go thermonuclear and require add-on signing for everyone? Why not just make the add-on installation screen a little bit scarier? And if the concern is making sure that people are installing what they think they're installing (i.e., not something served by a man in the middle), then maybe just require that add-ons be downloaded from a site with HTTPS.
I just really don't see the point in this extreme choice.
People become 'blind' to scary screens. See: Windows UAC dialogs.
I wish they would clearly indicate what problem it is that they're solving.
---
Edit: after reading the post you linked, it's clear they're fighting against software installers that 'conveniently' install Firefox add-ons.
For example, you download Skype, and it 'helpfully' installs an add-on for Firefox.
---
Their solution might solve the problem, but I think it goes too far.
Is it possible to require the user to approve any add-ons before they become active on the user's Firefox install, even if they came from a third-party source?
Skype's "Click to Call" add-on is a good (bad?) example of the harm caused by "side-loaded" add-ons. It causes Firefox to crash, hang, or take multiple seconds to respond to mouse clicks. The add-on is quietly injected into Firefox by Skype's application installer and can't be disabled or uninstalled from Firefox's Add-on Manager. Users must uninstall it from the Windows' "Add/Remove Programs" Control Panel.
Android does it best: you can install non-Play-Store stuff, but you have to go into a scary menu and fiddle with settings. Better than an "are you sure?" popup.
Because all apps are sandboxed on Android, it's hard for an app to fiddle with the settings. But desktop applications can fiddle with Firefox settings, so I can see why Mozilla don't want to allow even a setting.
I assumed the timer was there to prevent clicking Install by accident.
Popup dialogs may appear right "above" something else you were clicking on, or while you're typing and about to hit Enter or Escape, and the dialog then eats the input and vanishes.
Putting a dialog on cooldown like that would be interesting UX behavior: if a dialog receives input < 1 sec after creation, it discards the input and puts itself on a cooldown timer, like that install dialog. Or you could just change the context and avoid popups altogether.
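A quick sketch of the idea (my own illustration, not any browser's actual code):

    // TypeScript (browser): swallow input that arrives too soon after a
    // dialog appears, restarting the cooldown on each early attempt
    function armDialog(dialog: HTMLElement, cooldownMs = 1000): void {
      let readyAt = Date.now() + cooldownMs;
      const guard = (e: Event) => {
        if (Date.now() < readyAt) {
          e.preventDefault();
          e.stopPropagation();               // discard the early input...
          readyAt = Date.now() + cooldownMs; // ...and restart the cooldown
        }
      };
      for (const type of ["click", "keydown"]) {
        dialog.addEventListener(type, guard, { capture: true });
      }
    }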
PS: I'd mark my post Off-Topic/Digression/UX if HN had that option.
Add-on signing that can be automated provides very little added security for the user, and it just serves as a big pain in the butt for everyone else involved.
I'm routinely asked to do support for users, and I regularly find malware extensions installed in Chrome, which has even stricter requirements. The other day I found a laptop with an extension that redirected google.com to a scraper. Chrome even included a drop-down warning just below the URL bar that the homepage was currently being redirected by an add-on. Did the user notice? Not a bit.
In fact, Chrome adding a WARNING for it made me even sadder: they know this is going on. Is FF going to do the same? If I were a user with an extension that did that for whatever reason I wanted, I would be furious as hell to see an added warning that I need to disable (IF possible) and that will be ignored by most users anyway.
Malware developers will just churn out new add-ons faster, so Mozilla will not be able to keep up with the list of add-ons to blacklist. At the same time, initial approvals for legitimate add-ons will get slower. Reporting issues with add-ons is clearly insufficient, as the Chrome Web Store demonstrates. In fact, I'm also NOT OK with the idea that extensions can be blacklisted at all, in spite of the "added security" aura around it. By the same logic, if extensions are required to be signed, then blacklisting (and blacklist updates) shouldn't be allowed to be disabled either.
And, let's remind ourselves, most of the extensions I've seen were side-loaded with other software (with "extras") installed on the system with the same privileges as the browser. So I really do expect malware to simply patch the FF binary to either disable the check or change the public key, or to replace the binary entirely with a patched version.
I do not support walled gardens of any kind.
Use a fork.
They have the EME-free build for people who don't want the DRM stuff, so why can't they offer a build that allows unsigned extensions in a similar way?
Yes, it gets messy, but they are the ones making it so. I don't want to use a non-release (i.e., possibly buggy) version just so I can sideload an unsigned extension.
Just offer a non-front-page build in the same way they do for EME and let's move on to more important things.
Seriously, Mozilla wastes so much time and energy discussing all this stuff when it is patently obvious what the right thing to do is.
The goal should be to make the Firefox brand maximally safe for the average user.
A recognizable name is important for non-technical users. Mozilla should make every effort to keep browsing safe for anyone who uses browsers originating from mozilla.org, even for average developers. There should be no way to trick people into following instructions that disable add-on signing if they downloaded and use something they know by the name "Firefox".
What we want is a separate, no-brand Firefox build that follows Firefox closely. It should work exactly like Firefox, including updates (except where it explicitly deviates), but have no name recognition or easy association with Firefox/mozilla.org. In theory it could be an automatic build from the Firefox development team, as long as it's impossible to download it from mozilla.org and associate it with the same site as Firefox unless you know what you are doing. Any bug or security issue arising from the deviant version should not have Firefox in the news headline.
For the same reason Facebook puts a big loud warning in the developer console. People will follow any instructions they're given. "Press ctrl+shift+I and paste this in the box and you'll get a free puppy" "Put this in your address bar and your crush will be revealed" "Go to about:config and double click this thing, and then click this link and we'll show you nearby singles that want to hook up"
Firefox add-ons essentially have full, unrestricted access to your computer. Locking this down good and well is pretty important.
Non-technical users do not use sudo, but they do use a web browser. Do you think Facebook added this JavaScript console warning for no reason at all?
(Here the console prints a large ASCII-art "Stop!" banner, followed by:) "This is a browser feature intended for developers. If someone told you to copy-paste something here to enable a Facebook feature or "hack" someone's account, it is a scam and will give them access to your Facebook account."
I think this is unnecessary, especially in the land of FLOSS licensed software where the developer disclaims any and all warranties.
Developers should focus on usability, and not on idiot-proofing software.
There is no way to guard against users installing malware themselves. No matter what kind of safeguards, checksumming, and signing you use for your application, once a program has full access to a machine it can do anything, including bypassing your safeties.
You can't fight user stupidity, and in trying to do so, developers do a disservice to their regular users. (The way Chrome prevents this issue is a prime example, because the app is no longer portable.) No matter what kind of padding you add, stupid users will still manage to hurt themselves in the most unexpected and unimaginable ways.
I really despise this trend, started in the US and the rest of the western world, where idiots sue companies for the effects of their own idiocy, and the result is all kinds of redundant warnings on products that serve only to guard the manufacturer from stupid lawsuits.
We should not strive so much to go against natural selection. Darwin awards exist for a reason.
Firefox actually has additional protection against such attacks. Minor annoyance for developers (who may not even hit it if they use the console regularly), but helps mitigate such attacks quite a bit.
> Non-technical users do not use sudo, but they do use a web browser.
Your casual casting of a swath of the population as "non-technical" notwithstanding, the point is still sound: why do you think that it's worth gutting this feature as a safeguard against someone being fooled into navigating to "about:config" but not worth removing sudo for the same reason?
If someone can be persuaded to abuse "about:config", why not sudo?
90% of web users are on Windows, where there is no sudo. Malicious add-ons make money by injecting ads, overriding default search engine settings, capturing login credentials or even local files, or installing zombie spam relays. sudo is unnecessary for these attacks. How does one make money with sudo?
And as for locking down sudo, OS X is now "rootless" (System Integrity Protection) by default, preventing even sudo access from modifying some system settings.
> 90% of web users are on Windows, where there is no sudo.
This argument is becoming increasingly specious.
Firefox is the default browser on Ubuntu, where there is sudo. So do you acknowledge that it is consistent to keep this preference in at least the Linux version of FF?
Oh you're right! We need EME'd web assembly so Facebook can hide everything behind a proprietary binary blob. THEN the user will really be free from themselves and their own stupidity. \s
If software could reliably know when the setting was changed, it could warn the user: hey, isn't it convenient that this check was disabled just before this add-on was about to be installed?
Perhaps a three-strikes approach: if you disabled the warning, you'd still have to suffer through it three more times. You'd need a reliable way to store that state to avoid tampering.
> You'd need a reliable way to store the state to avoid tampering.
It's the same problem. If the user can set it, then it can also be set by malware running at the same privilege as the user.
The only solution is to move the check outside of this privilege level (i.e. the readonly `.text` section of the firefox binary in /usr/ or Program Files). That's what the signing requirement does.
But if malware is running at the same privilege as the user, I have all sorts of more serious problems. Why is this worth stripping me of sovereignty over my browser?
I switched to the ESR channel (38.5.2), but I don't know what I'll do when a newer ESR moves to a broken release. Too many extensions stop working, and XUL extensions are the essence of Firefox. I tried using Chrome, but so much of the UX is different or incomplete to a Firefox user that it's impossible to feel at home.