Unfortunately, as long as Firefox's security infrastructure doesn't match Chrome's, I can't use it. Note that errors like use-after-free and buffer overflows are marked as Critical for Firefox (run arbitrary code on the computer), but only as High for Chrome (execute code in the context of, or otherwise impersonate, other origins) because of sandboxing.
Firefox sandboxing does not use Rust. Servo sandboxing (work in progress) does.
It's not accurate to say that Firefox has no sandbox. Sandboxing work has been rolling out for many months now, and later versions have increasingly restrictive sandboxes. You can see the current status per version here: https://wiki.mozilla.org/Security/Sandbox#Current_Status
Hardening the sandbox is a constant work in progress, but it's not as simple as "compromise of browser engine automatically means arbitrary code execution as running user" any longer.
Sorry, I didn't mean that sandboxing uses Rust; that was a separate point (Rust is less susceptible to this class of errors, so the more code written in Rust the better).
I also didn't mean it has no sandboxing (I knew about the efforts); my point was that its security doesn't match Chrome's yet, as seen from the number and type of critical errors.
When it gets there I will definitely use it alongside Chrome.
In case it wasn't clear, it is now the case that a content process compromise is no longer critical unless combined with a sandbox escape.
Looking at historical vulnerability reports can be misleading in this regard, as the sandboxing features are rather recent (pre-stable-version in some cases).
In case I wasn't clear as well: as far as I can see from the link you gave, it still doesn't match the level of security Chrome offers. It's getting there, but not yet.
As Mr. Patrick Walton over there said, Mozilla is still working out the exact, minimal file-system permissions. So yes, in that respect it's still not quite on the same level as Chrome.
In terms of security architecture, there is to my knowledge only one big difference left, and you can change that if you want.
The difference is that Chrome will spawn a new process for every new tab (unless the webpage in it is from the same domain as another tab). Firefox instead will always round-robin the tabs across a fixed number of processes to achieve lower RAM usage and, as a result of that, somewhat better performance.
But you can tell Firefox to round-robin across up to 1000 processes or what have you, so that it then does spawn a new process for every new tab (and therefore sandboxes each tab individually).
To do so, go into about:config and set "dom.ipc.processCount" to a high number, like 1000.
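If you'd rather persist that than flip it by hand each time, here is a minimal sketch, assuming the standard user.js preference mechanism in the Firefox profile directory (it sets the same pref as about:config):

    // user.js in the Firefox profile directory, read at startup.
    // Round-robin content across up to 1000 processes, which in
    // practice gives nearly every tab its own sandboxed process.
    user_pref("dom.ipc.processCount", 1000);

Either route sets the same preference; about:config just edits it live.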
I guess as long as it is not the default behavior it will stay a weakness then? I am not sure it's worth trading away the memory or performance gains.
Also, is the thread-per-tab approach more susceptible to memory leaks, resource-sharing issues, or OS scheduling shenanigans than the process-per-tab approach? You close the tab, the process is dead, and its resources are mostly guaranteed to be returned. My knowledge of modern browsers is limited; maybe this was considered an acceptable compromise though.
Though the Firefox developers aren't going to skimp on sandboxing (or other exploit mitigation techniques) just because of Rust. Defense-in-depth is the name of the game; Rust just provides a layer of language-level defense that C++ previously didn't offer.
Well, sorry for that. Get a piece of paper and try it. The Drake equation isn't too hard; it is just a bunch of probabilities multiplied by a number of stars. You can even add your own terms if you don't like the ones you have, but generally doing this takes you further from the science...
Anyway, here is a simple description I snagged from a Google search [1]:
> The Drake Equation is:
> N = R * fp * ne * fl * fi * fc * L
>
> where:
> N = The number of broadcasting civilizations.
> R = Average rate of formation of suitable stars (stars/year) in the Milky Way galaxy
> fp = Fraction of stars that form planets
> ne = Average number of habitable planets per star
> fl = Fraction of habitable planets (ne) where life emerges
> fi = Fraction of habitable planets with life where intelligence evolves
> fc = Fraction of planets with intelligent life capable of interstellar communication
> L = Years a civilization remains detectable
Now grab a pen and a cocktail and make the math do whatever you want. I think that if you stick to realistic numbers you will get reasonably high values. But we might just disagree there.
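To make "show your work" concrete, here is a back-of-the-envelope sketch in JavaScript; every input below is an illustrative placeholder of mine, not a sourced estimate:

    // Drake equation: N = R * fp * ne * fl * fi * fc * L
    // All inputs are placeholder guesses; substitute your own.
    const R  = 1.5;   // suitable stars formed per year in the Milky Way
    const fp = 0.9;   // fraction of stars that form planets
    const ne = 0.5;   // habitable planets per planet-bearing star
    const fl = 0.1;   // fraction of habitable planets where life emerges
    const fi = 0.1;   // fraction of those where intelligence evolves
    const fc = 0.1;   // fraction capable of interstellar communication
    const L  = 10000; // years a civilization remains detectable

    const N = R * fp * ne * fl * fi * fc * L;
    console.log(N.toFixed(2)); // "6.75" broadcasting civilizations

Nudging fl, fi, and L across their plausible ranges swings N over many orders of magnitude, which is really the point of the exercise.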
My post had the formatting screwed up, but you didn't even try to show your work. You are just presuming what you think or want to be true without putting even the theoretical work into it.
We should only let people have one release of anything. That way there are no extra man hours spent on anything that some random guy on HN thinks is shitty.
The only way to prove something is not shitty is to release the monopoly and let other players show themselves in fair competition. We'll wait for the cross-platform assembler to be supported on every toaster and see how long JS's 'beauty' holds up.
All those other dynamic scripting languages? They're DOA because nobody's going to download the entire VM every time.
What about all those nice functional languages? Very problematic, because the toolchain relies on LLVM-like semantics, which don't play well with good garbage collectors or functional programming in general.
We then get down to C-like languages, Rust, and more esoteric languages (and promptly discard the esoteric ones for lack of a decent ecosystem).
Who in their right mind wants to write a front-end in C++ or Rust? By the time you get anything done, the web has changed and you're stuck with a pile of dated code that takes too much time and costs too much money to update.
The web had a shot at a decent language with Dart (it was/is even an ECMA standard). It didn't die because of other browsers. It died because of poor web dev adoption rates.
If only Eich had been allowed to implement Scheme, then none of this would have been an issue.
> All those other dynamic scripting languages? They're DOA because nobody's going to download the entire VM every time.
Hosted on a CDN in compact form, it'll be feasible; the application can just link against it, and it does not have to be recompiled on the user's machine.
> Very problematic because the toolchain relies on LLVM-like semantics which don't like good garbage collectors or functional programming in general.
That will indeed be a problem. The design restrictions imposed by the environment (contiguous heap, emulated concurrency) will bite us in the ass hard. Instead of layering a lot of very leaky, performance-killing "security" features over other half-assed features, a better way would be to step back and use a tiny hypervisor. That way you get hardware-accelerated memory safety, but it would require the application to be able to run on a microkernel; recent IncludeOS + ukvm has a boot time of 11 ms. (Great, now I want to write a plugin that embeds qemu in Chromium and uses a virtual device to communicate with the host.)
> Who in their right mind wants to write a front-end in C++ or Rust? By the time you get anything done, the web has changed and you're stuck with a pile of dated code that takes too much time and costs too much money to update.
This has been repeated again and again, but the evidence that you program faster in JS than in Rust (or whatever language you want) is very lacking. At least if you want to use your code for longer than two days, because that's when many of those "I just write it and it will work!" JS programs stop being readable. But even then I have my doubts. Sure, if you've never used a language before you will be slower, but that doesn't make the "JS can be programmed faster" assumption correct.
> Who in their right mind wants to write a front-end in C++ or Rust?
Every AAA Game Dev.
Java is also enabled by WASM, and plenty of large applications are made in it. Things like Google Docs are written in Java and cross-compiled to JavaScript as it is. Someone thought static typing and sane language semantics were worthwhile.
I think you are adding extra steps that are not needed. The programming model requires the JVM, but there are a few tools that compile Java directly to native code. There are tools that compile it to JavaScript, and there are beta-quality tools that compile it to WASM.
I'm not sure. Back when JavaScript performance sucked, there was a great deal less of it, and websites seemed to work fine. You know, back when the text content was in the HTML. Now, with everything rendered client-side, there's more opportunity for things to go wrong. I would estimate I waste more time now.
You appear to be referring to the late 90s then? Maybe?
Because as soon as advertising kicked off, JS usage did too. Even before then, the DHTML movement had successfully moved JS into spaces people didn't know it could occupy on the web.
You’re romanticizing an era I lived and worked through: it wasn’t really like that.
It was exactly like that, man. The dirt-cheap hardware of those days could browse the web easily. The same is not true today. Maybe you worked in specific areas, or were simply tired to death and are now de-romanticizing it, because I clearly remember when things went downhill in general.
Ah, China and AI... Just remembered this nice paper:
Automated Inference on Criminality using Face Images,
https://arxiv.org/pdf/1611.04135v2.pdf
At best they will become more of an Orwellian nightmare than they currently are. World domination? Please...
I suspect that he thought it would be possible to work some change from the inside. Now that the optical disadvantages of being closer to Trump have clearly outweighed the advantages, there is no choice left but to resign.
> But Trump has been bedfellows with white supremacists since day one.
Yes, but there were people who thought that once he became president his true nature would come out and things would be better and that he was merely using these people as a tool to get elected. Now, he probably did use them as a tool to get elected but at the same time I suspect that his real sympathies lie with them rather than with the groups that would like to consider everybody equal before the law.
> All of a sudden there are "optical disadvantages" of being closer to him?
Well, there probably always were disadvantages, but they are becoming rather more pronounced now. People are calling for boycotts of the products of companies standing with Trump, and every action Trump takes that forces them to either go all in or denounce him will cause a few to break off. It surprises me that so many companies are still on board.
The Ars Technica numbers surprise me, given the sheer difference between the Ryzen 7 1800X and the Threadripper 1920X compared with the 1920X and the 1950X; either you've reached more or less peak parallelism by 16 cores, or the larger L3 cache and quad-channel memory make the difference. Or something's changed in the benchmark setup (be it the compiler or whatever) since the 1800X was tested.
Actually, it's a problem with Visual Studio; Ryzen utterly destroys the competition when it comes to compilation. Unfortunately, AnandTech does not test with GCC or Clang.
Firefox: https://www.mozilla.org/en-US/security/known-vulnerabilities...
Chrome: https://chromereleases.googleblog.com/search/label/Stable%20...
Hopefully their sandboxing project, once done (also with Rust), will make Firefox much better in the security department, but until then, no.