Transitioning Firefox's rendering engine from Gecko to Servo (jensimmons.com)
461 points by bpierre on Jan 5, 2017 | 213 comments



I've seen this before:

>> At some point in mid-2017, all new CSS will be built with Quantum plane parts, not Gecko. Going forward, it will be much easier and more enjoyable to implement new CSS properties. Which makes it easier for folks in the open source community to contribute.

If you're not careful this can drag on for years with half the stuff done one way and half the new way - especially once you reach the point where "all new stuff is built the new way". There can come a point where you need to just purge the old stuff. We've seen this in places like GIMP moving to GEGL, several projects moving to GTK3 (GIMP, Inkscape, and others), the purge of Java from LibreOffice, and now this mushy migration to Wayland. Firefox is not the only project migrating to Rust either.

In all these cases I'm actually in favor of the migrations, and many of those have been completed over years. It's just that once a path has been defined I feel like there needs to be a more concerted effort to actually complete the transition sooner rather than later. Let new features take a back seat and get your internals switched over. I know it's a balance but working on a project that's straddling the fence is almost always slower in the long run.

Having said that, Firefox is huge so subsystems will need to switch to Rust one at a time. I'm just saying that once CSS can be done the new way they need to require it to all be done that way, not just new stuff. Perhaps that is the case and it wasn't spelled out clearly in TFA.


A good way to force this along is to have a champion for the work, and an automated burndown metric that random devs can watch and influence.

Mozilla is pretty good at the latter, cf. https://arewefastyet.com


Seems like Safari is doing better than everything else if I'm reading the charts right.


> Why is Safari not tested on 32-bit machine?

>Safari defaults to 64-bit machines and doesn't need to worry about 32-bit anymore. Big pieces of their engine is 64-bit only. As a result showing Safari on 32-bit machines would give incorrect results.

Maybe this has something to do with it?


I don't think so. It's at the bottom for "Sunspider execution time" and "Octane score", but I would expect execution time to be "lower is better" (so Safari is best), but a score to be "higher is better" (so Safari is worst).


On "Octane score" you can see the vertical axis reads 0 at the top and 40000 at the bottom, so it seems it's always "lower is better" across the three graphs, which also means Safari is the fastest browser, if you consider just these metrics. I'm actually surprised...


One big problem I find with these transitions is there always comes a point where something, for a while, has to get a bit worse. Either a bit slower, or some features disappear.

Now I try to ensure we agree up front which issues will be blocking, and how much extra memory / CPU is acceptable, which features have to be kept, etc.


> If you're not careful this can drag on for years with half the stuff done

Exactly. I would suggest focusing on Servo and an HTML-based UI, and taking some ideas from the Vivaldi browser (which has an HTML5 UI). Firefox needs a "reboot", just as Firefox was a leaner reboot of the Mozilla Suite - but, like the Mozilla Suite back then, Firefox has become too bloated.


But... Firefox itself was an example of a different approach to a transition: it started as a re-implementation of a clean, simple browser when Mozilla was getting pretty stale. The same approach could have been taken here... Firefox does feel a bit long in the tooth to me these days...


> Going forward, it will be much easier and more enjoyable to implement new CSS properties. Which makes it easier for folks in the open source community to contribute. Which makes new things come out faster.

I had never really thought of Servo in these terms before, but it makes a lot of sense.

Mozilla, being a nonprofit, can't compete with Google / Apple / Microsoft doing things their way. They inherently have a greater need for free-time contributions.

Given that, if you can make contributions easier and "fun", by choosing a more modern language and toolchain, it could theoretically be a competitive advantage.

Thinking about it like this, the recent Rust marketing articles make a lot of sense. Rust marketing matters because the more Rust developers there are, the more potential Servo (and thus Firefox) developers there are.

Whether it pans out or not remains to be seen, but as a way to attempt to hack a problem, I really like it.


I really love seeing new folks get on board with Servo itself, but there's something else that's equally important: the Cargo ecosystem. Servo is built with Cargo and incorporates a lot of code from the Rust community. Likewise, we try to make our standalone Servo pieces available as Cargo packages on crates.io when we can. We take code from the broader Rust community and give back in turn.

In this way, even folks who have no intention of hacking on Servo can effectively end up contributing to the project. And, by the same token, we also in effect contribute to lots of projects in the Rust community.


Note that most of the recent Rust "marketing" articles have nothing to do with Mozilla and weren't a coordinated effort (https://news.ycombinator.com/item?id=13325196), it just sort of happened.

This specific article is by Mozilla from the Developer Relations team, so you might call it marketing, but there hasn't been such an article from Mozilla for quite a while now.

But yeah, I agree, Rust will make it easier to contribute to Firefox IMO. We have a lot of contributors on Servo.


Mozilla is really just another software company owned by a nonprofit. They have hundreds of millions in revenue and have been "profitable" in many/all years. But they are definitely much smaller than Google/Apple/MS and can't benefit from ecosystem lock-in effects.


> Mozilla is really just another software company owned by a nonprofit

It depends on how one defines "just another software company", but I don't agree with that characterization:

Our mission is to ensure the Internet is a global public resource, open and accessible to all. An Internet that truly puts people first, where individuals can shape their own experience and are empowered, safe and independent.

Much more of the same here: https://www.mozilla.org/en-US/mission/

I don't see Google, Facebook, Microsoft, or other software companies aggressively, creatively pursuing those goals (and achieving them). If Mozilla was just another software company, would there even be an open web today?


Everything's comparative, of course. Mozilla isn't a solo dev's weekend project, but it's also not a top 10 most valuable company in the world, like all of its competitors are.


Also, Mozilla, the way it is setup, does not have the same incentives as its competitors do, all of which are publicly traded firms.


> And Firefox runs on Gecko. Which is old. Like really old. I think it's the oldest rendering engine still in wide use.

Gecko may be the oldest, but not by a wide margin: KHTML (the origin of webkit and blink) was started in 1998, only a year after Gecko. Of course, probably not much (if any?) of the original KHTML source is still present in webkit, and because of the nature of open source development, there is a blurry line between projects.

I actually ran KDE 1.0 as my main DE and used KHTML as my primary browser (falling back to Netscape when a particular site was incompatible). I did immediately appreciate the coherence & elegance of KDE. It lasted for several years before OS X came along and finally won me over. Oh, those were the times.


One could argue that Trident (or EdgeHTML as partial fork) is still in wide use, and this started with IE4 in 1997.


You mean IE3. But the code dates back to Mosaic; IE1 is based on a fork of Mosaic. You can read about it in the credits in the Info dialog of IE up to IE6 (and even all later IE and Edge contain anomalies/small glitches from Mosaic that are still present up to this day), even though the code got refactored a lot and Mosaic hasn't been mentioned in IE7+.


> and even all later IE and Edge contain anomalies/small glitches from Mosaic that are still present up to this day

Have any resources documenting this? I'd be curious to find the building blocks of an old empire in some modern skyscrapers.


Start there: https://en.wikipedia.org/wiki/Internet_Explorer#History

There are various sites that had articles about early IE history. And I suggest you install the Mosaic browser on Win7 32-bit and try it out yourself - you will see how familiar almost everything is, if you know IE from everyday usage. Not just familiar: little things, traits and glitches are still in today's IE and Edge.


> KHTML (the origin of webkit and blink) was started in 1998, only a year after Gecko. Of course, probably not much (if any?) of the original KHTML source is still present in webkit

At least this part is still there: http://www.hanshq.net/roman-numerals.html#khtml


The original Opera engine started sometime in 1993; the last update to Opera Classic was a year ago.


You are still on KDE 1?


Maybe because upgrading would require a reboot. :)


No, "run" is Past Simple here. It took me a few moments to get that though.


Is that an Americanism? Ran is most common in the UK.


I think they omitted 'had' from run. The past tense of run is 'had run', 'was running' or 'ran' depending on how you want to use it after all.


Thanks for correcting, I meant the past tense, not a native English speaker, fixed run -> ran.


Yaggo had 'run' originally and has since edited the comment to read 'ran'.


Ran is correct. I suspect English is not their first language.


Not a chance. This is something we can agree on. :)


I remember using Konqueror+KHTML much more recently than KDE 1.0 too :-). At one time it worked quite well.


There is a really interesting podcast episode about Rust and Servo: https://changelog.com/podcast/228 with https://en.wikipedia.org/wiki/Jack_Moffitt

Definitely worth listening to, tons of info there.

From the show notes, an interesting presentation: https://docs.google.com/presentation/d/1-FSfNO-oT9Wqo2swvm6U...


I'm so happy to read this and all I can say is I hope they succeed - the faster, the better. I used to run Firefox and loved it so much, but it started falling behind in performance so much that I had to switch out of necessity. I haven't really seen it improve much over the years. On the contrary, Safari, while it always felt a bit slower than Chrome, is feeling faster and faster with each update.

Working at a company where I occasionally create presentational sites, I often have to create and optimise animations that some designer thought would be so very radical and the two browsers that are extremely painful to support are usually Firefox and Internet Edge-splorer (pardon the pun).

I have animations that run at a smooth 60fps in Chrome and Safari effortlessly, quite often producing 1-2 FPS in Firefox. Their updates are quite weird too: across the last 3 updates, I've seen the project I was working on go from running badly enough that I had to fall back to the mobile-lite version on FF, to running acceptably (30+ fps), to running like crap again.

And maybe we'll see 16 year old style rendering bugs get fixed finally, too? Please, pretty, please?

After ripping on it a bit, though, I have to say: I'm looking forward to running Firefox again in the future. As much as I like Safari at the moment, it's closed source and I don't like being exposed to Apple's whims (especially lately), where if they decide to destroy it by making it unusable I'm left out to dry - and quite frankly Google Chrome just creeps me out now.


> I have animations that run at a smooth 60fps in Chrome and Safari effortlessly, quite often producing 1-2 FPS in Firefox.

Can you give an example? I'm sure Mozilla's animation people would love to hear about this.


Is this an announcement of a change in strategy? I'm a big fan of rust and Servo, but my understanding was that it's still "experimental" and they're learning stuff about concurrent rendering and stuff like that, with no concrete plan to use Servo in Firefox (or make Servo a standalone, supported browser). I know there's a few odds and ends in Firefox that use rust, but I thought the future of Servo was still ambiguous. Am I (was I) wrong about that?

But it sounds like now the official plan is to get Servo into Firefox. Great!


It's called Project Quantum, they announced it a few months back, as a formal project to integrate Servo components into Gecko.


Ah, great. Must have missed that announcement. Thanks! I noticed the blog post using that term but didn't realize it was from an old announcement.


I think calling Servo "experimental" is somewhat misleading as it stands today. The initiative to implement an engine in Rust is no longer an experiment. The language has gained some serious production traction in the last year or so since Rust reached 1.0. While Rust isn't as "popular" as Go, the Rust community is strong.

I think, however, it is better to interpret this as Mozilla getting more serious about replacing many parts of Firefox with Rust. Mozilla already started doing that last year [1].

[1]: https://hacks.mozilla.org/2016/07/shipping-rust-in-firefox/


While writing a browser in Rust might not be, per-se, particularly experimental anymore, nobody has written a browser engine from scratch since KHTML in 1998, and with the growth in complexity of the web since then it's unclear whether one can write a new browser engine from scratch without spending years reverse-engineering the competition—we've come a long way in the past decade in better defining many historically undefined fundamental parts of the platform, particularly in HTML5 and CSS2.1. At the same time, there are still parts of the platform that remain largely undefined.

Apple's work on WebKit for several years after forking KHTML was essentially just reverse-engineering IE and fixing site-compatibility bugs; Opera had maybe 10% of all people working on Presto up until the end largely reverse-engineering the competition and fixing site-compatibility bugs; etc., etc., etc.

That's not even the only interesting question: nobody has tried to parallelise layout before, and how does that interact with things like the Grid Module? How about Houdini?


> it's unclear whether one can write a new browser engine from scratch without spending years reverse-engineering the competition

That's to Mozilla's advantage, given that no company in the world has more historical experience reverse-engineering browsers. :P

> how does that interact with things like the Grid Module?

Servo devs have definitely identified places in the HTML spec which, entirely by accident, require serial behavior. Having a pervasively-parallel browser will hopefully go a long way towards preventing such accidents in the future.


> That's to Mozilla's advantage, given that no company in the world has more historical experience reverse-engineering browsers. :P

I'd claim Opera (or rather the consortium that now own the Opera browser) have the most experience of reverse-engineering browsers, given Firefox's market-share solved a lot of the site-compat issues a long time before Opera gained market-share (oh, wait, it never did significantly after the rise of IE). On the other hand, almost nobody from the old Presto team is still at Opera, so… ;P


Wow, I think the comments section below was ugly.

I have been worried about Firefox and seriously questioned some of the decisions along the road but I think and hope I always respected the devs who volunteer for this.


Call caniuse, flexbugs, quirksmode, stackoverflow and its companion codepen on deck, there will be some action!

Creating a renderer is 90% of the work; the other 90% is documenting its bugs and quirks.


Correct. That's why it's such a massive task. Servo is pretty decent at supporting a lot of web standards now. Unfortunately, the real web is made of quirks and implementation-targeting workarounds.

I think the biggest lesson from Servo is that the barrier to entry into the web market is already much higher than we would like it to be.

Having a great new web engine supporting 100% of web standards does not get you even close to being ready to serve the web.

And the willingness of the web developer community to take the "one-browser-only feature" bait is a major part of the problem.


I for one hope we're getting better at that: we're sharing more tests across browsers than ever before, and I think that's quite possibly going to end up being the de-facto norm for all new tests this year.

Of course, that doesn't mean that we'll get rid of all bugs, but it does mean that more bugs should be found before shipping.

(As a disclaimer: this is essentially the majority of what I've been paid to do over the past year-and-a-bit.)


Good to see some internal motivation from Mozilla's side for a bridge between Rust and C++. This will be needed to drive major adoption of Rust.
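
Roughly, the bridge works by exposing Rust over the C ABI so existing C++ can call into it. A minimal sketch (the function name here is made up; the real Gecko/Servo bindings are generated and far more involved):

    // Rust side: no name mangling, plain C calling convention.
    #[no_mangle]
    pub extern "C" fn demo_rust_add(a: u32, b: u32) -> u32 {
        a.wrapping_add(b)
    }

    // The C++ side declares it as:
    //   extern "C" uint32_t demo_rust_add(uint32_t a, uint32_t b);
    // and links against the compiled Rust static library.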


It will definitely be welcome to see the aging Gecko replaced. The nightly build of Servo barely renders a number of websites and won't render more media-laden sites at all (as you'd expect). I hope this merge via Quantum[1] will be forthcoming soon, so we can test it.

[1] https://wiki.mozilla.org/Quantum


Wouldn't it be easier to build a new plane around the new engine, do some flight tests and after it's ready transfer all passengers?


That has two problems:

1) A lot of flight tests can't be done without a large fraction of passengers on the plane.

2) It may make getting to the final end state faster (though maybe not), but it means you don't get any benefits until you make the switch. Doing things incrementally means you start seeing benefits much earlier. Classic throughput/latency tradeoff.


What's better: ending up in the final end state later, or losing customers because your software is crashing and is harder to maintain because you're introducing a totally new way to do stuff?


That's a leading question and a false dilemma.


I'm not sure what sort of distinction you are trying to make here. Again, the two options are:

1) One big switch. Can probably be done faster, but has more risk that big problems won't be discovered until late in the game, and has no user-visible benefits until the switch happens.

2) Incremental changes. Takes longer to complete, but allows for better mid-course correction and can show user-visible benefits much earlier in the process.

Which of those will lose more customers? It really depends on the market reality and on how successfully the incremental changes can be made.


Even though the title might disappoint, it's super interesting to see where Firefox is going. I can't wait for 2017!


Looking forward to it too; this is a big deal because it will put a Rust implementation on many millions of desktops. I'm totally fed up with CPU and memory hogging browser bloat on my laptop. My dev env runs so much better when I kill all Chrome processes. I know that the article is only talking about the render engine, and not JavaScript, which I suspect is the main browser bloat culprit, but I'm still hopeful. Bring on the Rust JS engine I say!


It's not going to change that. No matter how efficient you make the JS engine, crappy JS will still steal all your resources.


Yes. However, a crappy, leaky JS engine running crappy JS code will steal all resources even quicker. I hope a JS engine built with Rust smart pointers will be less leaky than one built in C/C++. In my experience, hand-rolled memory pooling systems can be very leaky.


We have extensive leak checking in our testing for Firefox, so hopefully the JS engine is not "very leaky".

One part of Quantum, which does not have anything to do with Rust, is to figure out how we can spend less time running unimportant JS (like maybe in a background tab), which will hopefully reduce CPU usage.


There's no plan to write a new JS engine at Mozilla, FWIW.


And they shouldn't...let's move forward... to WASM (Web Assembly)!

Also posted on the front page of HN recently: a user's Medium post on compiling Rust to WASM. https://medium.com/@chicoxyzzy/compiling-rust-to-webassembly...


Except, like, JS isn't going to die any time soon, and realistically, implementing a new JS or WASM VM isn't actually that interesting to do: the big risk in a JITing VM that Rust cannot save you from is the behaviour of your JITed code, which makes memory safety far harder to enforce.


Right, JS isn't going anywhere. But once WASM is able to run everywhere, anybody attempting to make high-performance web apps would ideally want to write them in WASM. A lot of what JS does on the web is plenty fast and won't really benefit much from JS being multi-threaded/faster.

Sites using JS to run simple scripts or to fire api calls/etc. will continue to use JS, no reason to use WASM.

Also, I believe WASM is truly compiled code and not JIT-run? I may be wrong... but pretty sure that was the whole point of creating it. Native app performance in the browser... that will take a big bite out of the app stores that take a good cut of revenue and control what can and cannot be in the store.

And sure, WASM can also have memory safety problems, especially with C/C++ being the primary languages to target WASM from. But that's also where Rust would shine, making it much more safe when compiled to WASM.


WASM isn't machine code: if it were, it wouldn't be universal which is a fundamental requirement of the web. (Well, okay, it's compiled code—it is, after all, compiled from C/C++/etc., but in the same way JS is compiled to byte code in VMs today.) As such, there's a further compilation step needed from WASM to machine code to get near-native performance out of it.

The existing WASM implementations do a mixture of JITing code at load-time and JITing code at first-call, depending on the size of the WASM blob, AFAIK. The gain performance wise over asm.js is primarily in parse-time, as far as I'm aware.

The WASM VM itself can perfectly well be memory safe: WASM code cannot read outside of the memory allocated to it (and malloc/free are implemented on top of that memory allocated to the VM, hence there's memory safety at the VM level).

The big problems come when you need to guarantee your JIT code doesn't violate memory safety, and that's something Rust's type system cannot (currently) solve. You need guarantees that the generated code will never have any memory access errors, and will never race for memory reads/writes, because it's running with the privilege of the VM, not the limited powers of the WASM code running within it.
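
To illustrate the VM-level point: a WASM "pointer" is just an offset into one linear buffer owned by the VM, and every guest access is checked against that buffer. A toy sketch in Rust (not any real engine's code):

    // Toy model of WASM linear memory: guest addresses are offsets,
    // and out-of-range accesses trap instead of touching host memory.
    struct LinearMemory {
        bytes: Vec<u8>,
    }

    impl LinearMemory {
        fn load(&self, addr: usize) -> Option<u8> {
            self.bytes.get(addr).copied()
        }

        fn store(&mut self, addr: usize, value: u8) -> Option<()> {
            self.bytes.get_mut(addr).map(|b| *b = value)
        }
    }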


I wonder if it would be possible to write a JS engine in Rust that would essentially make multi-threading JS possible... Use Rust's concurrency safety to pass resources around threads running JS. Doing this without making any changes to JS. Of course Node would then also be obsolete...


The challenges in parallelizing JS have very little to do with its implementation language and almost everything to do with JS itself. Rust can't help you there.


I consider the lack of parallelism[0] in JavaScript as a feature of JavaScript.

[0] Note: parallelism is not the same as concurrency.


JS does have proper parallelism, workers can run on separate threads and could be run on separate processors. There aren't as many primitives for dealing with cross-worker communication, but some exist and it's usable.


JavaScript does not have parallelism!

Browsers implement web workers and a "system call" which lets users initialize them through JS. A different implementation/deployment environment like Node does not have web workers. Through native modules, Node can have parallelism, but this is not a part of the language, and Node isn't obliged to implement web workers to be considered "running JavaScript". However, regardless of this, JavaScript continues to have concurrency without parallelism. It would have been more accurate for my last message to have said: "I consider the lack of parallelism in JavaScript a feature given that it continues to have concurrency by default".

Similarly, I'd argue that C does not have parallelism (just system calls ...), but C doesn't even have concurrency. As such I do not consider the lack of parallelism a feature of C.


I guess. I was thinking from the POV of web browsers.


JS engines are really fast. The stuff they run is bad.


Yes, nowadays I think it is a fact of life like death, taxes etc.


I'm hoping it spurs a resurgence in real desktop apps, then we shouldn't have to worry about bloated javascript.


I think people forget the rather overwhelming amount of crappy bloated desktop apps we had (and still have). There isn't anything inherently bad about JavaScript. Web apps have strong points and weak points, and should be used where it makes sense.

But the developers writing bloated code are most visible in the latest shiny thing, which happens to be the web and JavaScript. Before that it was mobile apps.


Rust is neither faster nor more efficient than C++.... its benefits lie elsewhere.


It's more subtle than that. Different languages promote different idioms, in particular via their standard library. C++ is particularly unusual in that you don't get much out of the box, and in older projects even things like strings and containers are often custom-built.

The combination of language and library provide affordances that make some idioms easier to represent, and others harder. This influences the design of code written in the language. Clumsy idioms generally don't get far, even if they would be more performant or provide some other benefits, like promote easier concurrency. And sometimes there's a tension: e.g. mutable structures tend to be more performant but harder to add parallelism to, and harder to reason about (leading to more copying than strictly necessary). Rust in particular has different ideas here.

I think the jury is still out on whether Rust will in practice tend to promote code that is faster or slower than C++. Of course any given project in Rust could be rewritten in C++ to be just as fast, or perhaps faster. But that's not what happens in the real world. Human factors matter, path dependence matters, affordances matter.


While literally true, Rust's benefits make it easier to write safe, parallel code, which often translates directly into greater speed and efficiency.


A big part of Rust is that it lets you very easily parallelize things, even in super complicated situations, because it keeps it safe. The corresponding task might certainly be possible in C++, but not necessarily always maintainable. Doing Servo's parallel styling or layout in C++ would be very hard.

So it doesn't produce more efficient machine code, but it does open up new opportunities for optimization at a higher level.
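
As a tiny illustration of the "keeps it safe" part: splitting work across threads only compiles if the compiler can prove no data races are possible. A minimal sketch (plain std, nothing Servo-specific):

    use std::thread;

    fn main() {
        let data: Vec<u64> = (0..1_000_000).collect();
        let mid = data.len() / 2;
        let (left, right) = (data[..mid].to_vec(), data[mid..].to_vec());

        // Each thread takes ownership of its half; sharing a mutable
        // reference across both threads would be a compile-time error.
        let a = thread::spawn(move || left.iter().sum::<u64>());
        let b = thread::spawn(move || right.iter().sum::<u64>());

        let total = a.join().unwrap() + b.join().unwrap();
        println!("sum = {}", total);
    }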


Rust should be just as fast as C++ and C. Benchmarks are hard, of course, because of optimizations and such. But if Rust is significantly slower, then it is a bug.


I think you may have misunderstood my comment, as I did not mean to disparage Rust's speed nor imply it was slower than C/C++. Rust, at best, can match C/C++. But Rust cannot be faster than C/C++; therefore, merely being written in Rust (as opposed to the benefits of any rewrite) does not, in and of itself, make the resulting executables faster.


There is no proof that any existing C/C++ implementation can produce code that is globally optimal. I wouldn't say that it is impossible for Rust to beat C/C++...there is plenty of room for improvement in compiler optimization engines.

I would even say that the room for improvement is potentially greater for Rust than it is for C/C++, due to the type system. It hasn't been taken advantage of much yet, but MIR is one of the first steps to do so. It's an intermediate-stage representation, like LLVM's IR, but with all type resolution intact. There are plenty of known and even formally proven code optimizations that can only be used if certain guarantees are proven (such as the guarantees made by a strong type system).
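
A small, concrete example of the kind of guarantee I mean: a Rust `&mut` reference is statically known not to alias anything else, information the compiler can hand down to LLVM, whereas the equivalent C++ pointers must be assumed to potentially overlap (illustrative snippet, not from any real codebase):

    // `dst` and `src` can never point at the same memory, so the compiler
    // is free to keep `*src` in a register across both writes. With
    // `int64_t* dst, const int64_t* src` in C++, it cannot assume that.
    fn add_twice(dst: &mut i64, src: &i64) {
        *dst += *src;
        *dst += *src;
    }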


Rust _can_ be faster than C or C++. The possibility is different than always, though.


I hope it pans out and doesn't become the next "java will catch up to c one day".


Unclear on the audience for this post. All the "jet engine" stuff sounds like it's aimed at consumers, but this is of no interest to average consumers; instead, shouldn't this be of interest to devs who build web apps or embed rendering engines?


Note to author: please change your font, it's really hard to read on my screen. http://i.imgur.com/C9RdFHv.png Windows 10 + Chrome.


I am very happy to see C/C++ getting replaced by something more modern and safer, such as Rust

(though the debug version in Rust is probably slow - that will be something interesting to see)


Curious about the results. Best of luck to the team.


They are still using many unsafe blocks and there was a use-after-free before :P


Yes. But the point is any unsafe code can be reviewed separately and be subject to more stringent checking than the majority of code which is safe.

I've been looking at the HTML parser they are writing (servo/html5ever), which is incredibly impressive and, in my tests, parses the hideous Daily Mail homepage in < 30ms. It uses the tendril library, which contains lots and lots of unsafe code, but that library sets out to do a very specific thing, which is to make parsing strings much, much more efficient. This is a desirable design tradeoff; as long as tendril can be shown to be robust, it's a good decision over a slower parser.
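
The general shape is that a small, heavily audited unsafe block sits behind a safe API, so callers can't misuse it and reviewers only have to stare at a few lines. A made-up miniature of the pattern (nothing to do with tendril's actual internals):

    /// Returns the first byte of a string slice, if there is one.
    pub fn first_byte(s: &str) -> Option<u8> {
        if s.is_empty() {
            return None;
        }
        // SAFETY: we just checked that the slice is non-empty,
        // so index 0 is in bounds.
        Some(unsafe { *s.as_bytes().get_unchecked(0) })
    }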


The use after free basically had to do with spidermonkey, which has a GC, and is not 100% safe to use in Servo (we try to make it safe).

The quantum projects don't involve moving Servo's DOM into Firefox (they do involve implementing lessons learned from Servo's DOM, but not the DOM itself), so this isn't an issue.


Did the comment section on this website look weird to anyone else?


Makes me wonder what's going to happen to Presto now that the Chinese bought Opera ASA...


Presto was only still being used in Opera Mini, it had been swapped out for Blink in the desktop and full-fat mobile versions.


And in principle is still supported in some embedded devices, AIUI.


I know, but I'm replying from a Presto-powered desktop browser :) .

People have been asking for open-sourcing of it ever since the switch happened, but I guess we'll never see that happen.


Realistically, at this point, I'd say it won't happen. I wouldn't have ruled out Opera Software ASA open-sourcing it once it was a mere historical curiosity, but with it having gone to the new owners of the consumer browser business, I can't see it happening.


What do folks think of internetisms that the author uses like "Because Mor Better Ideas™."?

I see a number of engineers use broken phrasing like that in posts and presentations and it comes off really badly to others not in on the "joke". Does anyone really find that sort of thing actually funny or additive or is it the equivalent of trite office humor gone online?


It's a bit like sarcasm, even if less grumpy. But unlike conventional sarcasm, which is both best (as in enjoyable to the type of person who enjoys sarcasm) and worst (as in completely defeating communication for those who don't) when it is borderline unrecognizable, these "internetisms" don't need to play hide and seek to be good at what they are. In the servo post, extra capitalization and an ironic "™" is used for tagging, but if one ever felt the need to further reduce subtlety, any amount of emojis or "funny" typography could be added without diminishing any quality it might have (or exacerbating lack of quality, depending on where you stand). What I try to establish: amongst colloquial witticisms, this pattern is probably one of the most harmless ones.

Do I personally like it? I sure would not say that I love it, but I do enjoy reading texts that are less impersonal than a press release. Patterns like this can help identify them, so I tend to be happy when I see them. What I hate, and I feel tempted to claim that all of humanity is with me in this, is when these patterns are used to engineer press-release-type communication into appearing like something different.


I think it shows who the author had in mind while writing the piece. If you use memes etc. then you are writing for an 'in crowd' that understands the same cultural references as you. That's OK, but you should be clearly aware of it.

The one thing I would have appreciated from the post is for the author to have written about the "Mor Better Ideas" that make Servo better than Gecko.

To me that particular phrase in this particular post is a little out of place and lowers the tone slightly. I expect that to others it will be seen as fun/human and indicating that the post isn't written by a corporate drone.

YMMV


I enjoy reading words which indicate the person who wrote them is not entirely dead inside. It provides, i feel, for a Mor Better Reading Experience™


I'll be dead on the outside eventually as Grim Death's grasping claw squeezes the life out of me. Please, my time on earth is precious and I would rather spend it with my loved ones in preference to skimming over forced references.


Is it a reference? If it is I did not get it, but I also did not miss it at all because I felt that it was quite clearly established as a shorthand for "improvements in the state of the art of writing software in general and specifically browsers, accumulated over the past two decades, which I don't want to describe here in detail, trust me that they exist or not, your choice" on first use. That shorthand was a considerable reading speedup for me, reference or not (if it is a reference, depending on what it references it might make reading slower for those who do get it - maybe I was just lucky?).


Oh yeah, that 0.5s you lost while reading that totally ruined your day, enough to make you waste even more time to comment on it.

It's an informal blog, I read the same reference, thought, "must be some joke I've missed", and got on with reading the rest of the blog.


Will you lie on your deathbed recalling all the times you spent reading forced memes and thinking, "… for teh win!!1"?

That blog is informal in style but not in content; it's a professional blog by a Mozilla employee. I don't feel like it respects its audiences' time, which is sadly the norm for software-related technical writing. Whings won't get better by saying nothing, and asking people to improve is never a waste of time (if done nicely).

Related: Normalization of deviance in software: how broken practices become standard[0]

[0]: http://danluu.com/wat/


*Things (whoops)


Contrast "this should be done better" against "this should be done Mor Better". The intention, I think, is to be less abrasive and "something else". Really I'm not even sure how to describe it, which I think is actually why people use such formulations; it does add at least some connotative meaning which is hard to express explicitly.

Basically, I do find it "additive", but it's obviously only feasible in very informal contexts and the percentage of the audience for whom it is not additive (and possibly quite jarring) might be quite high.


I've been annoyed by the capitalization and (TM) thing since I first started seeing it on slashdot (maybe it was fish and chips then) in the late 90s. But live and let live. At least we've dropped the micro$oft thing.


It's annoying and unprofessional. Stop trying to be cutesy and present the information in a way that people who are busy can consume quickly and make decisions on.

Is this the tone of the whole project? I'd love to contribute, but I don't want to wade through a reddit-comment-thread-level of forced references and memes to do that.

I get the reason for doing it, and I like that the internet isn't full of drab corporate-speak all the time, but this piece just missed the mark I think. The title doesn't really communicate the intent well and instead feels like its trying to be a little clever for the sake of being clever. It's really a two or three sentence press release that got fluffed up with some non-corporate language.


> I'd love to contribute, but I don't want to wade through a reddit-comment-thread-level of forced references and memes to do that.

I'm going to have to call you out on this. You have no wish to contribute. Because asking this question reveals you haven't even bothered to look at the repositories, join the IRC channels, or subscribe to the mailing lists.

You're here to complain and you're using hypothetical constructive motivations to masquerade that fact. State your opinion without the fussy lie, there was really nothing wrong with it.


I love this comment.

The term for the grandparent comment is concern trolling.


>It's annoying and unprofessional. Stop trying to be cutesy and present the information in a way that people who are busy can consume quickly and make decisions on.

The world in general, and personal blogs in particular, doesn't exist to serve people who are "busy".

And being flippant on one's blog has nothing to do with being "unprofessional".

In fact, the blog where TFA is from is not a business or professional endeavor to begin with. So "professionalism" (however some provincial people conceive it) is not a requirement for it.

Not to mention that billions are made in fully professional endeavors that actually include such references and humor (from Google's easter eggs to funny messages in update notes in various apps, etc).


I think a lot of engineers may be hesitant about appearing boring in prose, because they haven't had a lot of experience/training in writing. Also, they may not be confident that external people will be interested in the project. They try to build/maintain interest by sprinkling it with these asides, forced references, and analogies (aping the style of other popular writers/entertainers who, unfortunately, aren't presenting technical information in a professional environment). They're probably aware of the drab corporate-speak & aggressively alter their style away from it to avoid being mistaken for it.

Writing clear, expressive prose is hard! Plus, it's not taught as part of software engineering & it's not required to be a great engineer.


> Is this the tone of the whole project?

Absolutely not.


The submission title has been changed from "Replacing the Jet Engine While Still Flying" to "Mozilla’s project to replace Gecko with Servo" (at the time of this writing).

Some of the other comments are pertaining to the "Jet Engine" title. For anybody that might be confused, like I was :)


Unfortunately the new title seems to be misleading in a different way. There is no "project to replace gecko with servo". There is a project to advance gecko so it is state of the art in browser engines. Much of the current state of the art has been developed in Servo, and the natural way to bring that technology to gecko is to integrate the components from servo directly. This has other nice properties like increasing the Rust-to-C++ ratio, which we believe will be a win for safety and maintainability. But it is nevertheless components, rather than the whole shebang. Servo, for example, only just got document.write support and still (as far as I know) doesn't have a correct representation of document history. So it is some way from being a production grade engine for use on the general web.


Ok, we've taken another shot at a more accurate title by saying "transitioning" instead of "replacing", as suggested here: https://news.ycombinator.com/item?id=13328415. If anyone suggests a better title, we can change it again.


Well, you can write individual components in Rust and put them into Servo and then refactor the existing C++ to make it modular enough for those components to be replaced by the Rust version.

Then eventually you'll be able to replace enough components that it's all Servo, and no Gecko. At that point Servo will be functionally complete, as its individual components will be on par with the Gecko components.


I think it's unclear whether everything will get replaced by Rust components from Servo: certainly, smaller components are likely to get replaced, but larger interwoven parts are obviously going to be far harder to replace. I suspect not all of the Rust code that replaces the old C++ code will be from Servo: some will be written specifically for Gecko.


Firefox has already committed to eliminating XUL components. Once it has the correct APIs to emulate existing add-ons, mobile Firefox could probably switch over to Servo. Desktop has a lot more cruft and features, so it will probably be harder to switch.

But I wouldn't be surprised if Servo is usable on mobile in 2017.


I would be astonished if Servo is usable as a mass-market web browser in 12 months. Writing a rendering engine is hard because there is quarter of a century of legacy to deal with. By the metric of "can I use this for 100% of my browsing needs" Servo is far from being complete.

Much of Mozilla's "quantum" work is actually about improving C++ code e.g. the Quantum DOM project [1], which is a drive to improve scheduling so that a single background tab, even in the same process, can't jank the tab you are currently using [2]. Importing servo components wholesale simply wouldn't help here because Servo doesn't have a sufficiently complete DOM implementation, and I'm not sure the implementation it has contains all the desired properties.

[1] https://billmccloskey.wordpress.com/2016/10/27/mozillas-quan... [2] I am occasionally taken to refer to this project as "Presto: the good parts"


Putting Servo into production and letting it face attackers would be a thorough test of the viability of the memory safety claims.


Rust is already used in stable Firefox. https://hacks.mozilla.org/2016/07/shipping-rust-in-firefox/


Fx has enough holes already to draw attention away from the Rust modules. >:]


Does that mean that the original title was bad, as the HN guidelines states, "Otherwise please use the original title, unless it is misleading or linkbait."?


I genuinely don't get all the hoo-haa about titles. In pretty much every comments section I read here, there will be at least one person who complains that the title is "linkbait" or (more often) "clickbait" - even for the most boring, anodyne description of the article. Titles aren't allowed to seem interesting in any possible way.

The original title wasn't particularly descriptive. That doesn't mean it was misleading, and actually, I thought it was very apt: they're talking about a technology to swap pieces of Servo in over time, and the jet engine analogy was appropriate (particularly with respect to the complexity). I quite liked it.


I think the concern here is more that the blog of a "Designer Advocate at Mozilla" adds context which is totally lost when "Replacing the Jet Engine While Still Flying" appears on HN. It's a great title for the blog post, but it's both misleading and uninformative when pulled verbatim from its natural habitat.


There is a real concern among a number of users here, including me, that HN might turn into reddit if we aren't careful.

That said in this case I agree with you.

I also think we are mostly fine for now but I still hesitate to upvote funny comments.


Would you care to elaborate? As someone who spends about half my free time on both websites, I am experiencing a mixture of hurt and intrigue.


It is a long-standing fear amongst HNers, because HN was created as an alternative space due to reddit declining. [1] It's such a prominent thing that it's in the site guidelines: [2]

> If your account is less than a year old, please don't submit comments saying that HN is turning into Reddit. It's a common semi-noob illusion, as old as the hills.

1: https://techcrunch.com/2013/05/18/the-evolution-of-hacker-ne...

2: https://news.ycombinator.com/newsguidelines.html


On the HN frontpage per se, titles are pretty much the only content, so the bias is obvious.


Perhaps there was concern the audience would assume Mozilla had pivoted into the aerospace industry.


The title wasn't misleading at all, I totally understood it was about web browser rendering engines.



Why is the title back again?

No one can understand what the link is about with just engine and jet as nouns in the title.


I've been on this Rust hype train for a long time. This move by Mozilla is probably going to be the biggest success story, one that might give Rust a lot of attention. I hope the new engine is fast as heck and uses less battery in all my devices. Please!


If performance is not competitive, it will not be much of a success story, but it will be more interesting to see how much better it will be in terms of security issues.


It was better 2 years ago (with caveats) - https://www.phoronix.com/scan.php?page=news_item&px=MTgzNDA

But there's definitely place for speed gains due to multithreading.


After years of the multicore hype, we finally can reap its benefits.


>"If performance is not competitive"

I don't think this will be a problem. One example...

https://www.youtube.com/watch?v=BTURkjYJ_uk

https://www.youtube.com/watch?v=erfnCaeLxSI


Not if in the process they'll make all the existing add-ons incompatible and therefore lose all their remaining market share.


I disagree. If they are able to deliver a browser that's much faster and consumes less battery, I believe that would be enough to make a lot of users change their browser.

Maybe I'm wrong and I'm not a good example, but the only add-ons I have installed that I really care about are 1Password (which is easy to live without, as I can use the stand-alone application) and HTTPS Everywhere (which just means that I have to be extra careful when browsing).

I'm pretty sure that there are a lot of internet users that do not even have add-ons in their browsers, but I doubt that there's any user who would not care about browser speed/battery life.


AFAIK, about 40% of Firefox users don't have any add-ons.


Furthermore, the recent e10s transition was already a tree shake of the long tail of ancient add-ons, in terms of how many still had old bindings and used ancient versions of the NPAPI.

Then there is the fact that every other browser has dropped, or is in the process of dropping, any notion of direct native add-on APIs, and an increasing majority of actively developed add-ons for all browsers is now modern "Chrome-style" JS+HTML components.


You seem to be confusing NPAPI plugins with browser extensions.

Firefox doesn't allow addons that contain binary XPCOM components anymore. Everything has to be JS, whether the addon is XPCOM based, addon SDK (JetPack) based, or WebExtensions-based.

NPAPI is deprecated and is to the point where it is only used for Flash.


Sorry, yes I was using NPAPI as a handwaving shorthand for both NPAPI and binary XPCOM to save time. I appreciate the extra technical detail, though. (Because technically correct is the best kind of correct.)


I was honestly expecting an interesting story about a real jet engine being replaced in flight.


Some piston-engined aircraft had crawlways in the wing to permit in-flight engine servicing (the Ju-90, XB-15 and B-36 come to mind) but sadly I'm unaware of any jet aircraft with such a clever feature. Potentially the YB-49 flying wing, but I haven't located a cutaway to confirm.


The story changed its title from being about Gecko and Servo to this literally in between me viewing the frontpage and clicking on the comments link.

This story IMO should be exempt from the "original title" rule - the original title is annoyingly misleading.


Leave the title, append it with "Transitioning from Gecko to Servo".


Yup, as soon as possible. I was also looking forward to reading about a jet engine being replaced in mid-air.


Ok, we'll use that wording instead. Thanks!


Me too! I was like "Whoa that's a novel idea...and it is about browsers...damn!" I was already conjuring up crazy scenarios similar to in-flight refueling.


Same here. was click-baited to another Rust article :D


I was disappointed after reading the title when I saw the article...


Yeah, talk about setting people up for a letdown. :) I was expecting some insane experiment from the golden era of aviation.


And the analogy is a bad one, too. I think it's trying to make the task sound more difficult than it is.

If we stick with the airplane analogy, maybe this is better: it's like incrementally converting an older model jet engine to a newer model jet engine in between flights. Every incremental version must work correctly, which isn't easy, but the upgrades are definitely performed on the ground.


Wouldn't it be more appropriate to have written rust in a GPU language like OpenGL or OpenCL, since the aim here was to replace the display engine?

Curious as to why use a CPU language for the display engine rewrite? It seems to be wasting the GPU all modern systems now contain..

Also, is it fully replacing the display engine, including the font renderer and image codecs?


Servo is not a "display engine". The term is "rendering engine", but that's an inaccurate bit of terminology, most of what a rendering engine does doesn't actually deal with rendering. A better term is "browser engine".

Parsing, styling, the DOM, running JS, handling networking, computing layout are all things that a browser engine does that don't involve rendering. Many of these can't be done on the GPU, and for others the GPU brings no additional benefit because it's not the kind of load the GPU can magically parallelize.

Servo's rendering stack does make extensive use of the GPU. https://github.com/servo/webrender/ (talk by patrick in https://air.mozilla.org/bay-area-rust-meetup-february-2016/) There's also work on glyph rasterization on the GPU going on right now.


Servo makes incredibly good use of the GPU. Being able to do that, and iterate so fast on that work, is part of what Rust brings to the table.

(I don't think they've swapped out the font renderer and image codecs yet, but there are quite a few bits and pieces they have swapped out as the Rust community delivers more pure Rust libraries.)


> Servo makes incredibly good use of the GPU.

Too good, in fact: it won't run on any laptops that I own because their GPUs don't support a sufficient level of OpenGL.

One great aspect of Firefox-with-Gecko is that you can throw it onto just about any machine with more than 512MB of RAM and it will provide bootstrap web access ( slow or otherwise ). That's going to be lost when everything is Servoised. I guess it's back to links2 at that point.


How old are your laptops?

For ease of development, webrender2 is configured to compile with the latest OpenGl 4.x features. I believe there is a way to build it to target a lower OpenGl version.

Edit: Correction, its currently targeted at OpenGl 3.2. There's an open issue for bringing it down to OpenGl 3.1 which is the version supported by integrated Sandybridge GPUs. Considering 40% of people running Intel are using Sandybridge or older, that's probably why it wont run on your laptops.


Youngest laptop here is from 2011:

Intel Corporation Mobile GM965/GL960 Integrated Graphics Controller

OpenGL renderer string: Mesa DRI Intel(R) 965GM

Maximum OpenGL version for that chipset is 2.0, apparently, though I can't find a way to push it beyond 1.4 on Linux

I appreciate that six years old is ancient by SV standards but I would be interested to learn what proportion of Firefox users are on equally old hardware. I did try launching Servo with CPU rendering but it hung indefinitely.


Servo isn't a production ready consumer browser and it won't be for a long time because its purpose is to facilitate research into parallel browser engines. While a 2007 chipset may not be very old among the average consumer, it is probably not going to work for a developer machine that's going to be compiling a huge codebase under active development.

I don't think the software fallback has gotten much love but AFAIK it'll work fine once llvmpipe support is added to webrender2 [1]. Once that is done, rendering will work with the vast majority of laptops and desktops running Linux/MacOS/Windows. The OpenGL version supported by that chipset is over a decade old so devoting time to compatibility this early in the project would be a waste of time (and defeat the purpose since the standard predates the explosion of mobile devices).

[1] https://github.com/servo/servo/issues/13653


AFAIK that's not actually true, one option they have is going through Mesa's llvmpipe in such cases.

EDIT: to be clear, with the Servo nightlies you need to do this yourself but Firefox could bundle it eventually.


This is a misunderstanding of what a rendering engine does. It operates on at least two layers:

1. On the CPU, architecting the drawing commands and resource management for the GPU

2. GPU programs (in the form of API calls, shaders, output buffers, etc.)

Generally speaking, part 1 is not the resource-intensive part unless you also have to run an intensive algorithm on the CPU. If you see the renderer as a synthesizer unit, the CPU is the number of user parameters you have - the number of knobs and switches and sliders and patch points. You would typically only have a few knobs to turn, if you were writing a custom renderer for your application. But a complex, general-purpose renderer like that of the HTML DOM is more like a wall of patch cables; and within that, the Servo renderer found opportunities to make "risky" optimizations that parallelize, make tighter use of memory, etc., but many of them are only reasonable to do from within Rust, because of the additional checks it can perform.
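
In code terms, the CPU side mostly ends up building a description of what to draw, which the GPU side then consumes. A hand-wavy sketch of that split (invented types, nothing resembling WebRender's real API):

    // CPU side: walk the layout tree and record draw commands.
    enum DrawCommand {
        Rect { x: f32, y: f32, w: f32, h: f32, color: u32 },
        Text { x: f32, y: f32, glyph_ids: Vec<u32> },
    }

    fn build_display_list() -> Vec<DrawCommand> {
        vec![
            DrawCommand::Rect { x: 0.0, y: 0.0, w: 800.0, h: 600.0, color: 0xFFFF_FFFF },
            DrawCommand::Text { x: 20.0, y: 40.0, glyph_ids: vec![72, 105] },
        ]
    }

    // GPU side (elsewhere): translate that list into vertex buffers, shaders
    // and draw calls - that's where the actual OpenGL lives.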


It's more than just architecting the drawing commands and resource management. That's just the renderer.

The term "rendering engine" is misleading; in the context of browsers the rendering engine does basically all the core browser tasks. Basically, all the stuff needed to make a web page _work_ is part of the rendering engine. Stuff like history, bookmarks, URL bar, is part of the rest of the browser. See https://news.ycombinator.com/item?id=13331505

Servo's usage of Rust is all for the rest of the stuff. We do have a component, http://github.com/servo/webrender/, which handles rendering in particular. That indeed does the architecting of the draw commands and resource management on the CPU, and has a bunch of shaders handling everything else.


> Wouldn't it be more appropriate to have written rust in a GPU language like OpenGL or OpenCL, since the aim here was to replace the display engine?

That's not the aim of Rust. Rust, according to its own website is:

> Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.


Sorry I meant 'servo' not Rust.


Editing might be a good idea; it now sounds as if you have no clue what's going on, while in a subcomment you mention it was just a mistake (can happen).


This didn't deliver.


[flagged]


I don't know how to say this in a civil way, so I won't: Please take your sexist bullshit somewhere else if you can't behave like an adult.


Please don't feed trolls. Instead, flag egregious comments by clicking on their timestamp to go to their page, then clicking 'flag' at the top. (You need > 30 karma to see flag links.) We monitor the flags and take action based on them. In the present case, we banned the GP account.


You did know, but didn't want to, actually. Now we have this thread that appeared from nowhere. But the main point, besides what caught your eye, is that the article is written in a pretty lame style and is hard to read and take seriously.


I'm sure you didn't mean it that way, but that sounded a little bit sexist.


She tried a conversational writing style and failed.


Seems that you can't even combine 'she' and 'failed' in one sentence here on HN. Looking for alternatives, since the issue still persists.


Your parent comment is nearly void of substance and dismissive. It's okay to have a criticism of the the writing style, but without some additional information about why (e.g., examples, comparisons with similar styles of writing), it adds little constructive to the conversation. That in and of itself can draw downvotes.


The reasons were listed in our root comment, and the parent posted at a time when it was not yet flagged. Ironically, the parent comment even somewhat defended the author's lameness, but you can't see that now. I blame the blind power of SJWs here, not anything else.


I browse with showdead on and had read the thread prior to my comment. If the reasons are listed upthread, there's even less reason for the unsubstantive addition. Constructive discussions require adding something substantial. If the comment was intended to support the author in some way, it needs to justify that as well, or even make the case that that's the intent.


By the time they finally make the switch, Firefox's market share will probably have plunged to a very low single-digit percentage.


Actually, it's already there. Has been since last year.


Missed opportunity to name the CSS engine replacement project "Gangnam".


I just wish that their UI (UX? ugh...) team was as competent as their engine team appears to be.


What is your complaint, exactly? Considering it includes support for almost total customization of the user interface by third-party code, I think they've done an excellent job with the UX.


I expect that support to vanish under the dual guise of multi-process and "security". Never mind that, more and more, Firefox has been taking on the appearance of a Chrome clone rather than offering a distinct look.


Have you seen the browser.html project?


This seems apt: https://www.joelonsoftware.com/2000/04/06/things-you-should-...

Mozilla spending time on this suggests Mozilla doesn't know what to spend time on.


It seems you haven't read or understood either Joel's article or this article.

They are going for small incremental changes instead of rewriting the code from scratch, which is what Joel warned against. They aren't saying "fuck Firefox, replace it with Servo." Instead, the idea is to use Servo as a way to discover better layout and rendering components and transplant them into Firefox.

See the Strangler Application pattern (sketched below):

http://www.martinfowler.com/bliki/StranglerApplication.html
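
For what it's worth, the mechanical shape of that pattern here would be something like the following hypothetical Rust sketch: put an interface in front of the subsystem, keep the old implementation as the default, and route to the new one component by component until the old path can be deleted. The names are illustrative, not Gecko or Servo APIs.

    // Hypothetical strangler-pattern sketch, not actual browser code.
    trait CssEngine {
        fn compute_styles(&self, document: &str) -> usize;
    }

    struct LegacyEngine;      // stands in for the existing C++ path
    struct NewParallelEngine; // stands in for the Rust replacement

    impl CssEngine for LegacyEngine {
        fn compute_styles(&self, document: &str) -> usize { document.len() }
    }
    impl CssEngine for NewParallelEngine {
        fn compute_styles(&self, document: &str) -> usize { document.len() }
    }

    // The "vine": callers only see this facade, so the old engine can be
    // retired piecemeal once the new one covers enough of the work.
    fn pick_engine(use_new: bool) -> Box<dyn CssEngine> {
        if use_new { Box::new(NewParallelEngine) } else { Box::new(LegacyEngine) }
    }

    fn main() {
        let engine = pick_engine(true);
        println!("{} styles computed", engine.compute_styles("<p class=a></p>"));
    }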


I see what you're saying, however they don't seem to be "strangling" Gecko in the Strangler-pattern sense of the word. Rather it seems they're borrowing smaller pieces (Stylo) written in a foreign language and integrating them with Gecko as the main engine.

To me that sounds like the end-state is a Chimera and not a Lion, without any incremental end-user benefits along the way.


> To me that sounds like the end-state is a Chimera and not a Lion

This is an aesthetic judgment. There is no such thing as «a Lion» in the real world. Engineering is the work of building chimeras that solve your problems.

> without any incremental end-user benefits along the way.

The expected benefits are :

- speed (thanks to the massive parallelization permitted by Rust's «fearless concurrency»; see the sketch after this list)

- reliability (thanks to Rust's memory safety, which avoids crashes, and a better type system that helps reduce the number of bugs)

- security (thanks to Rust's memory safety to avoid exploits)

Components from Servo will be included in Gecko one by one, and the user will gain on these three fronts each time a new component lands.
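
On the speed point: as I understand it, Stylo's parallel style traversal is built on top of the rayon crate. A minimal sketch of that kind of data parallelism (hypothetical style computation, not Servo's actual code, assuming rayon as a dependency):

    use rayon::prelude::*;

    struct Node { class: String }
    struct ComputedStyle { color: &'static str }

    // A pure function of a shared, immutable node: the compiler guarantees
    // no thread mutates `nodes` while this runs.
    fn compute_style(node: &Node) -> ComputedStyle {
        if node.class == "warning" {
            ComputedStyle { color: "red" }
        } else {
            ComputedStyle { color: "black" }
        }
    }

    fn main() {
        let nodes: Vec<Node> = (0..10_000)
            .map(|i| Node { class: if i % 7 == 0 { "warning".into() } else { "text".into() } })
            .collect();

        // Swapping iter() for par_iter() parallelizes the traversal; data
        // races are ruled out at compile time instead of showing up as crashes.
        let styles: Vec<ComputedStyle> = nodes.par_iter().map(compute_style).collect();
        println!("computed {} styles", styles.len());
    }

The toy example isn't the point; the point is that the same guarantees hold when the per-node work is a full CSS cascade touching shared caches.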


Servo will still exist, so they'll always have the lion and the chimera.

Creating a new language in order to both incrementally change and clean-slate rewrite is an orders-of-magnitude bigger investment than either what Joel is criticizing or what he's advocating. I don't think that article is very relevant here one way or the other.


Nothing about that pattern says you need to replace everything... After all, not all vines completely replace the host.

Some legacy systems can just be maintained without modification, like the JavaScript engine, for example.


Ironically, Mozilla is the best counter-example to that Joel on Software article.

The article specifically argues that Netscape shouldn't have tried to rewrite their browser. But if they hadn't done that, we wouldn't have Firefox. We'd still have an incremental improvement over Netscape 4.7.

For those who can't remember Netscape 4.7, let me remind you: it wasn't very good at the end. If they hadn't rewritten it (prompting Joel to write that article to tell them they were wrong), there would be no relevant browser called Firefox today.


The complete rewrite wasn't very good initially either. It took them years to bring it up to par with Netscape 4.7, warts and all.

I attended a talk by a Mozilla developer a decade or more ago, shortly after they had released the rewrite. His comment on starting a project like that from scratch: "don't". What they intend to do now (slowly and gradually transition to Servo) seems consistent with an organization that has learned from that experience.


Ah, the "Mariner" cancellation debacle.


Netscape, as a company owned by AOL, pretty much died during the process of the rewrite, although parts lived on in the Mozilla Foundation.


Then again, Gecko itself started under Netscape; they just released it under an open license after the Mozilla project was started. So in a sense it's still an incremental improvement over Netscape 4.7, at least when it comes to the rendering engine.


Gecko was the rewrite of the rendering engine for Netscape 6. Netscape 4.7 (and earlier) didn't use Gecko.


Ah, you're absolutely right. Thank you!


> I guess we could land the plane, let all the passengers disembark so they can wander over and take other planes, and not provide any service for a while while we change the engines out… but no — no, we can't do that. Not gonna happen.

> We could keep flying the current plane, while starting from scratch and building an entirely new plane on the ground — special ordering every nut, every bolt, designing a new cockpit, and waiting many years before we get to fly that plane. But we don't want to do that either.

Mozilla is exactly following Joel's advice.


Mozilla was a rewrite of Netscape and Firefox was a restructuring of Mozilla.

The only reason they still exist today and have relevance is their willingness to rewrite.

I don't disagree with Joel, but his main point seems to be: be prepared to spend longer than you expect on a rewrite. That doesn't mean you shouldn't do it.


Wasn't Mozilla the open source version of Netscape Communicator, rather than a rewrite?


Initially Netscape open-sourced Mariner, which was on track to become the engine of Netscape 5 and was an evolution of the Netscape 4 engine. Netscape 4 lacked proper DOM support and proper CSS support. (The project to add DOM support was called Perignon, IIRC.)

Netscape had purchased another company called Digital Style. Their layout tech was called Raptor, then NGLayout, before being renamed to Gecko. Mariner / Netscape 5 was canceled and Mozilla migrated to Gecko, which was used in Netscape 6.

SpiderMonkey survived the transition from Mariner to Gecko and is used in Servo, too.


Netscape 4 was pre-XUL.

The Mozilla browser was Netscape 6 with XUL 'hotness' and massive UI lag.

Firefox was stripped down and fast (though XUL is apparently still around).


Mozilla started as a big rewrite of Communicator 4, which was then open-sourced at an early stage. It wasn't great because it concentrated a bit too much on rewriting the underpinnings (XUL, Gecko, etc.), resulting in a relatively poor UI.

Eventually a skunkworks project was started to drop all the non-browser components and focus on the UI, and that became Firefox.


You're probably right. An earlier version of Netscape was a rewrite.

Mozilla was a refactored version of Netscape (refactored to remove closed-source dependencies).


Others have pointed out that Netscape's rewrite gave us Firefox, so the rewrite was incredibly successful.


If you call Firefox incredibly successful.


As a charity with hundreds of millions in revenue and over 1000 employees, I'd call Firefox and Mozilla pretty successful.


It has enjoyed success during its lifetime, but the userbase has looked like an upside-down hockey stick for some years now. I would say that having your userbase drop from a dominant majority of the market to well below 10% is indicative of failure.


Firefox was never a dominant majority. It peaked at ~25-30% pre-Chrome (when it was the only major competitor to IE).


Firefox has gained users since October 2016.


Joel's article is about doing everything from scratch, while, if I read it correctly, this article talks about doing things in place, alongside normal use and development.



