Note that Chromium has been doing this for a while with impl-side painting.
What is exciting about this in the context of Servo, though, is that it completes the parallel pipeline:
* Script can be parallelized through PJS or Web Workers.
* All phases of layout are fully parallelized: selector matching is performed in parallel (one micro-task per DOM node), layout is performed in parallel, and display list construction is performed in parallel (one micro-task per render object).
* Painting is now performed in parallel on a tile-by-tile basis.
* Compositing is performed in parallel thanks to the GPU.
All of these tasks can also be performed in parallel with one another.
One risk for the Servo team is end-user latency. I'm wondering how it will play out: all of these asynchronous waits can add up to a worse end-user experience, and user interaction lag is incredibly frustrating. Every opportunity taken to perform tasks asynchronously exacts a cost in synchronization latency.
I hope the Servo team is acquainted with the video game world, and the curious optimizations that have been made in that realm to greatly improve performance in multi-threaded settings.

Our tasks can generally operate independently. For example, if the painting task is stalled, scrolling can happen in the compositor; if the layout task is stalled, painting can happen on the retained display list; if the script task is stalled, layout can still happen on the retained render object tree. This is only possible because our data structures are cleanly separated between all the tasks. I'm confident that this approach will result in good latency.
Servo is a prototype – closest to your "Pie-in-the-sky" option.
I'm not affiliated with Mozilla or Servo, but from the information on the Servo page and the number of Acid2 and other issues they're tracking, it's clear that there's still plenty of work to do. Add the fact that the Rust language itself is targeting the end of the year for version 1.0, and you'd have to guess that Servo, as a separate project, has at minimum six months (more likely a year) before the team examines whether it's successful enough to start integrating with Firefox.
Then they'd need to integrate and test.
Think about how long the new "Australis" UI was in development (more than two years). And that was just a user-interface change, with no change of programming language or other dev tools.
The renderer is the core of the program. And integrating Servo would involve integrating and testing a new language along with the new code. I doubt a large, capable team could perform that much integration and testing in under 12 months – even after Servo itself was considered "complete" (which it isn't).
My prediction: a release of Firefox with Servo code is 2 years away or more (assuming Servo is considered a "success" in 6-12 months).
My guess is less optimistic. The number of experienced Rust programmers is low. If integrating Firefox with Servo means having one small part of Firefox integrated with Servo, then maybe a month of work? But replacing the entire Firefox rendering engine with Servo is probably at least 5 years away. While 5 years seems like a long time, don't forget that time flies and blocker bugs come up.
It's been stated before that Servo may get a WebKit-compatible interface, meaning you should be able to build a Chromium with Servo pretty easily, and long before Servo could be integrated into Firefox.
The idea is for Servo to be dogfoodable for the team by the end of the year, but that only covers a tiny subset of the sites the team uses, maybe Etherpad (which is used for meetings) and /r/rust. The main focus with Servo so far has been to look at the real performance bottlenecks that other engines face during layout and parallelize them. The plethora of other features that need to be supported will come later, after those problems have been solved (they mostly have been). That said, replacing Gecko is a looong way off.
I always see the Servo team constantly benchmarking the things they parallelize to make sure the parallelization is worthwhile. So if anything, I'd expect end-user latency to decrease substantially.
I think the biggest improvements may come from experimenting with new asynchronous DOM APIs, which the browser can intelligently batch up for performance. Unfortunately, web developers will have to use these APIs for them to have any effect, and it seems to be a long way away right now for real world use. But it's part of what Servo will be helping to explore, as I understand it.
I would assume that, given sufficient practical performance evidence, you could devise heuristics to determine when to parallelize based on DOM complexity.