Servo – Render in parallel (github.com/mozilla)
123 points by heydenberk on June 20, 2014 | hide | past | favorite | 27 comments



Note that Chromium has been doing this for a while with impl-side painting.

What is exciting about this in the context of Servo, though, is that it completes the parallel pipeline:

* Script can be parallelized through PJS or Web Workers.

* All phases of layout are fully parallelized: selector matching is performed in parallel (one micro-task per DOM node), layout is performed in parallel, and display list construction is performed in parallel (one micro-task per render object).

* Painting is now performed in parallel on a tile-by-tile basis.

* Compositing is performed in parallel thanks to the GPU.

All of these tasks can also be performed in parallel with one another.
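The "one micro-task per node" structure can be sketched with ordinary scoped threads. This is a minimal, hypothetical model (not Servo's actual code) in which "selector matching" just means comparing tag names; the point is only the shape of the parallelism, with each chunk of nodes matched on its own thread:

```rust
use std::thread;

// Hypothetical minimal model: a "DOM node" is just a tag name, and a
// "selector" matches when tag names are equal. Real selector matching
// is far more involved; this only shows the per-node task structure.
fn matches(node: &str, selector: &str) -> bool {
    node == selector
}

// Split the node list into chunks and match each chunk on its own
// thread, mirroring "one micro-task per DOM node" at a coarser grain.
fn parallel_match(nodes: &[&str], selector: &str) -> Vec<bool> {
    thread::scope(|s| {
        let handles: Vec<_> = nodes
            .chunks(2)
            .map(|chunk| {
                s.spawn(move || {
                    chunk.iter().map(|n| matches(n, selector)).collect::<Vec<_>>()
                })
            })
            .collect();
        handles
            .into_iter()
            .flat_map(|h| h.join().unwrap())
            .collect()
    })
}

fn main() {
    let nodes = ["div", "span", "div", "p"];
    let hits = parallel_match(&nodes, "div");
    assert_eq!(hits, vec![true, false, true, false]);
    println!("{:?}", hits);
}
```

Because each node's result is independent, the work divides cleanly; a real engine schedules these micro-tasks with a work-stealing pool rather than one thread per chunk.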


One risk for the Servo team is end-user latency. I wonder whether all these asynchronous waits could add up to a worse end-user experience. User interaction lag can be incredibly frustrating, and every opportunity taken to perform tasks asynchronously exacts a cost in synchronization latency.

I hope the Servo team is acquainted with the video game world, and the curious optimizations that have been made in that realm to greatly improve performance in multi-threaded settings.


Our tasks can generally operate independently. For example, if the painting task is stalled, scrolling can happen in the compositor; if the layout task is stalled, painting can happen on the retained display list; if the script task is stalled, layout can still happen on the retained render object tree. This is only possible because our data structures are cleanly separated between all the tasks. I'm confident that this approach will result in good latency.
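The independence described above can be sketched with a channel between two threads. The names here are illustrative, not Servo's actual types: a "painter" thread stalls before delivering a fresh tile, while the "compositor" keeps processing scroll events against its retained state without blocking:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Toy model: the compositor owns its own scroll offset, so scrolling
// never waits on the painter. Returns (final offset, tile received).
fn scroll_while_paint_stalled(events: u32) -> (i32, &'static str) {
    let (tx, rx) = mpsc::channel();

    // Stalled painter: takes a while to deliver a fresh tile.
    let painter = thread::spawn(move || {
        thread::sleep(Duration::from_millis(20));
        tx.send("fresh tile").unwrap();
    });

    // Compositor: handles every scroll event immediately, using the
    // retained (old) tile until the fresh one arrives.
    let mut scroll_offset = 0;
    for _ in 0..events {
        scroll_offset += 10; // never blocks on the painter
    }

    let tile = rx.recv().unwrap(); // fresh tile arrives later
    painter.join().unwrap();
    (scroll_offset, tile)
}

fn main() {
    let (offset, tile) = scroll_while_paint_stalled(3);
    assert_eq!(offset, 30);
    assert_eq!(tile, "fresh tile");
    println!("scrolled to {} with {}", offset, tile);
}
```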


How far out is servo from being the mainline Firefox renderer? A year? More? Pie-in-the-sky?


Servo is a prototype – closest to your "Pie-in-the-sky" option.

I'm not affiliated with Mozilla or Servo, but from the information on the Servo page and the number of Acid2 and other issues they're tracking, it's clear that there's still plenty of work to do. Throw in the fact that the Rust language itself is targeting the end of the year before it hits version 1.0, and you'd have to guess that Servo as a separate project is a minimum of 6 months (more likely a year) away from even examining whether it's successful enough to start integrating with Firefox.

Then they'd need to integrate and test.

Think about how long the new "Australis" UI was in development (more than 2 years), and that was just a user-interface change, with no change of programming language or other dev tools.

The renderer is the core of the program. And integrating Servo would involve integrating and testing a new language along with the new code. I doubt a large, capable team could perform that much integration and testing in under 12 months – even after Servo itself was considered "complete" (which it isn't).

My prediction: a release of Firefox with Servo code is 2 years away or more (assuming Servo is considered a "success" in 6-12 months).


My guess is not so optimistic. The number of experienced Rust programmers is low. If integrating Firefox with Servo means having one small part of Firefox integrated with Servo, then maybe a month of work? But replacing the entire Firefox render engine with Servo is probably at least 5 years away. While 5 years seems like a long time, don't forget that time flies and blocking bugs come up.


It's been stated before that Servo may get a WebKit-compatible interface, meaning you should be able to build a Chromium with Servo pretty easily, and far sooner than it would take to integrate Servo into Firefox.

I'm not sure if that's still the plan, however.


Does Webkit have a very large API/ABI surface? This seems like a nightmare to implement correctly and maintain.


I have no idea.


The idea is to start being dogfoodable for the team by the end of the year, but that only covers a tiny subset of sites that are used by the team, maybe Etherpad (which is used for meetings) and /r/rust. The main focus with Servo so far has been to look at the real performance bottlenecks that other engines face during layout and parallelize them. The plethora of other features that need to be supported will come later, after these problems have been solved (they mostly have been). That said, replacing Gecko is a looong way off.


I always see the Servo team constantly benchmarking the things they parallelize to make sure the parallelization is worthwhile. So if anything, I'd expect end-user latency to decrease substantially.

I think the biggest improvements may come from experimenting with new asynchronous DOM APIs, which the browser can intelligently batch up for performance. Unfortunately, web developers will have to use these APIs for them to have any effect, and it seems to be a long way away right now for real world use. But it's part of what Servo will be helping to explore, as I understand it.


I would assume that given sufficient practical performance evidence, you could devise heuristics to determine when to parallelize based on DOM complexity.


This is also useful as a demonstration of how the Servo developers use bots to augment the review of pull requests. See also "bors", the Rust compiler's continuous integration bot: http://buildbot.rust-lang.org/bors/bors.html

It would be very interesting to survey the largest projects being developed on Github to determine if they've also grown bespoke ways of dealing with the platform's pain points. I know that the Rust developers aren't entirely satisfied with the tools that Github provides (which seem to be optimized for small projects at the expense of large ones), but Github's network effects are just too good to ignore for a project that relies on volunteer contributions.


> which seem to be optimized for small projects at the expense of large ones

Definitely, you easily hit its limitations with big projects:

* the commits view is completely useless if development is highly branchy (although to be fair that's the case for pretty much all git history visualisation tools)

* the CI integration (statuses[0]) is very limited with only 4 states (pending, success, error and failure) and doesn't work OOTB with merged heads

* issues-filtering tools have odd cases when you're trying to e.g. filter by multiple tags

* and they're not available at all in the PR list

* the notifications system is insufficient and incomplete (IIRC you're not notified when assigned on an issue or PR…)

* the teams integration is lacking (essentially, teams are only very broad ACLs, you can't triage issues to teams so you have to use tags, except when you start doing complex queries involving multiple tags it all breaks down)

[0] https://developer.github.com/v3/repos/statuses/
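The four-state limitation mentioned above is visible in the `state` field of the Statuses API. Here's a small sketch that captures those documented states and builds a request body; the enum and field names mirror the linked docs, but the helper is illustrative and doesn't actually post anything:

```rust
// The Statuses API accepts exactly four values for "state":
// pending, success, error, failure. This sketch only builds the
// JSON body; posting it to the API is out of scope here.
#[derive(Debug, PartialEq)]
enum CommitState {
    Pending,
    Success,
    Error,
    Failure,
}

impl CommitState {
    fn as_str(&self) -> &'static str {
        match self {
            CommitState::Pending => "pending",
            CommitState::Success => "success",
            CommitState::Error => "error",
            CommitState::Failure => "failure",
        }
    }
}

// Build the JSON body for POST /repos/:owner/:repo/statuses/:sha.
// "context" distinguishes one CI bot's status from another's.
fn status_body(state: CommitState, description: &str, context: &str) -> String {
    format!(
        "{{\"state\":\"{}\",\"description\":\"{}\",\"context\":\"{}\"}}",
        state.as_str(),
        description,
        context
    )
}

fn main() {
    let body = status_body(CommitState::Pending, "CI started", "ci/bors");
    assert!(body.contains("\"state\":\"pending\""));
    println!("{}", body);
}
```

With only these four states and a short description string, there's not much room for the kind of detailed per-platform report a bot like bors wants to publish, which is presumably why it falls back to posting comments.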


I have seen many projects with little bots that do things like post to third-party review services. Many projects use CI tools that have GitHub integration (e.g. Travis); the Rust project is just rolling its own, so it needed its own GitHub integration. We should be lauding GitHub for its willingness to support integration, even if its native tools are a tad lacking.


AFAIK Bors doesn't use any GitHub integration; it behaves as a user (and I'm pretty sure its messages are too complex to go through the statuses API and result in anything other than unreadable garbage).


Bors does have some GitHub integration: it will automatically merge pull requests, and I believe it does update the status. But yes, the details of its interactions happen through posting normal comments.


Remember when some people said there was no point in buying a multicore system because no one used them effectively? I'm glad no one listened.


That was untrue even then. Even if no individual userspace program was multi-threaded or multiprocess, you still get a benefit from being able to run multiple programs at the same time (and not just time-shared on a single core).


I don't know about that; single-core computers weren't really used in a multitasking kind of way. You usually had one main program open at any given time. It's not like my computer now, with a Packer box running on the left, a code editor in the middle, and Hacker News + 10 tabs on the right.

It was Word.

Or it was 3D Pinball.

The issue was that Word and 3D Pinball didn't use multiple cores well, so some people didn't see anything to gain. They thought, "Well, it isn't going to make lag in Word disappear, or Pinball more fun." They completely missed that they could now run both at the same time.


And, reverted: https://github.com/mozilla/servo/pull/2685. Broken on mac. Maybe next week...


And this patch fixes on mac: https://github.com/mozilla/servo/pull/2687


Can Servo be used to automatically create screenshots of user-submitted (and untrusted) URLs?


> ./servo -co output.png ../src/test/html/about-mozilla.html


Thanks mods for the title change. I wasn't sure how to provide context to a pretty bare-titled page. Anyway, this is exciting stuff.


Can Servo be used to run tests, sort of like how Selenium does it?


No, Servo doesn't have any WebDriver support as of yet. I'm curious what your use case is, though; at this point Servo is quite a way off from being even dogfood quality. It's not something that you could use to test your website.



