A JavaScript H.264 decoder (github.com/mbebenita)
142 points by tnajdek on Oct 28, 2011 | hide | past | favorite | 41 comments



Apparently this can actually get decent frame rates, although I suspect they are using the patched version of the JS interpreter mentioned on GitHub.

http://yfrog.com/nmng0z

Still not sure how this is meant to actually be useful though. The problem with H.264 isn't availability of implementations, it's being non-free and heavily patented.

I can kind of see a use for this if you are a big content provider with a bunch of cash and want to distribute video to Firefox users without transcoding but it still seems pretty derpy.


> Although I suspect they are using the patched version of the JS interpreter mentioned on github.

We are using standard Firefox, no special patches. However, we used the Firefox nightly, not current stable. The decoder runs much faster in nightly, due to JS engine improvements that landed over the last few months, and are not yet in stable.

> Still not sure how this is meant to actually be useful though. The problem with H.264 isn't availability of implementations, it's being non-free and heavily patented.

First thing, this at least gives you another option. That is, if we get this codec to run as fast as a native one, then we now have the choice of either the browser or the video website providing the decoder (and properly licensing the decoder, if they are in a country that has pure software patents). More options are never a bad thing.

But I think the real potential in this approach is something entirely different. The opportunity is that you can download arbitrary decoders. So instead of the current world we live in, where you have a few decoders installed, you can have custom ones for different websites. Imagine a website that has cartoon videos or anime etc. - in principle, they could use a custom codec that is heavily optimized for that kind of content, as opposed to being forced to use stock decoders.

Also, it prevents being frozen in time: If you can download decoders from the web, you can improve them constantly while making sure your users have the proper decoder (since you ship it to them yourself), which you can't do if you rely on stock preinstalled decoders.


> if we get this codec to run as fast as a native one

Native ones have chunks written in hand-tuned assembly language, offload parts to specialized hardware, and other such tricks not available to ECMAScript. I'm not even sure why "as fast" is being considered a possibility.


I agree in general that native decoders can be faster - they can in principle do anything a JS decoder can, and in addition the things you mention. However,

1. JS can also use hardware acceleration through WebGL. We have not done this yet, but will.

2. JS has some proposed extensions, WebCL and Intel's RiverTrail, which let it utilize SIMD and other computing hardware. We will investigate using those too.

With those two things, we believe JS performance will be very good. How close it will be to native code, though, is hard to say at this point in time.
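One concrete piece of decode work that maps naturally onto WebGL is the final YUV-to-RGB color conversion of each frame. As a rough illustration (plain JS here, not actual Broadway code; the constants are the standard BT.601 ones), this is the per-pixel math a fragment shader could run for every pixel in parallel:

```javascript
// BT.601 YUV -> RGB conversion: the per-pixel step a GPU shader could
// perform on a decoded frame. Illustrative sketch, not decoder code.
function yuvToRgb(y, u, v) {
  const r = y + 1.402 * (v - 128);
  const g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128);
  const b = y + 1.772 * (u - 128);
  // Clamp to the displayable 0..255 range and round, as a shader would.
  const clamp = (x) => Math.max(0, Math.min(255, Math.round(x)));
  return [clamp(r), clamp(g), clamp(b)];
}

// Neutral chroma (u = v = 128) leaves luma unchanged: mid-gray in, mid-gray out.
console.log(yuvToRgb(128, 128, 128)); // [128, 128, 128]
```

Doing this on the CPU means touching every pixel of every frame in JS; moving just this stage to a shader removes a large, embarrassingly parallel chunk of the per-frame cost.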

However, there is one big advantage a JS decoder would have that native code does not: a JS decoder can be downloaded and run securely. As a consequence, you can continually improve your decoder in JS and your users will run the latest, most optimized version, while standard native decoders are typically upgraded much, much less frequently. Also worth noting is the potential to ship specialized decoders; as I mentioned in another comment, imagine an anime website that ships a video decoder heavily optimized for that specific type of content. That could be much more efficient than a stock native decoder.

Finally, it's worth noting that the decoder we compiled from C, Android's H.264 decoder, does not have any substantial amount of handwritten assembly. I had assumed, like you said, that real-world decoders would have such things, and I am curious why it doesn't. If anyone reading knows the answer, I'd be very interested in that.


Well, for instance, we can take advantage of hardware features like shaders in WebGL or make use of WebCL, RiverTrail, etc., which are all available to JavaScript. Perhaps we may not reach the performance of native codecs, but we can get close enough.


Brendan's demo was running at 30fps, I believe on a Macbook Pro (I'll find out). Some content will run slower right now, which is true of all codecs AFAIK but is more so for this stuff right now.

The patches linked from the github README aren't necessary to run the transpiled version that was in the demo -- it's a memory optimization that's being used in the tuned-for-JS version.

As regards derpiness: it lets content distributors decide to pay for H.264, if they want, exactly, and paves the way for other codecs to be deployed by such distributors as well. It also runs the codec in a managed environment -- format decoders are often very fertile territory for exploitable bugs, since they are pretty much by definition all about pointer math and byte-poking. But the initial intent, when they decided a week ago to try it, was to push the envelope of JS performance such that we find new ways to extend said envelope.
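The "managed environment" point is easy to demonstrate. A decoder compiled to JS does all its "pointer math" as integer indices into a typed array, and a stray index cannot corrupt memory outside that array, which is exactly the class of bug that makes native format decoders such fertile exploit territory. A tiny sketch (not Broadway code):

```javascript
// In a decoder compiled to JS, the C heap becomes one typed array and
// C pointers become integer offsets into it.
const heap = new Uint8Array(16);

// A buggy "pointer" computed past the end of the heap:
const badOffset = 1000;

// In C this write could stomp unrelated memory and open an exploit;
// in JS, an out-of-bounds store on a typed array is simply a no-op.
heap[badOffset] = 0xff;

console.log(heap.length);     // 16 -- the heap is untouched
console.log(heap[badOffset]); // undefined -- nothing was written
```

The same buffer-overflow bug that yields remote code execution in a native decoder degrades to, at worst, a garbled frame in the JS version.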


It was on a MacBook Pro. We tried it on a MacBook Air too and it performed reasonably well.


My 2007 MBP manages a decent 20 fps but starts pretty slow and averages 15.

That's a 2.4 GHz C2D. Need to try it on my i5-2400.


You should give the latest version a try, we improved the performance considerably.


Interestingly, my i5-2400 only manages 25 fps (average over 1 minute). Hmmm. Ahhh, it's only single-threaded.

Is that intentional, or a problem with Nightly?


It's a proof of the power of JavaScript. In another 10 years it'll be even faster due to faster compilers and faster processors.

Perhaps by then it'll be completely routine for video to be encoded and decoded using Javascript on a mobile device.


Useful? Not everything needs utility. It can just be cool. This is definitely the latter, although more as a Javascript exercise than a video one.


Anyone know what the legal implications of using a free decoder for a patented compression format are? Products like Flash make it seem free, as Adobe pays a flat rate to the patent holders for decoding MP3, H.264, etc., alleviating their users from having to worry about royalties. But would using this decoder, though free and open source, be infringing a patent?

note: I realize it's not production code. I'm asking more about the concept.


Use and distribution of implementations requires a license (though, curiously, x264 seems to get a pass), so using the decoder will require a license as with using a closed-source one.


> (though, curiously, x264 seems to get a pass)

If x264 does, so does VLC, mplayer, ffmpeg, gstreamer, and dozens of other applications that use video and audio decoders in Linux. Fortunately quite a lot of the world is not the United States, and today VLC is the world's second most popular media player and has never paid one cent in patent licensing fees.

But of course, yes, being open source does not magically exempt you from patent laws in countries with insane, broken patent laws. In practice, if you want to make a large-scale commercial application that will be distributed in the US that uses x264, you will probably need to pay for an MPEG-LA license. They're quite cheap, though: 0 cents per unit up to 100k units, 20 cents per unit after that until 5 million, and 10 cents per unit after that.

Largely, the question of whether a distro contains any particular piece of software is whether the people who run the repositories are willing to host it. This applies not only to possibly-patented software but also to libraries like DeCSS (necessary to play DVDs) that violate the laws in some countries but not others.


You're right, of course. While I believe that the MPEG-LA's enforcement practices are not as nondiscriminatory as claimed, that's a different discussion, and it wasn't appropriate to single x264 out that way.

I apologize.


Hopefully MPEG LA is seeing this and will sort out licensing for websites using this.


Only if it's in a commercial situation.


No, it's noncommercial (really not user-monetized) streaming of video that's free. Shipping H.264 hardware requires paying licenses regardless of whether it's a commercial situation or not. For software, if the patents are enforceable in your jurisdiction, it's the same thing.


At least until 2015, you won't need a license unless you're distributing commercial content to other end users or building an H.264 encoder. And the MPEG-LA has agreed to never charge anyone for watching free videos, so YouTube viewers will never get shaken down. http://www.engadget.com/2010/05/04/know-your-rights-h-264-pa... Maybe after 2015, they will start collecting royalties on commercial decoders.



Contrary to what certain parties like Mozilla may claim, by the MPEG licenses there are none. We have had this topic up here a few hundred times, mostly related to the recent H.264/WebM debacle and all the FUD and disinformation spread about usage of H.264. There is a reason why the x264 team, the Xvid team, the LAME team, people offering H.264 video and AAC audio for free, etc. have not been subjected to lawsuits from MPEG: MPEG has always said, not just for H.264 but also for MP3, MPEG-4 ASP, etc., that there are no costs of any kind involved for any party, distributors and end users alike, dealing with MPEG data or MPEG technology in any way, as long as it is all done in a free-of-charge scenario.


No, but this is a common misconception. Though the summaries talk about "sold to users", the license itself defines "sale" in such a way as to include zero-cost distribution:

"Sale (Sell) (Sold) (Seller) – shall mean any sale, rental, lease, license, copying, transfer, reproduction, Transmission, or other form of distribution of an AVC Product or the Transmission by any means of AVC Video either directly or through a chain of distribution."

This is consistent with the discussions I've been party to with the MPEG-LA directly.

Edited to add: I would very much appreciate a reference to MPEG-LA saying the things you indicate in your comment. It would be very interesting to be able to point to those statements, in a number of ways.


Running on a MacBook Pro with a dual-core 2.4 GHz i5, it uses one core at 100% in Firefox and manages 1.7 fps.

It's cool, but I don't see any real world applications. Anyone got any ideas?


You need Firefox nightly for it to run well, because it depends on the hybrid static/dynamic type inference work and some other optimizations. I believe that Brendan was demoing it from a Mac laptop of some kind.

As far as applications, it will let content providers who want to ship H.264 and pay the license fees do so without requiring all browser developers to pay such fees. (For Firefox it would be a meaningful portion of Mozilla's engineering budget.)

And it's a pretty compelling demo, IMO, of the fact that we're not done with JS performance yet, and that people don't need to be running to Dart or NaCl or other rip-and-replace technologies in order to get great perf. There's a lot of opportunity for even better performance on the engine side as well as the library side: they started a week ago with emscripten, Closure and the libstagefright C sources...


1. This works much faster on Firefox nightly (that's what we demo'd it on). Nightly has a lot of JS engine improvements that are not in stable yet (but will be in a month or two).

2. This does not use any hardware acceleration yet. This is simply compiling the Android C decoder into JS, nothing else - just a few days of work. We will now start to look at actually optimizing the code specifically for JS, and also to use GPU shaders for the relevant part of the code. Both of those can potentially make this even faster.
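For readers wondering what "compiling the Android C decoder into JS" looks like in practice: tools like Emscripten translate the C into JS that operates on one big typed array standing in for the C heap, with pointers as plain integers. A heavily simplified, hypothetical sketch of the shape of such output (the function and names here are invented for illustration, not taken from the real compiled decoder):

```javascript
// Hypothetical Emscripten-style output: C memory becomes a typed array,
// and a C function like
//   void add_bias(uint8_t *buf, int len, int bias);
// becomes a JS function taking an integer "pointer" into that heap.
const HEAP8 = new Uint8Array(1024);

function add_bias(ptr, len, bias) {
  for (let i = 0; i < len; i++) {
    // The & 0xff mimics C's uint8_t wraparound semantics.
    HEAP8[ptr + i] = (HEAP8[ptr + i] + bias) & 0xff;
  }
}

// "Allocate" a buffer at offset 64 and run the compiled routine on it.
const buf = 64;
HEAP8[buf] = 250;
add_bias(buf, 4, 10);
console.log(HEAP8[buf]); // 4 (250 + 10 wraps modulo 256, like a uint8_t)
```

Because the translation is this mechanical, porting a whole decoder is mostly a matter of compiler work rather than rewriting, which is how a few days sufficed for the initial port.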

Regarding real-world applications, the interesting possibility is for websites to ship their own codecs. As I mentioned in a comment above, imagine an anime video website that ships a video decoder optimized for that kind of content - it could be much more efficient than stock H.264. Also, if websites can ship their own decoders, they can continually improve them (unlike now where websites rely on client decoders which are not constantly being improved).


With today's Nightly I'm getting ~20fps on a 3.4 GHz Core i7 iMac but with drops into the 7fps range on some video transitions. Surprisingly, I get roughly the same perf with a current Aurora as well but the drops only go down to ~12fps. Trying it on the released Firefox gets around 2fps.


I think it's more of an (impressive) mental exercise - if the code can be made straightforward enough to run on javascript, the future applications can only grow from there.


It's the Android decoder written in C++ cross-compiled to JavaScript, not much of a mental exercise. Not to say it's not cool :)


They have a hand-coded javascript version in progress too:

https://github.com/mbebenita/Broadway/tree/master/Play


Strange.. It was running at 4 fps on my Pentium 4 lol


hmmm....

What browser, OS and what Pentium 4 processor? And for how long did you run the video?

Mine drops to 1.7 after a few seconds, the first few frames are dark so in the beginning I get 6 fps.


Nightly, XP, 2.4 GHz. Nearly 5 minutes. Here it starts at like 0.6 fps, and then runs at 4-7 fps onwards.


Must be something with Nightly then, I'm just running the regular version.

Says a lot about the importance of software over hardware :)


It definitely needs nightly. I'm not sure what's up with salmanapk's setup, will need to dig into it.


Here's the BadassJS perspective on all of this very impressive stuff! http://badassjs.com/post/12035631618/broadway-an-h-264-decod...


Wow. 45 fps in a Firefox nightly, 20 fps in Chromium. Pretty impressive for JS.


Give it another try. We made a few small improvements that turned out to be quite significant.


Funny hobby project.


... said the dinosaur to the first mammal, re: size, fur, and all that hobby fluff.


You just can't compare a JS library for H.264 decoding with one made in C. The latter is usable, while the former is practically useless even for an extreme case such as online, browser-based video editing. What is the point you are trying to make?



