Comments on Cisco, Mozilla, and H.264 (xiphmont.livejournal.com)
182 points by 0x006A on Oct 30, 2013 | 19 comments



Interestingly, no mention of VP9.

Although the current political battle is over VP8 vs H.264 (Baseline) for Mandatory To Implement status in WebRTC, that's only the fallback position to avoid interoperability failure. Any random codec supported by the endpoints could be used.

For example, since you're forced to buy patent rights for all of H.264 as a bundle, it seems that anyone who uses H.264 for WebRTC would use H.264 High Profile when both ends have it. (Currently Cisco is only offering Baseline, but is happy to accept code for the higher profiles.)

But just as easily, two copies of a future Google Chrome could use VP9, or two copies of even further future Firefox could use Daala, or IE13 could use H.265.

But VP9 gets no mention at all in Xiphmont's post. Has Mozilla officially decided to skip VP9? In WebRTC it would seem an obvious stepping stone towards the Daala future, even if it's less likely to be useful for standard HTML5 video.


There is a ticket for VP9[1] and no mention that it might be skipped. So my guess is that it will happen eventually.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=833023


>Interestingly, no mention of VP9.

He alludes to it (obliquely), I think, with the "competitive" here:

>Like Opus, Daala is a novel approach to codec design. It aims not to be competitive, but to win outright.


For those who don't know, Monty is the creator of the Vorbis audio codec. If there's somebody who can create a competitive free video codec, it's him.


I know who Monty is and respect him a lot. But the technology behind H.264 is a collaborative effort by many of the best in the business, and more importantly, a large part of the compression techniques in H.264 are covered by patents. Saying he can do this single-handedly, when entire teams at Google can't, is a bit of a stretch.

I don't know if it can be done, but it's going to take a lot of smart folks to pull it off.


He's not trying to do this by himself. From Brendan's post (https://brendaneich.com/2013/10/ciscos-h-264-good-news/):

"Mozilla has assembled an engineering dream team to develop Daala, including Jean-Marc Valin, co-inventor of Opus, the new standard for audio encoding; Theora project lead Tim Terriberry; and recently Xiph co-founders Jack Moffitt, author of Icecast; and Monty Montgomery, the author of Ogg Vorbis."


At least a couple others not employed by Mozilla are also involved. I think they really do have a superb team; look at how passionate these guys are. You can throw a bunch of engineers at a problem with a commercial interest (patents!), but this easily leads to the incremental quantity-over-quality development you have with MPEG. Or you can have a few skilled & passionate fellows with some really new ideas.


They also took some techniques from published papers in the field, and many of the fundamental techniques are old enough to be patent free, so it's not like they're starting entirely from scratch.


Brendan Eich wrote JavaScript in 10 days by himself, and look at JavaScript today! Don't discount a single person's work. Besides, a Picasso that, according to Wikipedia, was painted in an afternoon in 1932 (https://en.wikipedia.org/wiki/Le_R%C3%AAve_%28painting%29) sold in 2013 for $155 million. Monty may, if all goes well, completely revolutionize digital media, so don't assume he can't. Let us see what he comes up with before we rule it inadequate.


Creating a new programming language is MUCH easier than creating a novel video codec that beats H.264.


This is actually really interesting to me. I can imagine the complexities of building a programming language (it was an undergrad course at my school!), but how is building a new multimedia codec that much more difficult?

Does it have anything to do with most multimedia techniques being patent-encumbered?


I took an undergrad Compilers course at my school. We wrote a compiler from scratch in one semester. Our compiler parsed and compiled a useful subset of Pascal, but changing it to a novel programming language would have been trivial. It doesn't have to be a GOOD programming language, just a new one.

For a multimedia codec, though, there's no low-hanging fruit. H.264 is already VERY good, so beating it is hard.


This sounds like the wisdom of an MBA Manager!


It's not "don't ask, don't tell," it's "don't ask, don't care."

My agreement with Monty's assessment depends upon the definition of the "codec market". The people who write the checks to pay the royalties know the costs very well. The people who pay indirectly don't care, because the costs are low enough that they don't register any pain for them. Arguably, what has been reached is an optimal economic balance.

Incorporating footnote 1, it appears that he's talking more about the selective enforcement that MPEG LA engages in, which I could see being more of a don't ask, don't tell scenario. But I'm not sure the limited number of licenses is proof that companies don't know about their licensing issues. It's just proof that MPEG LA selectively enforces the licensing requirements, a fact that is pretty well known among those in the affected markets.

The arguments in support of an open web, with open audio & video codecs, are (under current circumstances) principle-based arguments. It is notoriously difficult to get people to take action over principled issues unless there is some practical impact that they can feel directly. Mozilla made an attempt at this by digging in their heels on the H.264 inclusion issue. Unfortunately, that stand didn't really affect end users, because website authors worked around the issue (e.g., Flash video players).

If open web proponents want to advance their agenda, they need to be more strategic. They need to identify and execute on plans that increase end-user exposure to MPEG LA licensing restrictions in a way that has a practical impact on them.

If that's not possible, then you're left with a principled disagreement only. If that's the case, then we have to fall back to the rationalization presented in Carl Sagan's "Dragon in My Garage" argument. If there is no practical difference between open and closed video codecs, then there is no difference at all. The principled disagreement is as incorporeal as Carl's dragon.


I'm not really disagreeing with your substantive points but wanted to clear up a misunderstanding.

The MPEG-LA does not enforce anything. Companies with patents in the pool can enforce them (they can also license them outside the pool).

Companies (that aren't Lodsys-like trolls) aren't likely to bring action where there isn't significant revenue to obtain, unless there is a strategic threat to them.

I would also like to pick up on your language. The MPEG codecs are open, although they are not free, Free, or Libre.


They're only "open" according to some definitions. Of the definitions of "open standard" listed on Wikipedia, they meet only 2.5 out of 18, and the first one comes from a group that develops the MPEG standards:

http://en.wikipedia.org/wiki/Open_standard

The main sticking point, which is also what holds up W3C and IETF acceptance, is not being "royalty-free".


So they need to do something like create a completely free codec that blows current offerings out of the water? :)

I read some of the Daala "demos". I don't know video codec technology from a hole in the ground, but it sounds pretty interesting. http://people.xiph.org/~xiphmont/demo/


sigh

I argued this [1] several years ago, when the FF+h264 drama was in full swing, Mozilla was making its principled stand, and way too many people seemed to be hanging their hopes on Google granting the world a miracle with VP8/WebM. So while it seemed that many people got what they were hoping for, the problem was never truly solved.

Instead, the refusal to put h.264 into Firefox (even if done through some sort of proxy/external plugin that simply exported the problem to mplayer/gstreamer/whatever) didn't affect video at all (MPEG-LA still holds all kinds of patents). The real damage, though, was keeping Flash around. When presented with the compatibility mess that is the <video> tag, Flash was (and still is, in most places) the obvious choice.

There's a time and a place for standing firm on your principles, but it's also important to pick your battles, and removing that bug-ridden binary blob from the browser would have been a huge win.

[ ok, rant/flame off. I'm not really angry, just bitter after dealing with this particular mess while watching an obvious solution wash away. Maybe building something in minecraft can counteract this particular Sisyphean mood... ]

[1] It was before I knew about HN, unfortunately. The second two links are follow-ups attempting to explain the futility of fighting what had already been decided...

  - http://slashdot.org/comments.pl?sid=1597850&cid=31643970
  - http://slashdot.org/comments.pl?sid=1597850&cid=31644218
  - http://slashdot.org/comments.pl?sid=1730576&cid=33009474


Even Google, champion of HTML5, still forces the Flash player on YouTube for videos that have ads, regardless of what your browser supports (or doesn't). So immediately caving on h.264 wouldn't have gotten rid of Flash; it would've left the Web depending on one more not-quite-freely-implementable blob.

If nothing else, it's probably due in part to Mozilla's stance against h.264 that we're even having conversations about alternative codecs today.



