Not all. Here's one example of where live streaming beats progressive download:
Suppose you are a website that serves an extraordinary number of videos, most of which are viewed by only a few users (e.g. Facebook). A small percentage of those users want to watch your video on an iPhone, so you re-encode the regular file on the fly into an iPhone-compatible stream. In addition, this lets you adapt the bitrate on the fly to maximize quality without the signal dropping out.
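As a rough illustration of the "adapt the bandwidth on the fly" part, here is a minimal sketch of the usual heuristic: pick the highest-quality rendition whose bitrate still fits under the client's measured throughput, with some headroom. The rendition ladder and safety factor are made-up numbers, not any particular site's configuration.

```python
# Hypothetical bitrate-adaptation sketch: choose the best rendition
# that fits under the measured throughput, leaving headroom so
# playback doesn't stall. All numbers here are illustrative.

RENDITIONS_KBPS = [235, 375, 560, 750, 1050, 1750, 2350, 3000]
SAFETY_FACTOR = 0.8  # only spend 80% of measured bandwidth

def pick_rendition(measured_throughput_kbps):
    """Return the highest rendition bitrate within budget,
    falling back to the lowest rendition if nothing fits."""
    budget = measured_throughput_kbps * SAFETY_FACTOR
    viable = [r for r in RENDITIONS_KBPS if r <= budget]
    return viable[-1] if viable else RENDITIONS_KBPS[0]

print(pick_rendition(1000))  # budget 800 kbps -> 750
print(pick_rendition(100))   # nothing fits -> lowest, 235
```

A real player re-measures throughput continuously and switches renditions at segment boundaries, but the core decision is this simple comparison.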
One other small factor is cost. Lots of places go with streaming for cost reasons: to limit unnecessary downloads, to insert ads that pay for the content, or to initially restrict playback to their own site. A live event, especially one where you want availability to be as good as it can be, is the best case for it. Streaming is also more cost-efficient for larger movie files, such as on Netflix, because lots of people stop watching or only partially watch. There's no need to deliver everything at $0.10 a GB for content that never gets watched.
Either way it sucks but it lives on due to market demands.
Good points. Throttling works well with a good infrastructure; I haven't used seeking support but will look into it. There seems to be overhead with streaming and seeking, so maybe throttling is the way to go. Still, you'd need to dynamically adjust it to the user's connection and the data rate of the video; that adaptation is what's nice about streaming, which sits on top.
Are you kidding? Live streaming is MASSIVELY more expensive than plain HTTP/1.1 delivery because of the infrastructure needed to support the hard latency and QoS requirements.
We are talking about real-time, live events. Probably DRM or paid events that require delivery as close to real time as possible. Sure, you could delay and push chunks of video out to your CDNs, but then people would all be at different points in your "live" event. For live events, streaming is almost always required. The same goes for DRM-protected sites like Netflix.
Yes, HTTP hosting is massively cheaper than streaming, but the topic was real-time live events and controlling the content with DRM.
Up until a year ago, Theora was total dogshit, dramatically worse than H.263 (RealVideo, Flash 7, etc.) -- Xiph had made basically no effort at all to improve the codec itself, they just made a lot of fuss about integrating it with their custom container format, and froze the format in 2004 (Not that it really mattered -- there have never been other implementations).
A year ago, the 'Thusnelda' development fork of the encoder was started, and made dramatic quality improvements -- the bleeding-edge library now achieves about the same level of mediocrity as H.263 -- welcome to the early 90s!
I believe the "there have never been other implementations" claim to be incorrect. The Cortado Java applet is a second implementation of the Theora codec, for example.
In addition to any technical argument, Theora is basically a renamed On2 VP3 (very little was modified from the original specification); if one goes by the fact that VP3 was originally meant to compete with MPEG-2 back in the mid-1990s, it is "obsolete" in a temporal sense.
On2 has since superseded VP3 with VP6 and now VP7 (and soon, supposedly, VP8), so in that sense it is "obsolete" as well.
Well, just like Linux, Apache, MySQL and Mozilla, it's based on technology going back years (decades in some cases). You can easily confuse people by calling something a "70s-era" or "mid-90s" technology, because they'll infer that whatever you are pushing is current, even if it too is built on top of technology of the same era.
See my quote of his opinions on non-Apple browsers if you're still wondering whether this author is trollish or not: