The Gaia repo contains the JS application frameworks and system apps. The Gonk repo is the Android-based kernel and userspace. I believe the name Gonk refers to the small rectangular robots from Star Wars.
Firefox OS devices will launch first in the following countries: Brazil, Colombia, Hungary, Mexico, Montenegro, Poland, Serbia, Spain and Venezuela.
Wow, this is the first global company I've seen open up to markets outside Western Europe + US/Canada first. I wonder what the motivation for that is.
Probably the target consumers. From what I saw, they're targeting the lower end of the smartphone market with cheaper devices, and people in developing countries like most of those above are more likely to buy something like that than people in the US/Canada or Western Europe, where more people can afford higher-end devices.
As much as I want them to succeed, I don't think they have a chance in the western world if they don't polish their OS a bit first. I think this is exactly what they are trying to achieve by releasing the OS this way.
I think success requires much more than just some polish.
Based on the information available so far, it sounds like a very unappealing platform for developers. As a developer, you're essentially stuck using HTML, CSS, and JavaScript. I think this will drive away the best developers, who much prefer to use robust, mature languages like Java or Objective-C, rather than JavaScript. Likewise, the native frameworks of Android and iOS, for example, are much more pleasant and powerful to use than HTML and CSS.
It's not a recipe for success when good developers are driven away by a lousy technology stack. These are the developers that create the apps that really make a difference. Sure, a mediocre web developer may be able to create yet another to-do app for Firefox OS using HTML, CSS and JavaScript, but the value in what they produce is extremely minimal.
Without good apps, users will want nothing to do with it. Even then, it looks doubtful that this will offer any real benefit over the numerous existing mobile platforms. With it so unclear why developers would want to adopt it, and with it so unclear why users would want to adopt it, it just looks so much like a dead end.
"when good developers are driven away by a lousy technology stack".
This fear of JavaScript and a web stack says a lot more about you than the technologies. There are a huge number of good developers creating great software with these languages. Coming from a Java and C# background, learning JavaScript has been a real joy and made me a much better developer.
In my experience, the best developers become productive in new languages very quickly.
Oh, I don't "fear" JavaScript. It's quite the opposite situation. I have many years of experience with it, from using it with Netscape Enterprise Server in the 1990s, to browser-based development, to more recent server-side development.
Luckily, I also have many years of experience with many other programming languages and platforms. I can see quite easily how poorly JavaScript stacks up against them.
As an industry, we can do better. We have done better. It's not encouraging to see Firefox OS limit itself to the worst options available, when so much better could be done.
Given Mozilla's goals for Firefox OS, can you explain how these web technologies are anything but the best option? If you think that Java, C# or Objective-C would have been better choices, you've misunderstood what they're trying to achieve [hint: it's about creating a mobile OS by extending the world's most widely used technology stack].
By inclination I like languages that have seriously clever type systems, but I hardly ever write code in anything that is not javascript these days, and the reason for that is that javascript can be run easily almost anywhere.
I know that if I write some useful code in javascript, I will not have to rewrite it. You can't really say the same thing about any other language.
With or without FirefoxOS, people are developing web apps. If somebody has a nice web app already, maybe they will look into whatever little bit of adaptation is needed for FirefoxOS.
> Based on the information available so far, it sounds like a very unappealing platform for developers. As a developer, you're essentially stuck using HTML, CSS, and JavaScript.
Saying you are stuck with JS is like saying iOS developers are stuck with ARM. Yes, underlying it all is ARM or JS on the respective platforms, but you can compile many other languages into those two: C#, C++, etc.
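To make that concrete, here's roughly what calling into Emscripten-compiled C looks like from ordinary JS. This is a sketch: the fib function is a made-up example, but Module.cwrap is the real Emscripten runtime helper for wrapping an exported C function.

    // Sketch: calling an Emscripten-compiled C function from plain JS.
    // 'fib' is hypothetical; Module.cwrap wraps an exported C symbol
    // with argument/return type conversions.
    var fib = Module.cwrap('fib', 'number', ['number']);
    console.log(fib(30)); // the compiled C runs inside the JS VM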
Mozilla estimates a best case of 2x slower than native with asm.js, and that's with a strict JavaScript subset that defines a machine-translatable bytecode, coupled with a proprietary runtime to do the translation. They estimate 10x slower without that proprietary translation.
That's a ridiculous waste of resources, especially on battery-powered devices.
> Mozilla estimates a best case of 2x slower than native with asm.js
No, it's ~2x slower with the early prototype. It will get significantly faster than that.
> a proprietary runtime
The asm.js optimizations being worked on in Firefox do not match any definition of "proprietary" I ever heard. All the work is done openly and is open source, and that includes the spec, the Firefox optimizations and the emscripten support for asm.js.
> That's a ridiculous waste of resources, especially on battery-powered devices.
~2x slower than native is not optimal, sure. But it is in the same range of things like Java and C#, popular languages for mobile development (Android, Xamarin/Mono, etc.). And again, 2x is the starting point.
> No, it's ~2x slower with the early prototype. It will get significantly faster than that.
And we just need hardware XML accelerators and then XML will be great. This is not a new refrain from people who promote a subpar technology stack. Quality is just around the corner! We swear!
I'll believe it when I see it, and I'll be happy, because it'll mean that we're finally that much closer to the next step of dropping JavaScript as a first-order target entirely.
> The asm.js optimizations being worked on in Firefox do not match any definition of "proprietary" I ever heard. All the work is done openly and is open source, and that includes the spec, the Firefox optimizations and the emscripten support for asm.js.
It's not JavaScript. It's a custom VM built around a (bizarre) bytecode. The bytecode is an inefficient ASCII representation which happens to be interpretable as JS, but can't be interpreted as JS to actually be implemented efficiently -- which is to say, it's not really useful JS at all.
You may as well write a shell-script based ARM interpreter and call your ARM assembly "shell script". True, in a fashion, but also a totally pointless distinction. Everybody else will just call it "slow".
> ~2x slower than native is not optimal, sure. But it is in the same range of things like Java and C#, popular languages for mobile development (Android, Xamarin/Mono, etc.). And again, 2x is the starting point.
Android has the NDK for a reason, and even they are starting from a better position than trying to eke out runtime+rendering performance from an in-browser bytecode. Maybe on the web we'll get NaCl to serve the same purpose.
It'll be great if it all works out and we wind up with a cross-platform efficient and useful development target, with robust view management and event systems and common widget toolkits. Of course, at that point the web will look pretty much like the application development environments that we've been targeting for decades: operating systems.
> It's not JavaScript. It's a custom VM built around a (bizarre) bytecode.
It is most definitely JavaScript. It runs perfectly in other browsers - in fact it often runs faster than non-asm.js compiled code, so it is worthwhile even without new optimizations for it.
It is also, in addition, optimizable like a low-level VM.
> Maybe on the web we'll get NaCl to serve the same purpose.
asm.js, by design, can have pretty much the same level of performance as NaCl. Both can use the same sandboxing mechanisms, for example. Unless there is something specific in the asm.js spec you feel is preventing some additional level of optimization that is possible in NaCl - if so, what?
> It is most definitely JavaScript. It runs perfectly in other browsers - in fact it often runs faster than non-asm.js compiled code, so it is worthwhile even without new optimizations for it.
It can only run at speed if you interpret it as something that is not JavaScript.
In other words, the fact that it's valid JavaScript is essentially pointless, because it's totally useless (relative to actual native applications) unless you interpret it as something that's not JavaScript.
> asm.js, by design, can have pretty much the same level of performance as NaCl. Both can use the same sandboxing mechanisms, for example. Unless there is something specific in the asm.js spec you feel is preventing some additional level of optimization that is possible in NaCl - if so, what?
Stupid bytecode format aside (which incurs a cost for every single VM implementor, tool implementor, and developer that has to use the language, forever) ...
Here's a simple example: You can use NEON directly from NaCl/ARM. This matters on mobile. A lot.
Is asm.js going to define NEON and SSE intrinsics, too? If so, why the hell isn't asm.js just another output target for (P)NaCl so that people with Internet Explorer can run really really really really slow applications?
Or, how about the fact that NaCl can output ARM and x86 binaries directly, such that one doesn't need to AOT compile on every user's system, or introduce the cost, complexity, and overhead of JIT, when targeting standard/popular architectures.
How about this one -- I was going to write something up asking about thread-local storage and %gs-relative loads, or architecture-specific atomic operations on shared state, but then I realized -- asm.js doesn't define a threading model. At all. Unless I'm mistaken, it can't, because JavaScript itself doesn't define a shared state threading model.
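The closest thing JS has is Workers, which are share-nothing message passing, not threads. A sketch (the worker script name is made up):

    // main.js -- JS "concurrency" today: isolated Workers plus messages.
    // 'filter-worker.js' is a hypothetical worker script.
    var worker = new Worker('filter-worker.js');
    worker.onmessage = function (e) {
        console.log('result back from worker:', e.data);
    };
    var pixels = new Uint8Array(1024 * 1024);
    // The buffer is transferred (ownership moves), never shared:
    worker.postMessage(pixels.buffer, [pixels.buffer]);

No shared state, no atomics, no %gs-relative anything.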
Slow is not always "totally useless". It just degrades gracefully. Furthermore, the fact that it's based on JS makes the integration with the DOM much easier.
The bytecode format is not as big of a deal as you claim. 1,000 lines of code for the verifier.
There's nothing intrinsic to JS that forbids having SIMD support. In fact, it's under active discussion...
Having two back ends for PNaCl seems suboptimal. Why force all Web developers to generate two bytecode formats because we don't like the way asm.js looks?
AOT compilation isn't a panacea. You still have to verify it. So it's not like the NDK could just be adapted to the Web as is. Besides, caching can mitigate the startup costs a lot.
Finally, JS has threading, and it could grow shared state threading in the future. This doesn't strike me as that big of an obstacle.
> Slow is not always "totally useless". It just degrades gracefully.
Slow isn't graceful, especially when your competitors aren't slow. I look forward to Mozilla advertising Firefox as "only 10x as slow as Chrome!".
In this case, it's even worse, because if asm.js is successful, then the fallback mechanism will simply become an awkward vestigial limb that's no longer required by any modern browser. Treating a vestigial limb as your first-order target is silly.
> Having two back ends for PNaCl seems suboptimal. Why force all Web developers to generate two bytecode formats because we don't like the way asm.js looks?
Because it's about users, not about developers. Generating optimized binaries means that users have a better user experience.
> Finally, JS has threading, and it could grow shared state threading in the future. This doesn't strike me as that big of an obstacle.
"I'll pay you tomorrow for a hamburger today" is getting old.
Slower is far better than not running at all. Backwards compatibility is how Web technologies (and, for that matter, most technologies--x86 for example) survive.
And asm.js' encoding has no impact on the user experience. That's precisely my point. Using a binary bytecode instead of interoperable JS would maybe make the parsing easier, but that doesn't make up for the costs this would burden developers with (not to mention the difficulties with two VMs sharing data and so forth, something which is terribly important but is always glossed over in these conversations).
Finally, regarding threads, we're talking about a new proposed standard here. There will always be work to do, since just taking the NDK and putting it on the Web is not an option. PNaCl isn't shipping yet. So the question is whether asm.js presents a shorter, more straightforward path to success than the alternatives. I believe it does.
Why? For the purposes for which asm.js is intended, the code may as well not be running at all, and users will be driven to upgrade their browser.
If you really want to support backwards compatibility, then make it a secondary target, not the primary one that you saddle yourself with for all time.
> So the question is whether asm.js presents a shorter, more straightforward path to success than the alternatives. I believe it does.
I believe NaCl and PNaCl provide a much saner path to an ultimately superior end, and would have a far greater chance of succeeding if Mozilla wasn't playing NIH to assuage their need to support JavaScript.
asm.js is competing with native, and they're starting out with an intentional disadvantage.
> For the purposes for which asm.js is intended, the code may as well not be running at all, and users will be driven to upgrade their browser.
That isn't true. Consider photo filters: you may want to have a photo filter written in asm.js for maximum performance on browsers that support it, but degrades to a suboptimal, but usable, experience on browsers that don't support it. Likewise, consider an asm.js-compiled mobile game: you may want it to be playable on desktops too, whose browsers generally contain enough horsepower to run JS-compiled mobile games at full speed even without special support for asm.js.
There are plenty of use cases in which "slower but backwards compatible" is extremely desirable. AAA games are not everything.
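To make the photo-filter case concrete, here's the kind of degradation I mean, as a sketch: applyFilterRow/applyFilter and the 100ms budget are made up; performance.now is the standard high-resolution timer.

    // Sketch: the same compiled JS runs everywhere; scale the work to
    // the speed you measure. imageData is assumed to be in scope.
    var t0 = performance.now();
    applyFilterRow(imageData, 0);                  // time one row as a probe
    var msPerRow = performance.now() - t0;
    var fast = msPerRow * imageData.height < 100;  // ~100ms budget (made up)
    applyFilter(imageData, fast ? 1.0 : 0.5);      // else degrade to half quality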
> If you really want to support backwards compatibility, then make it a secondary target, not the primary one that you saddle yourself with for all time.
Again, it's not worth forcing Web developers to compile multiple binaries, when the main difference here is how you parse the bytecode.
> asm.js is competing with native, and they're starting out with an intentional disadvantage.
PNaCl is starting out with the same disadvantage as asm.js, modulo surface syntax. In fact, asm.js arguably has an advantage over PNaCl. Starting from the JS AST that must be constructed by every browser anyway, the asm.js verifier is a mere 1,216 lines. For comparison, LLVM's BitcodeReader is over 3,000 lines. LLVM's Verifier is over 2,000.
> That isn't true. Consider photo filters: you may want to have a photo filter written in asm.js for maximum performance on browsers that support it, but degrades to a suboptimal, but usable, experience on browsers that don't support it. Likewise, consider an asm.js-compiled mobile game: you may want it to be playable on desktops too, whose browsers generally contain enough horsepower to run JS-compiled mobile games at full speed even without special support for asm.js.
You're competing with native, not with today's crappy JS webapps. "Just like native but slow!" is not a selling point to users that have a plethora of better performing, better integrated native applications to choose from.
You guys at Mozilla could just target NaCl/PNaCl, contribute asm.js as the second-tier means of pulling IE and Safari along with you, the whole industry moves past the disaster that is JS/DOM/CSS, and everyone is happier.
> Again, it's not worth forcing Web developers to compile multiple binaries, when the main difference here is how you parse the bytecode.
Again, it's ALL about users, not developers. Of course, it can be easy for developers too ...
... but at the end of the day, you have to stop thinking about the developers as your primary users if you want to produce a successful platform.
Developers matter -- but users are why we're all here. It's not as if Apple's users have been hurting because developers have to target multiple variants of ARM and learn a new programming language.
> PNaCl is starting out with the same disadvantage as asm.js, modulo surface syntax
That's not just surface syntax. As a tool maker, I spend an awful lot of time working on and with things that touch assembly/bytecode as "surface syntax". It matters, a lot, and JavaScript is a TERRIBLE bytecode syntax to have to deal with.
That said, it's also not the same disadvantage, because PNaCl implementations compile to native code; there's no fallback interpretation mechanism that exhibits the unusable performance profile of asm.js's fallback.
Still, I think asm.js would be a great second-tier target for PNaCl when one wishes to target a legacy browser, regardless of how slow it is.
> You guys at Mozilla could just target NaCl/PNaCl, contribute asm.js as the second-tier means of pulling IE and Safari along with you, the whole industry moves past the disaster that is JS/DOM/CSS, and everyone is happier.
Moving past JS/DOM/CSS is just not possible. Backwards compatibility is how Web technologies survive. There have been many attempts to try to redo the Web from the ground up: XHTML 2.0 for example. They did not succeed.
> Developers matter -- but users are why we're all here. It's not as if Apple's users have been hurting because developers have to target multiple variants of ARM and learn a new programming language.
And users do not care about the surface syntax of the bytecode. If there were some user-facing advantage to not using JS as the transport format, then sure, it might be worth not using it. But so far I've simply heard "I don't like JavaScript syntax". It's fine that you have that opinion, but it's not worth sacrificing backwards compatibility for it, since it doesn't matter to users.
Regarding shipping native ARM and x86 code alongside the fallback mechanism, I've already explained why that won't work: developers won't test against the portable version, so it might as well not exist. The Web will be locked into those hardware architectures for all time. That might be a cheap way to compete with native apps in the short term, but in the long term it is bad for hardware innovation. (For example, consider that, as azakai noted, ARM might not have taken off at all if we had done that early on.)
> That's not just surface syntax. As a tool maker, I spend an awful lot of time working on and with things that touch assembly/bytecode as "surface syntax". It matters, a lot, and JavaScript is a TERRIBLE bytecode syntax to have to deal with.
It only matters to a small subset of developers. Besides, if you really don't like it, just write a converter that converts it to the mnemonics of your choice. It's really quite trivial.
By your logic, we shouldn't gzip executables, because tools have a hard time reading gzipped assembly. But that's a silly argument: you just un-gzip them first. It's the same with asm.js. If your tool has a hard time reading asm.js, convert it to a "real" bytecode first.
Additionally, LLVM bitcode isn't really much better from a simplicity standpoint. As I already pointed out, the verifier and bitcode reader for LLVM is much larger than the asm.js verifier.
> That said, it's also not the same disadvantage, because PNaCl implementations compile to native code; there's no fallback interpretation mechanism that exhibits the unusable performance profile of asm.js's fallback.
Yeah. The fallback mechanism is that it doesn't run at all. As I already explained, there are applications for which it is much better to actually run, despite reduced performance.
And "unusable" really is a stretch. There are many apps written in Emscripten that run just fine in current browsers. Like I already said, AAA games aren't everything.
Sigh. You web guys are completely blinded to the rest of the engineering universe. You actively oppose all attempts to do anything genuinely new, and then say "look! It never works!"
Meanwhile, the rest of us non-web people grow increasingly tired of even trying to contribute or explain anything, since every novel idea gets shut down. The end result is that you've created a perfect echo chamber of self-fulfilling prophecy -- the web technology stack remains stuck and broken, and you actually seem to like it that way.
I don't work on AAA games. I work on development tools and end-user applications, where having good tooling directly translates to better user experiences.
The fact that you think no-compromises performance is only the purview of AAA games is exactly why you have no business being an OS vendor. I hope -- for the sake of our industry -- that Apple, Google, and Ubuntu eat Firefox OS' lunch.
> Sigh. You web guys are completely blinded to the rest of the engineering universe. You actively oppose all attempts to do anything genuinely new, and then say "look! It never works!"
I oppose technologies (such as PNaCl) that I feel are worse than alternatives (asm.js). The reason they are worse is that they are not backwards compatible.
> The fact that you think no-compromises performance is only the purview of AAA games is exactly why you have no business being an OS vendor. I hope -- for the sake of our industry -- that Apple, Google, and Ubuntu eat Firefox OS' lunch.
I never said that no-compromises performance is only the purview of AAA games. I said that for most applications that are not AAA games, running more slowly is better than not running at all.
The problem with opposing improved technologies is, simply put, that JS/DOM/CSS/HTTP are broken crufty accidents of design, and enforcing compatibility with them forever is what has prevented the web from moving forward, has stifled innovation, and ultimately is why the web could lose the app war.
Imagine if we'd used a privileged market position to religiously defend against the introduction of everything that wasn't backwards compatible with Gopher? That's what you're doing to our industry today, and in that, you're almost as bad for our industry as Microsoft was bad for the web in the 90s.
> It can only run at speed if you interpret it as something that is not JavaScript.
By that argument any JS JIT is "no longer JS".
All JS JITs find cases where JS can be optimized as something simpler. For example, Crankshaft and TraceMonkey find areas where variables are simply-typed and heavily optimize those.
This isn't surprising - to make JS be fast, you do need to find where you can make it go faster, by avoiding the "normal" JS dynamism where anything is possible. So the JIT optimizes it as something that is "not JS". Again, nothing new with asm.js there, JS JITs have been doing this since 2008 (and JITs in other languages far earlier).
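Concretely, this is the sort of type stability those engine docs ask for. A sketch:

    // Sketch: type-stable code a JIT can specialize vs. code it can't.
    function sumFast(arr) {
        var total = 0;
        for (var i = 0; i < arr.length; i++)
            total += arr[i];             // total stays a number: fast path
        return total;
    }
    function sumSlow(arr) {
        var total = 0;
        for (var i = 0; i < arr.length; i++)
            total = total + arr[i] + ''; // flips to string: generic path
        return total;
    }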
> You can use NEON directly from NaCl/ARM [..] NaCl can output ARM and x86 binaries directly
Those are not portable, which JS must be. A better comparison might be PNaCl, which is like NaCl but has an intermediary portable format. PNaCl will of course have the same issues asm.js does with not having direct binaries that can just be loaded and run, not allowing use of CPU-specific code, etc.
If you don't care for portability, then the web/JS/asm.js/WebGL/etc etc. are likely not the best thing for you. Instead, a native app could make more sense.
JS JIT is an internal implementation detail. If a JIT exposed its internal bytecode (assuming it has one), you certainly wouldn't call it JavaScript.
asm.js's strict requirements expose that implementation detail. To make asm.js useful, you can't treat it as JS, in which case the fact that it's JavaScript is a pointless burden on the entire target community of developers and users.
> Those are not portable, which JS must be.
Users don't care about whether a particular application uses non-portable optimization strategies, they care about battery life and application performance. If that means I have to use NEON on ARM, SSE on x86, and provide a portable fallback implementation, then so be it -- as a developer, that's my job.
As a platform/OS vendor, your job is to make it possible for me to provide the best available user experience on the market.
This isn't the web development space; we can't just tell users to go get faster hardware, the way web developers tell IT to go get faster/bigger servers.
> PNaCl will of course have the same issues asm.js does with not having direct binaries that can just be loaded and run, not allowing use of CPU-specific code, etc.
You can target x86 and ARM specifically (which are likely to be the only architectures that matter for the next 5 years), and fall back to PNaCl for everyone else.
> If you don't care for portability, then the web/JS/asm.js/WebGL/etc etc. are likely not the best thing for you.
I don't care about sacrificing user experience for some irrational slavish devotion to JavaScript.
Of course I want portability, but not at the cost of providing the best user experience on the market.
asm.js optimizations are also an internal implementation detail. You don't need to do them, and the code still runs fast. Or you can do them in a variety of ways, not just the one being tested in Firefox. In fact I argued for optimizing in a very different way originally.
The point of asm.js is that it ensures you don't do things like use multiple types in a single variable, avoid undefined values cropping up, etc. That helps JS JITs in general, both existing ones as well as new optimizations made more feasible by the approach.
So asm.js does not expose any implementation details, no more than say Crankshaft and TraceMonkey do in the documents written about "how to write fast JS for modern JS engines" (which often say explicit things about "don't mix types" and so forth).
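For reference, here's roughly what the subset looks like -- ordinary JS, with the types pinned down by coercions. A minimal sketch, not taken from the spec:

    function MyModule(stdlib, foreign, heap) {
        "use asm";
        function add(a, b) {
            a = a | 0;           // parameter declared int32
            b = b | 0;
            return (a + b) | 0;  // return type int32
        }
        function scale(x) {
            x = +x;              // parameter declared double
            return +(x * 0.5);   // return type double
        }
        return { add: add, scale: scale };
    }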
> Users don't care about whether a particular application uses non-portable optimization strategies
Of course users do. A portable application would be runnable from all the users' devices, that's a huge plus. Just like users want to play their music from their iPod, laptop, TV, etc., they want to run their apps on all their devices as well. Portability makes that possible.
> You can target x86 and ARM specifically (which are likely to be the only architectures that matter for the next 5 years), and fall back to PNaCl for everyone else.
That's a big compromise. If we had done that before the rise of ARM, for example, ARM might never have achieved its current success.
But anyhow, of course there are different compromises to be made. The web and JS focus on true portability, with its downsides. If you personally are willing to compromise more to get better performance, then sure, another option might be better for you.
> asm.js optimizations are also an internal implementation detail.
Come on, really? If you require a full spec to define a very specific format, type annotations, and special designators to actually take advantage of it in any meaningful way, it's not an "implementation detail", because as the user, I have to care about it.
> So asm.js does not expose any implementation details, no more than say Crankshaft and TraceMonkey do in the documents written about "how to write fast JS for modern JS engines" (which often say explicit things about "don't mix types" and so forth).
That's exposing implementation details, too, and demonstrates a failing of JS.
> Of course users do. A portable application would be runnable from all the users' devices, that's a huge plus. Just like users want to play their music from their iPod, laptop, TV, etc., they want to run their apps on all their devices as well. Portability makes that possible.
Users want apps to run, and run well. They don't care how. Figuring out how is our job. Making users' lives suck more because we have lofty ideas is not doing our job.
Apple gets it. Google gets it. Even Ubuntu gets it.
Mozilla doesn't get it.
> That's a big compromise. If we had done that before the rise of ARM, for example, ARM might never have achieved its current success.
Not really. Apple and NeXT navigated these waters successfully for multiple decades via Mach-O and CFM fat binaries, and toolchains built around easily and efficiently supporting multiple architectures.
> But anyhow, of course there are different compromises to be made. The web and JS focus on true portability, with its downsides. If you personally are willing to compromise more to get better performance, then sure, another option might be better for you.
The web is competing with native applications. Now you're trying to compete with native operating systems, yet you're not willing to take the steps necessary to actually compete.
Ultimately, you're creating a two-tier system where platform vendors like yourself get decent performance and runtime environments in which you can produce things like Firefox, and 3rd party developers get crappy performance and runtime environments where we can produce webapps.
I'd love to see you write and deploy Firefox in asm.js, and then try to compete with Chrome.
> Come on, really? If you require a full spec to define a very specific format, type annotations, and special designators to actually take advantage of it in any meaningful way, it's not an "implementation detail", because as the user, I have to care about it.
First of all, it doesn't require a spec. We could have just done some heuristic optimizations like all JS engines have been doing since 2008, finding more cases where we can optimize and so forth - in fact, this was my initial idea for how to do this, as I mentioned earlier.
But we did decide to write a spec because (1) we want to be 100% open about this, and a spec makes it easier for others to learn about it, and (2) it helps us check we didn't miss anything because we have a formal type system.
>> So asm.js does not expose any implementation details, no more than say Crankshaft and TraceMonkey do in the documents written about "how to write fast JS for modern JS engines" (which often say explicit things about "don't mix types" and so forth).
> That's exposing implementation details, too, and demonstrates a failing of JS.
If so, then that exposes a failing of all JITs, including the JVM. All optimizing implementations expose details. People have optimized for the JVM for years.
If you can't stand anything between you and the underlying CPU, then nothing portable (like JavaScript, C#, Java, etc.) will satisfy you. Actually even a CPU might not, because CPUs also optimize in unpredictable ways, these same issues are dealt with on that level too.
Again, there is room for native apps. But there is also room for portable, standards-based apps. The web is the latter.
> Apple and NeXT navigated these waters successfully for multiple decades via Mach-O and CFM fat binaries, and toolchains built around easily and efficiently supporting multiple architectures
I do see your point, but that isn't quite the same. Apple fat binaries were of a platform Apple controlled. We are talking about the web, which no one controls. But again, yes, to some degree it is possible as you say to overcome such issues.
> The web is competing with native applications. Now you're trying to compete with native operating systems, yet you're not willing to take the steps necessary to actually compete.
I disagree. If we have 2x slower than native now, and 1.5x slower than native later on, we're competitive with native on that front. And we have some advantages over native, like portability, which can have long-term performance advantages (for example, we can easily switch to a different underlying CPU if a faster arch shows up). There are also short-term performance advantages to things like Firefox OS that only run web apps, like their graphics stack being much simpler than Android's or Linux's (you don't need another layer underneath the browser compositor, and can go right into GL).
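(Getting at GL from a web app really is direct -- a sketch:)

    // Sketch: a web app reaches GL through one call on a canvas.
    var canvas = document.createElement('canvas');
    var gl = canvas.getContext('webgl') ||
             canvas.getContext('experimental-webgl');
    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.clear(gl.COLOR_BUFFER_BIT);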
> But we did decide to write a spec because (1) we want to be 100% open about this, and a spec makes it easier for others to learn about it, and (2) it helps us check we didn't miss anything because we have a formal type system.
If I'm going to target a toolchain at your runtime, and expect decent performance out of it, and expect to see consistent performance with other runtimes that also implement such behavior, then I (and you) need a spec.
Moreover, if I'm going to implement tooling that can make sense of your "byte code", then I absolutely need a spec that's more specific than "it's JavaScript that might be optimized".
This is making me more wary about being able to build reliable tooling around such a "bytecode", not less.
> If so, then that exposes a failing of all JITs, including the JVM.
Yes. But some languages and targets are much worse off than others.
> Actually even a CPU might not, because CPUs also optimize in unpredictable ways, these same issues are dealt with on that level too.
This is why we sometimes get down to careful statement/instruction ordering to avoid pipeline stalls, or using architecture-specific intrinsics, or worrying about false sharing.
> Again, there is room for native apps. But there is also room for portable, standards-based apps. The web is the latter.
The web could be both, if Mozilla and other web die-hards would critically evaluate the accident of history that is the modern web browser. What bothers me most of all is just how much Mozilla can hold back the industry. For an example of where the industry could go, look at what happened with virtual machine implementations.
First, they were particularly inefficient, and relied on tricks such as trap-and-emulate. Not all that different from how NaCl is working, especially on ARM, with funny tricks like load+store pseudo-instructions. Gradually, hardware vendors took notice, and we saw instruction sets and hardware shift to add enhanced VM-specific functionality (and ultimately performance) -- first with VT-x, and now with VT-d (e.g., IOMMUs).
NaCl is in the perfect position to start down the road that leads to re-imagining what no-compromises sandboxed code looks like on the processor level. asm.js is going in the complete opposite direction, and in doing so, has the potential to steer the entire industry away from a path that could introduce significant and beneficial changes in the realm of security, portability, and open platforms.
> If I'm going to target a toolchain at your runtime, and expect decent performance out of it, and expect to see consistent performance with other runtimes that also implement such behavior, then I (and you) need a spec. Moreover, if I'm going to implement tooling that can make sense of your "byte code", then I absolutely need a spec that's more specific than "it's JavaScript that might be optimized". This is making me more wary of building tooling around such a "bytecode", not less.
Then if I understand you correctly, you are in favor of a spec for something like asm.js? But perhaps the problem you see is that you worry the same asm.js code will be slow or fast depending on the browser? Not sure I follow you, please correct me if not.
If I had that right, then yes, that's a valid concern, there will be performance differences, just like there are between JS engines on specific benchmarks already. Note that asm.js is much simpler to optimize than arbitrary JS, so that could decrease in time. But there are no guarantees with multiple vendor implementations.
And that's the real issue. NaCl, Flash, etc have one implementation, so you get predictable performance (and the same vulnerabilities...). But you don't get that with JavaScript, Java, C#, etc.
If NaCl were to become an industry standard somehow, then it would need to have multiple implementations, and have the same unpredictability in terms of performance. Except that it is fairly straightforward to optimize NaCl, so in theory the differences could become small over time - but the exact same is true of asm.js as I said earlier.
> NaCl is in the perfect position to start down the road that leads to re-imagining what no-compromises sandboxed code looks like on the processor level.
Again, PNaCl is a better comparison - even Google is shifting from NaCl to PNaCl (according to their announcements).
I see no reason that asm.js cannot be as fast as PNaCl, both are portable and can use the same CPU-specific sandboxing mechanisms. In fact it would be interesting to benchmark the two right now.
> Then if I understand you correctly, you are in favor of a spec for something like asm.js?
Yes. Imagine I'm writing an "asm.js" backend for a debugger, coupled with toolchain support for DWARF. To tell you the truth, I'm not even sure where I'd start, since it's not like the spec exposes a VM or machine state -- but if it did, I'd need the spec for that.
> But perhaps the problem you see is that you worry the same asm.js code will be slow or fast depending on the browser? Not sure I follow you, please correct me if not.
That's part of it. Without a spec, I can't really rely on remotely equivalent performance, but that's hardly the only toolchain issue.
asm.js is just JavaScript, it isn't a new VM with exposed low-level binary details. You wouldn't need to use DWARF or write your own low-level debugger integration. You can debug code on it of course, the right approach would be to use the JS debuggers that all web browsers now have, with SourceMaps support.
The goal with asm.js is to get the same level of performance as native code, or very close to it. But it isn't a new VM like PNaCl, it runs in an existing JS VM like all JavaScript does. That means it can use existing JS debugging, profiling, etc.
If x86 and ARM become the favored platforms for full speed on the Web, then you've essentially locked the Web into those architectures for all time. Web developers will realistically not optimize the PNaCl solution.
I disagree that this is worth the cost. This is not about "a slavish devotion to JavaScript"; your solution is fundamentally opposed to portability.
> Web developers will realistically not optimize the PNaCl solution.
It'll be "fast enough", which is what you're claiming for asm.js's backwards compatibility mode, and PNaCL is a whole heck of a lot faster than that.
> I disagree that this is worth the cost. This is not about "a slavish devotion to JavaScript"; your solution is fundamentally opposed to portability.
Users want applications that don't waste their battery, and that perform well. What do they care about architecture portability beyond the devices they actually have?
Fortunately, PNaCl solves that problem, too, as a fallback, while still being able to target x86/ARM without compromise.
"It'll be "fast enough", which is what you're claiming for asm.js's backwards compatibility mode, and PNaCL is a whole heck of a lot faster than that."
I haven't seen PNaCl benchmarks. Have you?
It is true that the compilers for asm.js are more immature than LLVM at this time, but there's nothing stopping asm.js from reaching that level.
Again, it's just syntax. You're complaining about the fact that the code is delivered in a backwards compatible surface syntax and extrapolating that to unfounded assumptions that it must be slow. It's really an absurd claim.
Spiritually it may as well not be. They usually stand on their own and stay out of the news when it comes to the politics of Western Europe as a whole, since they were kept out of the UN post-WWII. The biggest tie to the rest of Western Europe recently is that of organized crime; the Garduña of old evolved into the Camorra of today, well known across Europe, especially in Italy, obviously.
I wonder what kind of future awaits the late entrants, FirefoxOS and Ubuntu. I wish success for both. The challenge is big. They lack the momentum and deep pockets of the established rivals: Apple, Google and Microsoft. But there are some interesting factors that would be in their favor.
First are the apps. As for Ubuntu, developers love Linux. That would help to get apps ported. Plus, Ubuntu has the recently introduced webapps integration, which provides Unity integration for selected webapps. I expect that they will improve this feature and include more apps. Then there is the choice of Qt as the UI toolkit. This is a good choice, because Qt is proven and has a big community of developers.
Meanwhile FirefoxOS relies on the traditional web platform, a popular and accessible environment for developers that is currently improving relatively fast. The web, enhanced with the mobile APIs that Mozilla is working on, is probably enough for almost all productivity apps. And with WebGL, it is becoming good for decent gaming. Web development has the advantage of being portable across platforms. That lowers costs. And it is relatively easier and cheaper to find developers for it. These are facts that should not be ignored.
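For instance, device access that used to require native code is becoming plain JS. A sketch using two APIs that have shipped (availability still varies by browser):

    // Sketch: device WebAPIs callable from any web app.
    navigator.vibrate(200);  // buzz the device for 200ms
    navigator.geolocation.getCurrentPosition(function (pos) {
        console.log(pos.coords.latitude, pos.coords.longitude);
    });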
So both Ubuntu and FirefoxOS are offering an attractive development proposition. From a pure development standpoint, I would much rather have the Ubuntu and FirefoxOS alternatives than Android's Java or Apple's Objective-C.
There is also a strategic factor: Android device manufacturers fear Google. It is the most powerful web company in the world, controls Android, and also owns Motorola. So it competes with device manufacturers. That is too much power. And who knows if it decides to become like Apple at some point. So device manufacturers should be open to examining alternatives as a countermeasure against Google's excessive power.
Both Ubuntu and Mozilla are solid software companies that seem unlikely to ever become device manufacturers.
As for Windows, I can imagine device manufacturers waking up from nightmares, screaming at midnight, dreaming about a mobile world dominated by Microsoft, where they would have to pay a big fee for each Windows license out of their thin margins. The interest of big third parties is also relevant. Big players like Facebook and Amazon would feel more comfortable if their apps were running on OSs not owned by their rivals Google, Apple and Microsoft, who could otherwise implement strategies to undermine them.
Linux as a gaming platform is on the rise: Steam, the Ouya, Nvidia's Project Shield, etc. There are more incentives to consider games for Linux, making porting to Ubuntu mobile easier and more enticing. And WebGL appears poised for a brilliant future. So both FirefoxOS and Ubuntu could be good gaming platforms.
I hope that with the help of these factors they can at least establish a sustainable presence on the market and build from there. I would be happier with a world where open OSs are present on the mobile market.
I could see Ubuntu become the main alternative to Android for manufacturers, because Canonical will make it easy to piggyback on Android drivers, so it should be very easy for manufacturers to port Ubuntu to their devices. It also gives them at least as good customization power as Android, and the hacking/ROM community are going to love it, which I think is a good way to get people excited about a device/OS online these days.
WP8 doesn't give them any of that; it's been stagnating at 2% for more than two and a half years, and besides perhaps helping Nokia a bit (while being a much smaller company than they used to be, so easier to be "satisfied" with it), there's nobody really benefiting from supporting it in terms of sales. So Ubuntu would be a great WP8 replacement there, and a strong, customizable alternative to Android.
As for Firefox OS, I guess it depends on how well this ChromeOS/FF OS thing will work. And if it does, it probably only works on the low end, at least for the next 5-10 years, until data is so cheap that using web apps won't be a problem anymore (although you can still use "native" web apps in FF OS, just like in Ubuntu Touch, so I guess there's that, too).
But since FF OS is going to be used mostly on low-end devices, that means it could be on a lot of devices as well, so it could do well in terms of market share (or at least well enough to give a company like Mozilla good revenues). For the same reason, I don't really see FF OS and Ubuntu competing with each other, since Ubuntu will be more on devices with at least a quad core A9/A53 or dual core A15/A57.
The only way they are competing with each other is for manufacturers' attention. Some of them may not be ready to take on more than two operating systems at once, and if they have already committed to FF OS, then it might take stronger persuasion to get them to start supporting Ubuntu as well, to have an alternative for high-end Android devices.
Android will continue to compete with both, at the low end and at the high end, although I think Google needs to make Android 5.0 a little leaner, to put it back into the Gingerbread range of resource requirements, if they want Android to go hard into the sub-$50 phone market.
I guess it's good that Firefox is doing this, but kind of like the Ubuntu mobile project, I just don't think it's going to make much of a dent.
Consumers don't care about technology ideology. Native vs web, NOBODY CARES!
Consumers want their apps. They want to play games. They want Instagram. They want beautiful hardware and lots of accessories.
The only place in major markets where Firefox OS stands a chance is low-cost prepaid, like Virgin Mobile USA, Straight Talk, etc., but there are enough good Android options, and even the iPhone now, that it might be too late.
Maybe it could take off in some other countries, but I don't see Firefox OS taking off like the original Firefox did.
> Consumers don't care about technology ideology. Native vs web, NOBODY CARES!
You are absolutely correct. And when they sell these for $50 a pop in Africa, South East Asia and Brazil people will be happy they didn't have to buy the $600 iPhone or Android.
These devices aren't even scheduled to arrive in the US till mid-to-late 2014. And by the time they do, they may have a strong enough foothold with developers and existing websites to have leverage with the US carriers.
As I understand it, Gecko has significantly lighter resource demands compared to Dalvik. Andy Rubin himself, speaking on FirefoxOS, has said that there are simply places where Android cannot go.
The trick will be if they can position FirefoxOS as a capable product parallel to iOS and Android without letting the cheaper price tag tarnish its image.