Actually, it turns out that having an interpreted runtime causes an endless stream of problems. We keep trying to patch them over, but new problems keep coming up.
It turns out that a standardized low-level bytecode is the right way to ship cross-platform applications. We already knew that. It just took a long time to go through the standardization process and be implemented by browsers.
If it seems like it's not in use today, it's largely because JavaScript has momentum and the majority of web programmers don't yet know how to take advantage of wasm (or maybe don't think they need it).
> It turns out that a standardized low-level bytecode is the right way to ship cross-platform applications
So why hasn’t it taken over the world, first with Java and now with wasm?
> If it seems like it's not in use today, it's largely because JavaScript has momentum and the majority of web programmers don't yet know how to take advantage of wasm (or maybe don't think they need it).
There’s another possible reason. I’m sure you can come up with the answer yourself.
JavaScript never really had a chance to "take over the world" before Google released Chrome and the V8 engine. Before that it was just too slow.
Java applets really sucked. They took a long time to load and initialize. They all looked ugly (probably because of the default libraries?).
Also, Java itself was a really bad language and the development experience was awful. I want to say that no one wants to program in Java, but in reality many people do (I don't understand those people).
The important lesson here is the implementation is more important than the idea.
Good idea with bad implementation -> goes nowhere.
wasm is not Java.
The important quality of wasm is that it's not garbage collected. It's pretty close to just good old assembly.
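To make that concrete, here's a minimal sketch of what calling into wasm from the JS side looks like (TypeScript; the add.wasm module and its exported add function are hypothetical, e.g. something compiled from C or Rust):

    // Minimal sketch: instantiate a wasm module and call an exported function.
    // "add.wasm" and its `add` export are hypothetical (e.g. compiled from C or Rust).
    const { instance } = await WebAssembly.instantiateStreaming(fetch("add.wasm"));
    // Exports are plain functions operating on numbers and linear memory.
    const add = instance.exports.add as (a: number, b: number) => number;
    console.log(add(2, 3)); // 5

There's no object model or GC to negotiate with the host; the boundary is just numbers and shared linear memory, which is a big part of why it feels closer to assembly than to a Java applet.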
>JavaScript never really had a chance to "take over the world" before Google released Chrome and the V8 engine.
Chrome was released at the end of 2008; JavaScript was thriving well before that. We'd had Gmail since 2004 and jQuery since 2006. Web apps were the “sweet solution” initially proposed for iPhone apps in 2007. Chrome exists because of the healthy ecosystem that Firefox and Safari provided, not the other way around.
>It's pretty close to just good old assembly.
Precisely, and it has yet to be proven that that's the best solution for the Web. I love it from a computer science perspective, but historically that idea hasn't struck a chord.