Hacker News

How much does this limit the code you can write? Presumably WASM is made safe by restricting what the code can do in some way. Otherwise the checks that occur as part of the C -> WASM -> C pipeline could simply be built into your standard C compiler.

So what do we lose as part of this round trip?

Just speed? Nothing else?




You could build Software Fault Isolation (SFI) into an existing compiler toolchain, and people have done so, but WebAssembly was designed under a harder constraint: a WASM file may be generated by an unknown or untrustworthy party and must be consumed in a trustworthy context, e.g. served from a random HTTP server into a user's browser. The WASM binary format therefore imposes stringent structural requirements and a strict validation algorithm that must run before you execute it.
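To make "strict validation before execution" concrete, here's a toy sketch (not a real validator) of the very first check a conforming engine performs: rejecting any module whose preamble doesn't match the magic bytes and version the binary format requires.

```python
# Toy illustration: a WASM engine rejects a module with a malformed
# preamble before decoding anything else. Real validation goes much
# further (section structure, type-checking every instruction, etc.).
WASM_MAGIC = b"\x00asm"             # bytes 0-3 of every .wasm file
WASM_VERSION = b"\x01\x00\x00\x00"  # binary format version 1, little-endian

def check_preamble(module: bytes) -> bool:
    """Return True only if the module starts with a valid preamble."""
    return module[:4] == WASM_MAGIC and module[4:8] == WASM_VERSION

assert check_preamble(b"\x00asm\x01\x00\x00\x00")  # minimal empty module
assert not check_preamble(b"MZ\x90\x00")           # e.g. a Windows executable
```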

So you're doing a classic thing in verification: write a "checker" that implements some known-good algorithm in the smallest, most obviously correct way possible, then use it to decide whether much bigger "unknown" artifacts are safe. The point is that a small checker is far easier to implement correctly than auditing every artifact by hand.
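A minimal sketch of that checker pattern, using a hypothetical stack language (made-up opcodes, not real WASM): the checker statically proves that no program it accepts can ever underflow the operand stack, which is one of the properties real WASM validation establishes.

```python
# Small, auditable checker for a toy stack machine. Programs it accepts
# can be arbitrarily large and untrusted; the safety argument lives
# entirely in this function.
def validate(program: list[str]) -> bool:
    depth = 0
    for op in program:
        if op == "push":
            depth += 1
        elif op in ("add", "mul"):   # pop two operands, push one result
            if depth < 2:
                return False         # would underflow the stack: reject
            depth -= 1
        else:
            return False             # unknown opcode: reject
    return True

assert validate(["push", "push", "add"])
assert not validate(["push", "add"])   # underflow caught before running
assert not validate(["jmp"])           # unrecognized instruction rejected
```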

For example, you might have a media player with extensions (like foobar2000); traditionally extensions were delivered as .dlls, because developers did not release the source. This is a use case similar to the browser, where WebAssembly would be a good choice instead of random .dll files. Developers may not want to release their code, but you don't want to trust a random blob of binary code either. If you trust your WASM implementation, you don't need to trust that the blob is harmless: it will simply be rejected, or prevented from doing anything bad.

If you're not dealing with a random binary blob from a potentially untrusted source (i.e. you run the compiler yourself on some code you downloaded, and then run the result), you don't really need WASM, because you can reasonably trust the compiler to uphold the security guarantees using SFI techniques. For example, if you wanted to ensure zlib couldn't corrupt your main process via a buffer overflow, to reduce the blast radius, a pure SFI toolchain would be fine: trust that it works, and compile zlib yourself.
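One classic SFI technique the compiler can apply is address masking: every load and store is rewritten so the address is masked into the sandbox region, so no computed pointer can escape it. A hedged Python simulation of the idea (real SFI inserts the mask as machine instructions at compile time; the sandbox size here is illustrative):

```python
# Simulate SFI-style address masking: whatever address the untrusted
# code computes, the mask forces the access into the sandbox region.
SANDBOX_SIZE = 1 << 16        # 64 KiB sandbox, must be a power of two
MASK = SANDBOX_SIZE - 1
memory = bytearray(SANDBOX_SIZE)

def sandboxed_store(addr: int, value: int) -> None:
    # The masked access can never touch memory outside the sandbox.
    memory[addr & MASK] = value & 0xFF

sandboxed_store(5, 42)        # ordinary in-bounds write
sandboxed_store(10**9, 7)     # wild address is forced back in-bounds
assert memory[5] == 42
assert memory[10**9 & MASK] == 7
```

The design trade-off versus WASM's approach: masking silently wraps bad addresses instead of rejecting the program up front, so it contains faults rather than proving their absence.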

But there's generally far more momentum and tooling around WASM than around standalone SFI, so people reach for it in all of these cases, even when they control both the compiler generating the code and the environment running it.



