Rust (and other fast, compiled languages) is the future of Javascript infrastructure [0]. Whereas before one might have thought that a language's tooling should be written in said language, there might come a time when that's too slow, and we instead need to move to faster languages.
That's what has happened to Python (numpy, pandas, scipy etc are all written in C and simply provide an interface in Python), and now it's happening to Javascript as well, with Deno, swc in Rust, Bun in Zig, esbuild in Go, and so on.
I foresee a future where we won't have to deal with slow tooling and can have instantaneous updates once again.
Eventually you realise this split sucks, though. It creates a world where programmers of the higher-level language can't reasonably debug their most important libraries. This is different from not being able to debug the runtime, which has a smaller API surface than important libraries and frameworks generally have (though I guess it depends on whether you count the stdlib etc.).
It's generally much better to simply go to a language that still operates at the same or similar level of abstraction but with higher performance. I.e. going from TypeScript to Kotlin isn't a substantial move from an abstraction-level perspective, but you no longer consider needing to write the code in C because it's not fast enough. (You might for other reasons, like runtime portability.)
I would argue this means the users of the higher level language need to learn the language their tooling is written in, or consider switching to the underlying compiled language altogether where possible.
My entire career has been with scripting languages, but I think the pendulum is swinging again and they are back on notice.
Anything that isn't strictly a scripting need is orders of magnitude more ergonomic to do in a compiled language than it was a decade ago. Why fight it?
I'm still watching the developments of WASM closely, because while JS does the trick and always will, if you can build an app in a language that produces an equal or better result in WASM, why wouldn't you consider it from an engineering point of view? Then there would no longer be a language split.
There's a lot of "ifs" here for sure, but the possibility exists that the languages we currently use are not the best for the task. I think when we see projects like this, it's a good opportunity to reflect.
Having fast libraries is great. No one ever wishes that numpy was written in python.
And switching from TypeScript to Kotlin? That's changing platforms. If you're in the browser or in Node and you replace it with the JVM? That's a wild suggestion just to avoid some fast libraries.
Totally agree with the 'It's the platform, stupid'.
However, I do wonder if for JavaScript/TypeScript that's only for 'in browser' stuff.
If you look at the server - one of the best platforms is the JVM - lots of quality libraries - easy to write stable long running processes etc.
I.e. if you are driven by platform, isn't it JavaScript in the browser and Java on the JVM on the back end?
(TypeScript and Kotlin are both moving a bit away from the platform.)
The beauty of the network (message passing) means the stuff on the server doesn't need to be in the same language as that on the client.
Now I'll admit there is some benefit to having front end and back end the same, in terms of validation code, or perhaps occasionally moving logic from server to client or back - but does this override the platform effect? Also note you can run JS on the JVM/GraalVM - so client/server validation libraries could be written in JS and reused on both sides.
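As a rough illustration of that shared-validation point (the file name and function here are invented, not from any real project), the same TypeScript module could be bundled for the browser and imported by the server:

    // validate.ts - hypothetical shared module, imported by both the client
    // bundle and the server, so the validation rules are written exactly once.
    export function isValidEmail(value: string): boolean {
      return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
    }

    // Client: call it before submitting the form. Server: call it again before
    // trusting the request body - same function, same rules on both sides.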
JVM is not good for tooling, at least not in comparison to Go or Rust, which are both faster to write, easier to distribute and faster to execute for short lived CLI invocations.
How does that work if end users want to use JS though? They're not all going to move to Kotlin. Since one must meet people where they are, rather than where we want them to be, the best we can do is instead have the end user still program in whatever language they want but then translate that into a higher performance language. That's basically what Python is doing now. Users don't want to write C/C++, they want to use Python, but Python is too slow, so here we are.
What you're describing is known as the "two language problem". The long-term solution is to move to fast, high-level languages. I think Julia and LuaJIT are the only examples currently, but surely more will crop up. This is a fundamental problem that will not go away on its own.
Efforts like Numba can alleviate some of the pain of a slow Python, but it turns out it's generally not possible to retroactively fit a JIT compiler onto a language that was not designed to be compiled.
In my experience, once I tried moving my work code to Julia, it became clear to me just how much energy I had spent trying to overcome the performance wall of Python, and I can't imagine going back now.
Second time this week I've had occasion to mention Lush [1], which was in many ways an early attempt at what Julia is doing now. From their homepage:
8<--------------------------------------------
Lush brings the best of both worlds by wrapping three languages into one: (1) a weakly-typed, garbage-collected, dynamically scoped, interpreted language with a simple Lisp-like syntax, (2) a strongly-typed, lexically-scoped compiled language that uses the same Lisp-like syntax, and (3) the C language, which can be freely mixed with Lush code within a single program, even within a single function. It sounds complicated, but it is not. In fact, Lush is designed to be very simple to learn and easy to use.
IIUC, the performance of PyPy is still quite far behind normal compiled languages, due to limitations inherent in Python. And then it has limitations in its interop with Numpy and C libraries, at least last I checked.
That's the general picture when trying to overcome Python's performance wall: Sometimes you can use Cython, or Numba, or PyPy or Numpy vectorization, or call into C. But only sometimes, and they each come with their own set of awful restrictions and caveats.
It's such a breath of fresh air to switch to a fast language and just forget all those hacks and workarounds. I get why people simply migrate to static languages.
> I foresee a future where we won't have to deal with slow tooling and can have instantaneous updates once again.
You quote Python examples around data processing (an appropriate niche use-case for perf-oriented implementations), but then your final line is actually about build tooling.
The problem here is that things like esbuild are written to support dysfunction. JavaScript-written build tooling builds "reasonable" projects very fast. The dependency bloat in the NPM ecosystem is well documented, and the level of abstraction and complexity of apps/frameworks/projects is also commonly considered excessive. This leads to slow builds, slow apps, and low maintainability/debuggability. Solving the first of those problems in isolation "supports dysfunction" in that it makes the latter problems less likely to be solved. It's especially bad when that solution comes with further maintainability/debuggability compromises.
Slow tooling, slow runtime, it's all the same to me; I want fast programs. I'm not sure what you mean by esbuild being written to support dysfunction - could you elaborate on why that's the case? If you mean that well-developed JS-made tooling can be on par with esbuild and similar, I'll have to disagree: benchmarks show that compiled languages like Go and Rust are orders of magnitude faster, and whatever architectural improvements can be made in JS-made tooling can also be made in the compiled-language tooling as well.
Writing software in a fast language -vs- whatever quick-to-write-/maintainable-/domain-specific-/sandboxed-/etc. language is always some kind of compromise of perf. vs whatever your other priorities are. In particular, writing tooling for a language in that language will have massive advantages in terms of contributors, but may have perf. drawbacks if that language is not as performant as alternatives. Weighing up those pros & cons is important - the performant option is not always the best, it depends on your priorities.
Performance generally increases in priority depending on how much of a bottleneck it is. If webpack takes 0.2 seconds and esbuild would take 0.02 seconds, that's a 10x perf. gain but is probably not a compelling reason to switch. If webpack is taking 20 seconds however, that's a bottleneck that's worth looking at.
My point above is that an app that takes 20 seconds for webpack to build is overengineered in the most common cases - taking that badly written app and running it through esbuild to "fix" your problem isn't really fixing your problem, it's just hiding it under the bed. This is what's called "supporting dysfunction".
Numpy, pandas, scipy are poor comparators because they're processing data, not code: they're tasks that depend on data scale, rather than on how many over-abstracted layers of code someone is trying to compile all at once. The former is not (necessarily) a sign of dysfunction. It's much more likely to be a valid use-case.
Tools for interpreted, dynamically typed languages can be fast, keeping the dev flow (edit-run-check) in those languages fast.
But those tools are then written in statically typed languages that do not have such fast dev flows (as they have an extra step: edit-COMPILE-run-check).
I wonder how we can have the best of both worlds. I see a glimpse of this using Kotlin: I can use the JVM to run in debug mode and have very fast compile times (due to incremental builds and hot code reloads), then I can compile the code to "native" (binary) for production scenarios.
This way we can have strong type guarantees AND fast dev flows.
Dart does basically the same thing, as used in Flutter debug vs release builds. However, this requires people to switch to such a language. If they want to continue to use JS, then we can only provide tooling around it rather than making them change their preferred language entirely.
This is a false dichotomy. Plenty of Javascript projects build slower than a comparable Go project, because of webpack etc bloat and not caching results as well.
Erlang is a bytecode VM, so "compiled" for sure, and is pretty much the pioneer of hot upgrades.
Plenty of Lisps and Schemes are compiled (though often to bytecode) and can typically dynamically replace parts of the program.
I think ability to hot upgrade seamlessly is more a statement of "how much of the data structure shape is carried at runtime" / "how likely it is old and new API data structures happen to interoperate", and how likely it is that one can convert old runtime state to new runtime state. And that one can be seen as a trade-off; close to the metal control over memory layout is a performance gain, but in practice trades off this kind of flexibility (in theory you could make it work, but it's probably a lot of work).
https://www.theseus-os.com/ is an experimental kernel that can restart/reload/upgrade Rust components at ELF library boundaries. (State internal to a component has to be discarded unless you program a converter. Then again most JS web development hot reload discards internal state of a component.)
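To make "program a converter" concrete, here is a minimal, hypothetical sketch (the types and names are invented for illustration) of migrating old in-memory state to a new shape across a hot upgrade:

    // Old and new shapes of some component's internal state across an upgrade.
    interface StateV1 { items: string[] }
    interface StateV2 { items: { name: string; done: boolean }[] }

    // The converter is the only reason old state survives the reload;
    // anything it doesn't know how to map is effectively discarded.
    function migrate(old: StateV1): StateV2 {
      return { items: old.items.map((name) => ({ name, done: false })) };
    }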
In your linked article they claim that parts of key JS infra are being rewritten in Rust, but the claim seems exaggerated from where I sit. For instance, they list webpack, but is that a good example of huge momentum to rewrite webpack in Rust?
Your Python examples are of libraries that are used at runtime, not infrastructure.
Coming back to JS, this svelte project is noted by the author as being very early stage and it seems more like a learning project for them. I don’t need to add that svelte isn’t nearly as broadly used as other JS tech.
Don't get me wrong, I'm not saying writing tools in faster compiled languages shouldn't be done, but one might find that the excitement and predictions seem premature. There are trade-offs - speed isn't free.
Yes, see swc (in Rust) and esbuild (in Go) for examples. They are taking off in the JS world such that many other frameworks are eschewing Webpack and wrapping them due to their sheer speed, like NextJS with swc and Vite with esbuild.
> libraries that are used at runtime, not infrastructure
Sure, I was just giving an example. I would say though that due to JS being interpreted (okay, JITed), the infrastructure would have to run Webpack anyway, in order to bundle/compile the JS.
> this svelte project is noted by the author as being very early stage and it seems more like a learning project for them
I'm not talking about this project in particular, just the JS ecosystem as a whole trending towards compiled-language tooling. I will say, though, that this trend is not early stage: the examples above like swc and esbuild have been some years in the making. If anything, this project will simply accelerate that progress.
Just a warning that SvelteKit is going through a major refactor, and any work you do will have to be ported, with a lot of changes required to file structure and some functionality.
I ported an Angular Universal app over to Svelte after one of the Angular major upgrades failed due to an Angular Universal bug that had been around for years. Working with Svelte has been great. I also like custom architecture, since large apps will need custom architecture anyway. After porting the app over, it was easier to refactor the code without a heavy framework like Angular getting in the way.
I also maintain a large app in SolidJS. I like SolidJS even more than Svelte, especially for large projects but also for small projects.
Svelte has some disadvantages with tooling and some issues with TypeScript integration. SolidJS allows more function decomposition and general flexibility in creating smaller, more focused components, since the jsx/tsx components are plain old JavaScript functions. In addition, SolidJS's JavaScript output is smaller than Svelte's once the app hits a fairly low level of complexity. The engineering of SolidJS is more accessible than Svelte's, so it's easier to understand what is going on.
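For a sense of what "plain old JavaScript functions" buys you, here is a minimal SolidJS-style sketch (illustrative only, not taken from any real project):

    import { createSignal } from "solid-js";

    // A SolidJS component is just a function that runs once; only the
    // expressions reading the signal re-run when it changes - no VDOM.
    function Counter() {
      const [count, setCount] = createSignal(0);
      return (
        <button onClick={() => setCount(count() + 1)}>
          Count: {count()}
        </button>
      );
    }

Because a component is an ordinary function, splitting a big component into smaller, more focused ones is just a matter of extracting functions.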
This came up on /r/rust two weeks ago, and Svelte creator Rich Harris responded:
> This project sounds awesome, and I'm sure we'll be able to learn a lot from it. Just a heads up that we will likely be making substantial changes to the compiler for Svelte 4, resulting in very different output JavaScript — thought I should mention that in case it affects your plans!
That's super nice-looking Rust. You caused me to revise my opinion that Rust had ugly syntax. I got that impression when I wrote a file system tool in Rust a few years ago. I see things have improved.
This project isn't even close to being functional and yet it already has 750+ stars on Github. I guess that's one way to test the market for a project idea.
(Not OP) That repo looks like a proof-of-concept which is linked to from an open issue in https://github.com/swc-project/swc which is apparently a "super-fast TypeScript / JavaScript compiler".
I really hope they don't follow the rustc 'OO' style of defining passes as traits. Pattern matching gives incredible tools for compiler pass writing, and the ideology of Rust compilers is ripe for being brought into the fold of OCaml-style compiler patterns.
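As a loose illustration of what pattern-matching-style passes look like (sketched here in TypeScript, with a tagged union and switch standing in for Rust/OCaml's match; the toy AST and pass are invented), a pass can be a single recursive function over node shapes rather than a trait implemented per pass:

    // Hypothetical toy AST and constant-folding pass, for illustration only.
    type Expr =
      | { kind: "num"; value: number }
      | { kind: "add"; left: Expr; right: Expr };

    function fold(e: Expr): Expr {
      switch (e.kind) {
        case "num":
          return e;
        case "add": {
          const left = fold(e.left);
          const right = fold(e.right);
          // Collapse `1 + 2` into `3`; otherwise rebuild the node as-is.
          return left.kind === "num" && right.kind === "num"
            ? { kind: "num", value: left.value + right.value }
            : { kind: "add", left, right };
        }
      }
    }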
Yes, most of the .rs files in https://github.com/pintariching/rustle/blob/main/src/compile... are empty. I was looking because I've been using the nom [0] parser combinator crate to build an experimental compiler, and I'm curious to see how other Rust compiler projects are doing parsing.
And submitting a post to HN with the format “{something popular} rewritten in Rust” is always going to go straight to the top, even if they haven't completed it yet, or maybe never will.
The interesting thing here is really the reaction from the HN community, not the incomplete implementation.
Quite right, HN is incredibly susceptible to “fashionable” tech. Now that Zig has a significant VC-backed flagship project, I suspect Zig will be the new Rust to HN.
I would also add that rewriting something that exists in a new language is both a brilliant way to learn that language but also, if the language is still evolving, feed back improvements to the language itself.
I dunno, Zig feels like a big step forward from C, but also a step back from Rust. I'm not sure the people who have already moved from C++ to Rust will want to go back to debugging segfaults.
While you're right about Zig being a step back from Rust, a lot of people are not willing to make the large step from C to Rust (from C++ the step is much smaller, but Zig is in the C league, not the C++ league).
I think it will be successful despite that, because we've seen this happen before in other ecosystems... e.g. Scala was a large step away from Java, while Kotlin is a much smaller one, and in a way you could also describe it as a step back from Scala. Yet, despite being younger, Kotlin has already surpassed Scala according to some rankings and has some major projects (maybe more than Scala) investing heavily in it, like Android and Gradle.
Perhaps, another example is Elm... much simpler than Haskell or PureScript, but it seems to be by far the most popular functional language on the frontend.
Finally, I've written some Zig and definitely didn't feel like I had to debug a lot of segfaults... the tools to avoid that problem are already pretty good (the debug memory allocator is almost as strict as the Rust borrow checker and will keep you in check!), and even having little experience manually managing memory, I was able to get stuff done quite easily (unlike with C, which I do find extremely dangerous in my own hands).
For me, Zig is Modula-2 with C syntax and some cool metaprogramming capabilities - much better than C will ever be, but I expect something better than what a 1978 systems programming language was offering in regard to safety (or NEWP from 1961, for that matter).
I guess in the end I should be happier that more Zig and less C gets written.
Rust is a lot simpler than C++ or Scala. It has some upfront complexity wrt. C or Zig, but that pays off when dealing with larger programs, where avoiding subtle bugs becomes much harder.
Now do C++! He didn't say it is simple; he said it is simpler than C++.
Now, I'd say they're on the same order of magnitude for simplicity. The huge advantage Rust has is that it tells you when you got it wrong, whereas C++ says "OK, I will put the toast in the fridge."
I like Zig, but its lack of RAII or GC puts it in the same camp as C. I would definitely prefer it over C, but not C++, let alone Rust. Who wants to deal with manual deallocations again? How about manual ref counting? Those feel like archaic problems to have to deal with again.
Some proportion of programmers are quite sure that "debugging segfaults" wasn't a problem. Most of them didn't move to Rust though of course.
The people behind Jai and Odin have both expressed the opinion that the problems Rust prevents are not important. I would tend to pigeonhole this as "Real programmer" macho bullshit, especially from Jonathan, but I could be wrong of course.
Svelte is a frontend framework that isn't React, Angular, or Vue. This work is towards an installer that isn't Node. If you do frontend work and don't like React, Angular, Vue, and Node, this is all the things you like.
(It's me, I dislike frontend frameworks and node and like svelte)
It's also a frontend framework that builds custom client-side JavaScript, which is where the "compiler" part comes in. This makes it way more efficient on the browser than the generic VDOM-diffing approach found in React, etc. Almost as quick as a static site, while preserving full interactivity.
Yeah this appears to be a repo with exactly one commit in it. Hopefully the author follows through and makes it but for now it's not really worth judging.
Very impressed with the momentum around Rust. Interesting that Netscape/Mozilla has given us two widely used languages, at opposite ends of the spectrum in pretty much every respect.
I, for one, would love to see this come to fruition one day. Tools like esbuild have proven that front-end tooling does not need to be written in JavaScript, and seeing a Svelte compiler in Rust of all languages means that there could be some real innovation coming to this space.
It is so tiresome to read this comment on every post dealing with Rust. How is the language that the compiler is rewritten in not relevant? Should the title honestly be "Svelte compiler rewritten in a different language" so we can avoid using the word Rust?
The reason for all the hype around Svelte is exactly because it reinvents the wheel. (Rust also does this.) It proves that it's possible to create a web application framework that outperforms a virtual DOM (VDOM), and it does this by giving you exactly such a framework.
Usually, you do minimal transpilation when you create, say, a React app - limited to converting JSX to React.createElement calls, plus bundling and tree shaking (if possible). That means all of the work needs to be done at runtime, and React needs to keep track of a VDOM, where any changes can be made at any time, and there needs to be generic runtime infrastructure in place to handle that (diffing and reconciliation).
Additionally, each render completely reconstructs its slice of the VDOM, just to be diffed, leading to many wasted objects and CPU cycles even if nothing's changed.
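Roughly, and just as a sketch (the component and props here are invented), that transpilation step amounts to this - the JSX becomes createElement calls, and every render builds a fresh element tree for React to diff at runtime:

    import React from "react";

    // JSX source: each render of this component constructs a new element tree.
    function Counter({ count, increment }: { count: number; increment: () => void }) {
      return <button onClick={increment}>Count: {count}</button>;
    }

    // Approximately what the classic JSX transform emits - plain createElement
    // calls, leaving all diffing and reconciliation to React at runtime.
    function CounterTranspiled({ count, increment }: { count: number; increment: () => void }) {
      return React.createElement("button", { onClick: increment }, "Count: ", count);
    }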
Svelte is a different approach which sees the inefficiency in that and tries to implement an alternative. Sure, it's "new and shiny" (even though it's been over 6 years since 1.0), but it has some real benefits!
Svelte works by rewriting your code to make changes directly to the DOM, rather than going through a VDOM, and it does this by keeping track of all the possible changes at compile-time (along with things like event listeners and reactive state and so on) and hard-coding them in.
So instead of getting a generic runtime framework that has to run everything through a VDOM, you get something a bit more similar to what you would have if you wrote the whole thing by hand using vanilla JS with no framework at all.
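A loose sketch of that style of output (illustrative only - this is not Svelte's actual generated code, and the names are made up):

    // The compiler knows at build time that only this one text node depends on
    // `count`, so it can emit a targeted DOM write instead of diffing a tree.
    let count = 0;

    const button = document.createElement("button");
    const text = document.createTextNode(`Count: ${count}`);
    button.appendChild(text);
    document.body.appendChild(button);

    function setCount(value: number) {
      count = value;
      text.data = `Count: ${count}`; // direct, surgical update
    }

    button.addEventListener("click", () => setCount(count + 1));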
Svelte does things in a fundamentally different way than existing frameworks. It is very lightweight, both in the code size sent to the user and in terms of runtime computation, entirely ditching the VDOM by doing more at compile time - requiring fewer resources and less powerful devices for a better (less janky) result. It is also quite intuitive for the developer. I'd like to see it spread, or at least these ideas adopted elsewhere. We need a lighter web, and Svelte could be part of the answer to this.
It's also not exactly new - it dates from 2016.
I hear you about the JS frontend framework and tooling churn, but Svelte actually brings something new and valuable.
Svelte isn’t new but it sure is shiny! Svelte empowers me to write faster apps faster. And it’s loads of fun. The value proposition goes from the bottom (better DX), to the middle (better end-user experience), and up to the top (ship faster). I’ve built a career out of Svelte as a freelancer for years and have been loving it.
[0] https://leerob.io/blog/rust