Sounds like a great way to burn money: the company was founded in 2019 and they still haven't released anything, probably because they've spent most of their time building other stuff. Their devs sure are having a lot of fun though.
That was my issue for a long time. I even talked with their founder several times on Twitter a few years back. Each time I was greeted with buzzwords that I knew the meaning of, though I think they assumed I didn't.
They would claim grand things, like having solved the issues with continuations and delimited continuations, distributed process migration, and a whole host of other very hard problems that have resisted solutions in the past. I would ask their founder: "Right, so you know that delimited continuations have problems with accidentally captured scope, and they run poorly on the JVM. How did you solve this? Have any papers I can read?" All I ever got was that Clojure, immutable data, X, and Y would fix these issues, and that you just had to wait and see what they were cooking up.
That's when I knew they had no clue what they were doing. I'm all for pushing the boundaries of tech, but if you're doing something that's been attempted many times before, you at least need a good elevator pitch as to why it's solvable now.
I think a great example of this done right is Rich talking about Clojure. People would ask, "Isn't immutable data expensive to allocate and reclaim?" His reply was always that the JVM's GC was good enough that the benefits of immutable data outweighed the marginal performance penalty of the extra garbage collected. What changed since the old Lisp days? Well, we now have GCs that are super fast and JITs that can optimize dynamic code well.
That's the sort of laser-focused vision I never saw from Red Planet Labs. You've got to get the problem statement and the solution out early, refine the elevator pitch, and be able to articulate, to people who know what they're talking about, how you're going to succeed where others have failed for decades.
I see your overall point which is a good one and I know this is a nitpick, but I thought Rich’s core solution to the cost of immutable data structures was to find a way to get the cost down by extending some existing research by Phil Bagwell.
“I then set out to find a treelike implementation for hash maps which would be amenable to the path-copying with structural sharing approach for persistence. I found what I wanted in hash array mapped tries (HAMTs) [Bagwell 2001]. I built (in Java) a persistent implementation of HAMTs with branching factor of 32, using Java’s fast System.arrayCopy during path copying. The node arrays are freshly allocated and imperatively manipulated during node construction, and never mutated afterwards. Thus the implementation is not purely functional but the resulting data structures are immutable after construction. I designed and built persistent vectors on similar 32-way branching trees, with the path copying strategy. Performance was excellent, more akin to O(1) than the theoretical bounds of O(logN). This was the breakthrough moment for Clojure. Only after this did I feel like Clojure could be practical, and I moved forward with enthusiasm to release it later that year (2007).”
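The path-copying idea in that quote is easy to see in miniature. Below is a hedged toy model in Python: branching factor 4 instead of 32, a fixed-depth complete tree, and plain lists as nodes. It is an illustration of the technique, not Clojure's actual Java implementation.

```python
# Toy model of path copying with structural sharing, in the spirit of the
# persistent-vector/HAMT description above. BRANCH is 4 here for
# readability; Clojure uses 32 so real trees stay very shallow.

BRANCH = 4

def build(items, depth):
    """Build a complete tree: a depth-0 node is a leaf holding BRANCH
    items; a depth-d node holds BRANCH subtrees of depth d-1."""
    if depth == 0:
        return list(items)
    step = BRANCH ** depth  # items per child subtree
    return [build(items[i:i + step], depth - 1)
            for i in range(0, len(items), step)]

def assoc(node, depth, index, value):
    """Return a new tree equal to `node` except items[index] = value.
    Only the nodes on the root-to-leaf path are copied (cf. the
    System.arraycopy path copying in the quote); every other subtree
    is shared between the old and new versions."""
    copy = list(node)            # shallow-copy just this one node
    if depth == 0:
        copy[index] = value
    else:
        step = BRANCH ** depth
        child = index // step
        copy[child] = assoc(node[child], depth - 1, index % step, value)
    return copy

v1 = build(list(range(16)), 1)   # 16 items in a depth-1 tree
v2 = assoc(v1, 1, 5, 99)         # "update" index 5 persistently

assert v1[1][1] == 5             # the original is untouched
assert v2[1][1] == 99            # the new version sees the update
assert v1[0] is v2[0]            # untouched subtrees are shared...
assert v1[1] is not v2[1]        # ...only the touched path was copied
```

Each update copies only depth + 1 small nodes, which is why the costs work out to the "more akin to O(1)" behavior Rich describes once the branching factor is 32.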
Or you tell me what I’m missing. Big fan of your work in core.async if this is the same halgari.
Yeah, he took it from Bagwell and adapted it, but in general there was a whole discussion back in the day (~2012) questioning how creating this much garbage, through boxing and throwaway collections, could ever be fast. Datomic is another example: making an immutable DB is a dumb idea, right? Well, what if storage were super cheap, almost free? Then maybe it's not such a bad idea.
So a lot of the Clojure community is built on this idea of taking ideas from way back in the '70s and asking, "Well, everything has changed; what works now that didn't then?"
That’s super interesting and makes sense - even with persistence of trunks and branches there will be leaves to throw away / GC. Thanks for explaining!
> People would ask, "Isn't immutable data expensive to allocate and reclaim?" His reply was always that the JVM's GC was good enough that the benefits of immutable data outweighed the marginal performance penalty of the extra garbage collected. What changed since the old Lisp days? Well, we now have GCs that are super fast and JITs that can optimize dynamic code well.
It's a little odd to see this deeply mistaken belief, dating back to the early Java days, still being advocated today. GCs are tightly constrained by the tradeoffs they make; there's no free lunch.
Much in the way an F1 car is only fast on a race track specifically made for it, the only reason a massive rate of allocations can have a merely marginal performance penalty is if the GCs in question have been specifically designed to handle it. But in doing so, they must have made sacrifices elsewhere, e.g. to memory usage.
Code that isn't written with the underlying machine that will ultimately execute it in mind will never be fast, no matter how much we jiggle the tradeoffs around. So while the gains of immutable structures may still outweigh the performance loss, that loss cannot possibly be characterized as marginal.
Oh for sure, it's a bit wonky on the JVM due to the lack of tail calls, but that sort of thing can be done via full-stack bytecode transformation.
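For readers unfamiliar with the workaround: a bytecode transform can rewrite calls so the stack never grows, and the same idea can be sketched at the library level with a trampoline. This toy Python version mirrors what Clojure's `trampoline` does; it trades each tail call for a heap-allocated thunk.

```python
# Trampolining: instead of making the tail call directly, the function
# returns a zero-argument thunk, and a driver loop keeps invoking thunks
# until a non-callable value comes back. Stack depth stays constant no
# matter how many "recursive" steps are taken.

def trampoline(f, *args):
    result = f(*args)
    while callable(result):
        result = result()
    return result

def countdown(n):
    if n == 0:
        return "done"
    return lambda: countdown(n - 1)  # the tail call, deferred as a thunk

# 100,000 steps, far past Python's default recursion limit (~1000):
assert trampoline(countdown, 100_000) == "done"
```

The bytecode-transformation approach does this rewriting mechanically at the instruction level instead of asking the programmer to return thunks by hand, which is why it can cover a whole codebase.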
But these people are doing this in Clojure, which is quite removed from JVM bytecode, and talking about how it solves so many distributed-systems problems, which I just don't see happening.
If you look at the redplanetlabs GitHub repos, there's a ton of low-level manipulation of "JVM bytecode assembly language" (e.g. https://github.com/redplanetlabs/defexception/blob/master/sr...) in projects that aren't even compilers; one would assume their "compiler" does this even more so.
The thing I want to know is how this company is funded. What puts food on the table? Is it just living off personal wealth? Or is it investor funded, in which case, how come the investors aren't clamoring for a return on their investment?
We're funded by top investors including Initialized, Naval Ravikant, and Max Levchin. They've seen firsthand what we're building and understand both how hard it is and how valuable it will be.
The founder has released successful large projects in the past, e.g. Apache Storm used by Alibaba, Yahoo, Twitter, etc. Probably just need to wait a bit longer.