Tangential question, but what Lisp is the most promising one to invest time on right now?
Common Lisp still has some activity, mainly coming from SBCL, but is a bit stagnant. It has a fantastic literature and many mature implementations.
Scheme is great, but a bit too fragmented. Racket seems to be gaining some momentum. The merge with Chez may be the tipping point to attract a critical mass of developers.
Clojure has many interesting modern ideas, but I feel being so tied to Java and the JVM has hurt a bit in the long run.
I miss a bit of innovation in the typed Lisp area, e.g. Qi or Shen, which never took off. Carp [1] looks nice.
We use Common Lisp to build compilers and simulators for quantum computers at Rigetti [0,1].
I wouldn’t consider the development stagnant. SBCL makes regular releases, including huge improvements (e.g., a port to RISC-V) and bug fixes. It’s easy to find commercial support for Common Lisp as well.
I love Scheme and Common Lisp both, but CL is quite a workhorse when it comes to difficult, professional work on teams.
Idiomatic Clojure is also really slow compared to SBCL-compiled Common Lisp or one of the faster Scheme implementations. The compiler's fairly basic, and it relies on the JVM for optimisations; the JVM isn't great at the kind of optimisations a compiler aware of higher-level semantics can do. It is possible to achieve decent performance by using Clojure as basically a Java wrapper, but that's much uglier than idiomatic code, whereas in e.g. SBCL idiomatic Common Lisp can still be fairly performant.
Do you happen to know what the present state of SBCL optimization is? The old CMUCL manual had quite a lot of guidance on writing high performance code. I can't seem to find it though, mostly I remember the bits about what can and can't be open coded and how to give enough basic type hints to help the optimizer out (and using DISASSEMBLE to check that you really did). In fact I think you could even ask for warnings for probably inefficient constructs.
Obviously CMUCL was a mess in a lot of other ways though.
One great thing about sbcl is you can put (declare (optimize (speed 3))) at the top of a function and the compiler will give useful feedback when it can’t optimize something and suggest type hints to improve performance.
I’m not an optimization wizard, but I’ve been able to get some very impressive speed-ups this way, in combination with some attention to how my code is written.
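A minimal sketch of the pattern described above (the function and its types are illustrative, not from any particular codebase):

```lisp
;; With SPEED 3, SBCL emits compiler notes wherever it cannot
;; open-code an operation, often suggesting the type declarations
;; that would let it. With the declarations below, the arithmetic
;; compiles to unboxed double-float operations.
(defun sum-squares (xs)
  (declare (optimize (speed 3))
           (type (simple-array double-float (*)) xs))
  (let ((acc 0d0))
    (declare (type double-float acc))
    (loop for x across xs do (incf acc (* x x)))
    acc))
```

Removing the `type` declarations and recompiling is a quick way to see the kind of feedback the parent comment mentions.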
Tangentially, I think the future of optimization in CL and similar languages is run time optimization/compilation. For example, the goal should be to make (typical cases of) generic function dispatch as fast as inlined structure field access, even in the presence of the possibility of redefinition of the functions and classes. This is not easy, since these may get redefined while calls to functions using them are on the stack.
Since code is data, some kind of functional persistence might be the right path forward here. I lack the expertise to develop the idea further though. I don't much care about the exact behavior vis-a-vis functions on the stack, so long as it's well-defined.
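To make the gap concrete, here is a small sketch (names are made up) of the two access paths being compared:

```lisp
(defstruct circle radius)

;; Structure accessor: with a type declaration, the compiler can
;; inline this down to a raw slot read.
(defun fast-radius (c)
  (declare (type circle c))
  (circle-radius c))

;; Generic function: each call goes through CLOS dispatch, which
;; must stay correct even if a new method or class redefinition
;; arrives at run time -- that is the invariant making it hard to
;; optimize down to the accessor above.
(defgeneric radius-of (shape))
(defmethod radius-of ((c circle))
  (circle-radius c))
```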
I would add cryptic Java stack traces to this list but, as someone who recently overcame the "java is gross" bias, I'd say the benefits of the JVM and its vast ecosystem outweigh the cost.
> ”Clojure has many interesting modern ideas, but I feel being so tied to Java and the JVM has hurt a bit in the long run.”
Would you expand on this? From my perspective, it actually opens up Clojure to a lot of opportunities where introducing, say, Common Lisp or Racket, would be a much harder lift. Clojure’s just a library, and immediately has available the entire Java ecosystem. Granted, it’s based only on my experience, but I’ve much more frequently seen first-class (provided by a vendor or otherwise officially sanctioned) Java libraries for services and other tooling than Common Lisp or Racket. Being able to take something like that off the shelf for integration is a big plus in my experience.
For Common Lisp, there's ABCL ( https://abcl.org ) which is a CL on the JVM and, otherwise, LispWorks bundles a nice Java FFI.
But, additionally, CL makes binding to C libraries really easy via CFFI and things like cl-autowrap ( https://github.com/rpav/cl-autowrap ) so, when there isn't a native lisp library, it's often easy enough to just use a C library in your program (or, I suppose, anything that can compile to something compatible with C like `extern C` in C++ and the equivalent constructs in Rust)
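As a tiny illustration of how low the ceremony is (assuming libc is already loaded, which it normally is in a running Lisp image; the Lisp-side name is made up):

```lisp
;; Bind libc's abs() with CFFI: foreign name, Lisp name,
;; return type, then the argument list.
(cffi:defcfun ("abs" c-abs) :int
  (n :int))

(c-abs -7)  ; => 7
```

For whole headers, cl-autowrap generates bindings like this in bulk from the C declarations.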
It would be nice if ABCL's JDK 11 support were brought into the main repo. Speaking of which, it would be nice if the main repo moved away from SVN to, e.g., gitlab. And, while we're at it, the issue tracker could be brought into the new millennium.
LispWorks and Allegro CL have been commercially available Lisps for 30+ years, running on various platforms and still supported. The LispWorks 7.1 update from 2017 brought native ARM64 support (Linux, iOS), remote debugging of Android and iOS LispWorks, ...
You aren't tied to JVM with Clojure, you can target JS too. Outside browsers people are using ClojureScript for mobile apps, web app backends (eg with serverless), and local scripting.
There's also the CLR version that some people are using with Unity for game dev.
But I think JVM has really good prospects, even if the trend is currently favouring JS. Straddling both JS and JVM platforms is a pretty good position for Clojure.
There are great benefits to being able to leverage the JVM and NPM libraries. A lot of it happens indirectly, many libs are available that provide integrations to JVM/JS libs under the hood but present you with a Clojurey interface.
Julia? It depends on what you're looking for, but IMO apart from the non-lispy surface syntax (which Dylan also had), it's very much a Lisp with a focus on multiple dispatch.
There’s some interesting work going on in the Guile ecosystem: the new JIT compiler is just on the horizon, nicely timed with Guix maturing as a platform. It’s lacking some real basic tooling (including an officially blessed package manager), but that gives its small community a chance to learn from the experience of earlier Lisps (including the first few years of Guile) and build something worthy of being the GNU project’s language of choice.
As someone who's using guile every day, I am very happy about my language of choice. The problem is that I program for fun and rarely have a deadline to meet. Guix is amazing, and I will try to run it on my next laptop
> Common Lisp still has some activity, mainly coming from SBCL, but is a bit stagnant. It has a fantastic literature and many mature implementations.
There are a number of improvements I’d like to see to the core language, but in general I think Common Lisp is more ‘complete’ than it is ‘stagnant.’ I.e., it enables one to write software which is performant, dynamic, able to handle errors as they occur &c.
I recommend it over Scheme in large part because it is so complete, over Racket because it has multiple implementations and over Clojure because I like that it’s multi-paradigm. But tastes differ, and that’s okay.
I like that Common Lisp isn’t afflicted with flavour-of-the-month syndrome. There are libraries for Lisp which have been around, essentially untouched, for years and which still work correctly. Done right, the first time — what a concept!
Odds are high that any will be a fun investment. Common Lisp is interesting for just how much it already has. In particular, I'm finding many of the things that supposedly aren't great about it, to be quite productive if you embrace it. LOOP and FORMAT, in particular.
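For anyone who hasn't seen them, these two show how much LOOP and FORMAT pack into a few characters:

```lisp
;; LOOP as a full iteration mini-language:
(loop for i from 1 to 5 collect (* i i))   ; => (1 4 9 16 25)

;; FORMAT iterating over a list with ~{...~} and a
;; between-items separator via ~^:
(format nil "~{~a~^, ~}" '(1 4 9 16 25))   ; => "1, 4, 9, 16, 25"
```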
I've never used Common Lisp, I know some Clojure, used emacs-lisp, tried Racket, Fennel. I love Lisps. But I always wondered: What happened to Common Lisp? Some of the opinions I heard:
- CL is too big (compared to Racket and Clojure)
- Lisp-1 vs Lisp-2
- recursion discouraged (is that even true?)
- There is a thing called "Lisp Curse"
And then, seemingly, the language's popularity decreased. Let's ignore the fact that I really like Lisps; aside from that, are there any compelling reasons for an average Joe programmer like me to start learning Common Lisp in 2019? Honest question.
There is and has always been a lot of FUD around Lisp, and Common Lisp in particular. You even mentioned a few examples. My explanation on why this is the case is that learning Common Lisp really takes some time and dedication, but our lazy brains try hard to avoid that and find silly excuses instead (Too many parentheses!).
> are there any compelling reasons for an average Joe programmer like me to start learning Common Lisp in 2019?
If you manage to use Common Lisp in a project, you get an enormous boost to productivity. It is the best language for 'getting stuff done' that I know of. Even if you cannot use Common Lisp directly, knowing it will stop you from reinventing a lot of wheels. Greenspun's tenth rule is real.
I just don’t get the hate on large languages. It’s not like someone is going to look down on you just because you’re not using the whole feature set.
Encountering functionality you’re not familiar with is pretty much the norm in all projects, and I’d rather encounter something that is part of the core language (bonus point if it’s a standard) than a self-rolled abstraction or some other library.
It is no coincidence that I always end up working with ugly and so called kitchen sink languages and prefer Perl over Python, C++ over C, Common Lisp over Scheme, Rust over Go and OCaml over SML.
I like the idea of designing and using languages with a small set of orthogonal abstractions, but looking back at the whole Scheme fragmentation, and at my experience with past and current projects written in Forth, C and Go that started fine but turned pretty fast into maintainability hell, I think it’s utopian and futile.
I’m not ashamed to say that I like ugly and messy and that I encourage those in doubt to dive in and enjoy the kludge that Common Lisp is.
CL might have been considered big, decades ago, but it's positively lean and mean today. I can build SBCL from scratch on my desktop in 90 seconds. Clang takes much longer to build.
> Let's ignore the fact that I really like Lisps, aside that, are there any compelling reasons for an average Joe programmer like me to start learning Common Lisp in 2019?
One of the things that stands out for SBCL is its optimizing native code compiler. For a dynamic Lisp-like language implementation it provides outstanding amounts of help: useful error messages, warnings, optimization hints, static checks, etc. One throws large amounts of code at it and it returns with even more messages... ;-)
The language is not big compared to Python or the like.
Technically it is a Lisp-N, not a Lisp-2; this doesn’t matter in a practical sense.
Optimization of tail recursion isn’t in the spec but is done in some compilers. Loops are encouraged, as are mutation and objects. It’s truly multi-paradigm.
It is very powerful, so in some ways it ruins you when you have to go back to writing python code, adhering to ‘PEP 8’ Standards, and people looking over your shoulder at code reviews.
Common Lisp has the potential to be freewheeling and exploratory in a way that other languages still do not.
I think of something, therefore I write it and it exists. There isn’t a lot of hemming and hawing about ceremony and rigor.
I like that feature of Common Lisp... it is my own patch of the Wild West, to some extent.
A lot of people think that if something isn’t difficult and tedious, it isn’t really software engineering.
They may be right, but I prefer whatever this non-engineering discipline is to what I’m doing now.
> - CL is too big (compared to Racket and Clojure)
The old criticisms of Lisp being too big were compared to Scheme & C, and it is definitely much larger than those two are: why, it defines modules & hash tables and has objects! Racket is actually a pretty neat system which is basically Scheme-plus-a-lot; I wouldn’t be surprised if it’s actually larger than CL.
I don’t know how Clojure would compare, but I imagine that it’s comparable.
Note that compared to languages like Go or Python, Lisp is very much not big. It doesn’t have built-in email, or HTTP and that kind of thing (but there are good libraries available for almost everything these days).
> - Lisp-1 vs Lisp-2
Most languages these days have a single namespace. Honestly, though, I think that’s a mistake, along the same lines as most languages these days lacking a symbol type. https://www.dreamsongs.com/Separation.html is a nice overview of the arguments in both directions.
I like being able to easily add my own new namespaces to the language.
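A quick illustration of the Lisp-2 point (the function name is made up): because variables and functions live in separate namespaces, shadowing a standard function name as a variable is harmless.

```lisp
;; LIST the variable and LIST the function coexist without conflict:
(defun frob (list)
  (list (first list) (length list)))

(frob '(a b c))  ; => (A 3)
```

In a Lisp-1 such as Scheme, the parameter would shadow the function and the body would fail.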
> - recursion discouraged (is that even true?)
I don’t believe it is. Recursion is just one more tool in the toolbox. Note that while the standard doesn’t mandate tail call optimisation, many implementations implement it anyway.
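A sketch of what that looks like in practice (SBCL and several other implementations compile the self-call below as a jump under typical optimization settings, so it runs in constant stack space, though the standard doesn't guarantee it):

```lisp
;; Tail-recursive sum of 1..n with an accumulator; the recursive
;; call is in tail position, so a TCO-capable compiler turns it
;; into a loop.
(defun sum-to (n acc)
  (declare (optimize (speed 3) (debug 0)))
  (if (zerop n)
      acc
      (sum-to (1- n) (+ n acc))))

(sum-to 100 0)  ; => 5050
```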
> - There is a thing called "Lisp Curse"
I think that applies to any Lisp-like language, not just Common Lisp. For some reason people’s brains turn off when they see:
    (if (foo x)
        (bar)
        (baz))

instead of:

    if foo(x) {
      bar()
    } else {
      baz()
    }
> are there any compelling reasons for an average Joe programmer like me to start learning Common Lisp in 2019?
I think that it’s still the best language out there for delivering quality, well-performing software which can run in a dynamic environment and handle dynamic situations. I professionally program in Go — and I enjoy it — but I would rather program in a Lisp implementation which adopted goroutines, channels and interfaces[0]. I would rather use something like an eshell which had a few man-years of work on it than zsh, and I’d rather use something like a next-gen Lisp scsh than sh.
There are some great languages out there: ML, Erlang, Lisp. They are each worth learning, worth using.
0: there are threading and channel libraries already, but I honestly don’t know what interfaces would look like in a dynamic language; I do know that I love them in Go.
You pretty much answered your own question. Common Lisp is the big one. Scheme has more of a niche as an extension language, but Racket is a completely decent development environment and language family.
Clojure is a JVM language with some neat persistent data structures. Personally I find it to be one of the least interesting Lisps, but that's like saying it's not my favorite pizza. It's still quite tasty. Furthermore, I can easily get paid to write it, which is a little harder for the others.
It depends heavily on what you want to do.
All of them (SBCL, Racket, and Clojure) work platform-independently, and they have different strengths.
I'd choose it based on type of application.
I enjoyed the slides, and wish the talk video was available.
Now that I am retired, most of my side projects use SBCL, with a little Haskell and other languages. I have great respect for the commercial Franz and LispWorks products but for my projects SBCL works great.
I really enjoyed the history covered in his slides. I have lived through this history since 1982, but unfortunately almost 90% of my paid work has been in other languages. In our present time with great resources like Quicklisp, CL Cookbook, etc. I think that the CL ecosystem and number of deployed projects should be even greater.
General side note: incremental slides should be "normalized" in a print version. Not sure if any software supports this. It would probably require some manual tagging or grouping.
The slides are a bit tricky to read without the rest of the accompanying talk. I couldn’t find a video with the brief search I had.
One thing the talk touches on is that writing a Lisp (cross-)compiler in Lisp is hard. The slides suggest a few reasons and here are some more.
In CL much of the compilation process requires evaluating CL code. It might seem that you are therefore fortunate in using CL as the host: you can just take the code and eval it. But this doesn’t work because (wanting to be portable and supportive of optimising compilers) many objects in CL can’t be inspected. For example, the following is valid code:
(funcall #.(lambda (x) (1+ x)) 5)
In the code above one constructs a closure at read time and puts it into the AST; the compiler must take this call to an opaque host-platform closure (which evaluates to itself) and somehow marshal it into something for the target platform.
So to write a CL cross-compiler one must first implement a CL interpreter and this interpreter must have objects with enough useful state that one can correctly marshal closures from the interpreter into compiled objects in the target system, even when those closures close over shared values.
One way to reduce this pain is to restrict the compiler to only compiling a subset of the language, and then requiring that the compiler is written in that subset. One may only compile more complicated programs when the host and target are the same instance of Lisp.
However the “easily compiled portable subset” is probably too small and you will end up with either a long bootstrapping process of increasingly useful compilers or having to implement a lot of emulation of your target platform. An example in the slides is that the result of (byte ...) is implementation defined so you can’t just use the value from the host implementation. Another example is floats. In CL there are 4 float types which are allowed to be equal to one another in certain ways (typical modern implementations have 2 32-bit and 2 64-bit; others might have some being decimal or packed into 63 bits), so one cannot rely on the host’s floats behaving a certain way, so one instead has to emulate the target platform’s float implementation to get reliably portable results. Then again, maybe it is possible (but perhaps a bit painful) to write the compiler without using any floats.
Another bootstrapping difficulty in CL is its object system, which just makes everything harder, especially if one wants to use objects for lots of the implementation-specific types.
SBCL goes for writing the compiler in a subset of CL and the result is indeed sanely bootstrappable. Other implementations typically require eval plus some runtime written in e.g. C, and a slow process of evaluating the improving compiler on itself to bootstrap the compiler. This is easier at first but can lead to difficulties (some of which are described in the talk).
> the compiler must take this call to an opaque host-platform closure (which evaluates to itself) and somehow marshal it into something for the target platform.
If we're cross-compiling, we are almost certainly doing file compilation whereby we have to externalize the compilation product to be transported to the target where it is loaded.
If we use ANSI Lisp file compilation as our source of requirements, then we only have to handle, as source code literals, objects which are "externalizable". Closures are not.
See 3.2.4 Literal Objects in Compiled Files
Of course, you can adopt externalizable closures as a requirement in your own compiler project, if you like.
Not sure what you intend to mean, but for a CL file compiler, that's not valid code. No CL file compiler needs to be able to externalize a function object.
[1] https://github.com/carp-lang/Carp