Lisp just isn't that much better than our current languages. Sorry.
Sure, when Lisp came out it was "advanced". Garbage collection, AST parsing and manipulation, macros, packages, a standard library, modules, homoiconic data structures, etc.
However, the good features of Lisp are now normal features of any modern language. And some of those features have gone into the dustbin for good reasons. And some of them (static vs dynamic typing) are personal taste.
I remember using Lisp in 1984. It was completely eye-opening relative to the BASIC, assembly, and C I was using up to that point. However, it didn't run well on the small machines we plebeians were using back then. So, we plebeians moved back to assembly, C, and possibly Pascal.
Lisp STILL has the same problem on small processors. The embedded world is in desperate need of a good language for rapid development, and yet Rust seems to be the only new contender? No offense to Rust, but is that the best we can do?
Where is a good dynamic language that is always online and runs in <8K of flash/RAM?
> However, the good features of Lisp are now normal features of any modern language. And some of those features have gone into the dustbin for good reasons. And some of them (static vs dynamic typing) are personal taste.
I definitely agree that with each passing year, languages that are not recognizable as being related to lisp have more of the features that made lisp special in the past.
Features I regularly use that are good, but aren't "normal features of any modern language" (some are implementation details, but common to most CL implementations):
1 CLOS
2 Homoiconicity
3 A good interactive debugger with incremental compilation
4 Generalized references
5 An equivalent to reader macros
In addition, while some form of AST parsing and manipulation may exist in many modern languages, it is typically much less accessible than in LISP. I think this is largely because of the lack of homoiconicity, but also partly a culture issue.
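For contrast, here's what a small AST rewrite looks like in Python, one of the more accessible non-lisp takes on this (a sketch of my own, not anything from the comments above): it works, but it takes a visitor class and an explicit compile step rather than just quoting a form.

```python
import ast

# Rewrite every integer literal in a source string to double its value,
# then compile and run the transformed tree.
class DoubleInts(ast.NodeTransformer):
    def visit_Constant(self, node):
        if isinstance(node.value, int):
            return ast.Constant(value=node.value * 2)
        return node

tree = ast.parse("result = 2 + 3")
tree = ast.fix_missing_locations(DoubleInts().visit(tree))

namespace = {}
exec(compile(tree, "<ast>", "exec"), namespace)
print(namespace["result"])  # (2*2) + (3*2) = 10
```

Compare that ceremony with backquote/unquote on a Lisp form, where the program text already is the data structure you manipulate.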
There are probably other features that I don't use that others make good use of as well.
As a last comment, it baffles me that #3 is something that is extraordinarily rare to find outside of the lisp, smalltalk and forth families, but is regularly mentioned as a huge productivity boost by programmers that use it.
There's actually no reason I can think of that non-pure statically typed languages couldn't have such tooling either, with the restriction that global variables (including functions) never change type.
I think VM image saving/loading is still debatable as to whether or not it's good, assuming that it's the primary way of generating a binary image, as it is in most lisps.
A lot more work has to go into guaranteeing reproducible builds, and it tends to be a fairly non-compact representation of your executable. If CL were a little bit less customizable at load-time, that would allow for much faster FASL loading, which in turn would make the reliance on saved-images less important.
Obviously there are uses for that feature, and having a feature doesn't by itself make a language worse, but when that feature is a square-peg, but still the best fit for a particular round-hole, it causes problems.
I found it pretty easy to generate images in a reproducible way: just load your packages, run whatever setup you want, then save the image. Compactness hasn't been a problem for me but I suppose it could be. Admittedly I don't have much real experience with deploying Lisp applications in any serious way; I'm curious about what problems people run into.
An interesting case for #3 is that something similar exists in the Ruby world (https://www.youtube.com/watch?v=4hfMUP5iTq8), but I don't think it's gaining wide adoption; there are certainly people using it, but they aren't the majority.
A more extreme case is the Lua/OpenResty world: they used to have an interactive debugger, but since so few people were using it, the debugger was simply abandoned. Everyone is just accustomed to print-based debugging.
BTW: I actually agree that #3 can be quite useful, I'm using pry quite extensively when I'm working on Ruby projects. But I guess different communities might just have different preferences.
I find with good unit tests a debugger is just not necessary. I haven't used a debugger in anger for many years now. In fact, if I run into a problem where I feel that I need a debugger, I refactor the code so that I don't need a debugger any more. In doing that, I usually find my problem ;-)
But you are right. Different strokes for different folks.
Honestly I think unit tests and interactive debuggers serve different purposes: unit tests guard code that has already been written, while a debugger is more useful for tinkering with new libraries before writing anything production-ready.
This is especially true for Ruby (and maybe for Clojure?), since in many cases the documentation for libraries is not so great, so you really have to experiment with the library a bit to get the results you want.
I know a lot of people in Lisp and Smalltalk communities that use unit tests still use the interactive debugging environment. In fact, many of them write their unit tests in the REPL, and then paste them into their test-code. There are even a couple of unit testing libraries for CL that are specifically designed for doing this.
[edit]
I also recommend that you watch the video xuejie posted (https://www.youtube.com/watch?v=4hfMUP5iTq8), as it's about using a debugger to write code more than using a debugger to fix code.
The basic idea is to have a uniform way of setting values to variables or calling a setter method. That's very helpful for writing macros that need to both read and set a value.
For example, the `+=`-operator found in many languages is implemented as a macro in CL (INCF). It needs to read the value of a place, increment it and then set the new value to the same place.
Doing that with a variable is easy. Using non-lisp syntax
x += 5
expands to
x = x + 5
Doing that with object slots is easy too, if you don't mind accessing the slot directly. However, if you want to have getter and setter methods (which may not even correspond directly to a slot in the object) the expansion would look different.
o.setFoo(o.getFoo() + 5)
or if the getter and setter can have the same name
o.foo(o.foo() + 5)
Since the expansion is different, the macro would have to figure out whether it's dealing with a variable or an accessor method and behave differently. With generalized references you can use the accessor method as if it was a variable
o.foo() // calls the getter
o.foo() = 5 // calls the setter
Now you can easily expand
o.foo() += 5
to
o.foo() = o.foo() + 5
just the same as you would with a variable. Behind the scenes you still have separate getter and setter methods, but the macro doesn't need to worry about that.
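Python's properties give a shallow analogue of this: attribute syntax hides a getter/setter pair, so augmented assignment on an accessor expands the same way it does on a variable. A minimal sketch (my own illustration; CL's SETF expanders are considerably more general):

```python
class Counter:
    """A getter/setter pair hidden behind an attribute-like interface."""
    def __init__(self):
        self._foo = 0

    @property
    def foo(self):          # the "getter"
        return self._foo

    @foo.setter
    def foo(self, value):   # the "setter"
        self._foo = value

o = Counter()
o.foo += 5    # expands to o.foo = o.foo + 5: getter, add, setter
print(o.foo)  # 5
```

The difference is that in CL this expansion is open to user-defined macros like INCF, whereas in Python only the built-in assignment operators get it.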
I feel like functional lenses have vastly improved upon this idea now by (a) being first class, (b) being completely composable, and (c) being general enough to include notions of access into sum types and having multiple targets.
> Where is a good dynamic language that is always online and runs in <8K of flash/RAM?
Early in my career I wrote code for embedded systems (small autonomous mobile robots) that ran on 8-bit processors with memories around this size. The coding was done in Lisp-like DSLs with compilers written in Lisp. Because the DSLs were Lisp-like, there was no need to write parsers. READ was the parser. And from there writing the compiler itself was pretty easy.
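To show how little work READ leaves for a DSL author, here's a toy S-expression reader in Python (my own sketch, not the actual robot code described above): once the source text is nested lists, the "compiler" just walks plain data.

```python
def tokenize(text):
    """Split S-expression text into '(' ')' and atom tokens."""
    return text.replace("(", " ( ").replace(")", " ) ").split()

def read(tokens):
    """Build nested lists from a token stream: the whole 'parser'."""
    token = tokens.pop(0)
    if token == "(":
        form = []
        while tokens[0] != ")":
            form.append(read(tokens))
        tokens.pop(0)  # drop the closing ')'
        return form
    try:
        return int(token)
    except ValueError:
        return token  # a symbol, kept as a string

print(read(tokenize("(blink led1 (times 3))")))
# ['blink', 'led1', ['times', 3]]
```

In Lisp even this much is unnecessary, since READ itself produces the nested lists directly.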
BTW, the Lisp I was using (Macintosh Common Lisp) ran on a Macintosh II with 8 MB of RAM. (This was the late 80's, early 90's.) That was a huge amount of memory back then. Today, not so much. Before that, I ran Coral Common Lisp on a Mac Plus with 1 MB of RAM. I recently ported TinyScheme to run on an STM32. So yes, you can use Lisp in embedded systems.
Sorry, I can't agree with you on this one. Lisp has multiple dispatch; how many languages do you know that have multiple dispatch? Many Lisp implementations compile to machine code that is very efficient for a dynamically typed language, e.g. SBCL is way faster than other equally dynamic languages like Python. Common Lisp is ANSI standardized; how many languages you use are ANSI standardized? Lisps have full garbage collection that deals correctly with cyclic structures, while many other languages have only reference counting or some RAII management and falsely advertise those as advantages. How many other languages have metaobject protocols and can be extended arbitrarily? And since you mention it (I don't think this really deserves mentioning, though), how many other modern languages are homoiconic?
The list could go on and on. I'm not a big Lisp fan, but you have to give credit where it's due. Most newer languages do not have even half of the features that modern Common Lisp implementations offer. You're right about memory consumption, of course, but using Lisp for embedded devices has always been a stupid idea.
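For readers who haven't met multiple dispatch: the method is selected on the runtime types of all arguments, not just the receiver. A hand-rolled Python sketch of the idea (CLOS does this natively, including inheritance-aware method selection, which this toy exact-type table lookup skips):

```python
# Dispatch table keyed on the types of *both* arguments.
_methods = {}

def defmethod(*types):
    """Register a function under the given argument-type signature."""
    def register(fn):
        _methods[types] = fn
        return fn
    return register

def collide(a, b):
    """Pick the implementation by the types of both a and b."""
    return _methods[(type(a), type(b))](a, b)

class Asteroid: pass
class Ship: pass

@defmethod(Asteroid, Asteroid)
def _(a, b): return "rocks merge"

@defmethod(Ship, Asteroid)
def _(a, b): return "ship takes damage"

print(collide(Ship(), Asteroid()))  # ship takes damage
```

In a single-dispatch language you'd need the visitor pattern or explicit type checks to get the same effect.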
The problem of Lisp is certainly not expressivity, and also not speed, memory consumption, or the syntax. It's not even the dynamic typing, as long as you use a good unit testing framework. The real problem is that every program becomes a DSL and even seasoned Lisp hackers cannot read other programmers' code. Large Lisp programs too easily become a big hacky mess that nobody but the original inventor understands. And the community aggravates this problem, because they really are nasty, bearded hackers who'd just write their own OOP implementation over the weekend when they aren't in the mood for using one of a hundred already existing ones.
Lisp code is just not maintainable enough. My 2 cents.
Is this based on real world examples (like, Lisp projects that you know have failed because the code has turned into an unmaintainable mess), or just using your intuition to extrapolate along the lines of "uh, this language is very flexible, most people are idiots and use excessive flexibility to hang themselves ergo this language would lead to unmaintainable code"?
...because by the same reasoning you could reach the conclusion that all dynamic languages lead to unmaintainable code, therefore nobody should use them :)
Take some Python lib that makes heavy use of reflection and peek at its source code.
Those features do make code harder to maintain. But with Lisp you not only have more powerful metaprogramming features, you also have a community that embraces and recommends their use, instead of weighing their usefulness against their maintenance cost.
The unfathomable DSL is a problem with undocumented code in very important places, not with LISP. LISP is merely powerful enough that people can have this problem.
You can force the "bearded hackers" to write documentation and not make a mess in any language.
> Sure, when Lisp came out it was "advanced". Garbage collection, AST parsing and manipulation, macros, packages, a standard library, modules, homoiconic data structures, etc.
The big advantage of Lisp is that it has homoiconic code, that it's easy to work with that code, and that it's easy to extend that code.
Lisp doesn't actually have a very good module system: but ASDF extends it so that it does. Lisp didn't actually have any OO features, but CLOS extended it so that it does.
That's the advantage of Lisp (and Lisp-like languages) which I've never encountered elsewhere: you can grow and expand the language. You don't have to wait for some committee or gatekeeper: you can do it, right now. Your way may not be the best, but you can do it.
LFE, Lisp Flavored Erlang, is much closer to Lisp, and not just in name. And, LFE is compatible with both Elixir and Erlang. [1]
To quote, Robert Virding, one of Erlang's creators and the creator of LFE:
I would say that one restriction with the type of macros that elixir has, irrespective of whether we call it homoiconic or not, is that they can only manipulate existing forms in the AST, they can't define new syntactic forms. In elixir this means that you basically work with function calls. There is syntactic support for making the function calls look less like function calls but the macros you define are basically function calls.
In Lisp you are free to create completely new syntactic forms. Whether this is a feature of the homoiconicity of Lisp or of Lisp itself is another question as the Lisp syntax is very simple and everything basically has the same structure anyway. Some people say Lisp has no syntax. [2]
Granted. I guess it boils down to degree of 'Lisp-like' when talking macros. Another article on HN today about Racket, Template Haskell and macros. I think the whole idea is that Haskell doesn't really need them, since it has other ways of achieving similar ends thereby making TH macros a bad fit for Haskell.
I like Elixir a lot, but I have the luxury of not depending on coding for a living, so I am learning LFE because I like how it is as much a Lisp as it can be when constrained by the BEAM. Truthfully, I could stay away from all of the distributed languages like Erlang, Elixir, LFE, Pony and others, since I don't really have a use for them (yet - looking at ABM Agent-Based Modeling).
But I keep at LFE by trying to duplicate the book 'The Handbook of Neuroevolution Through Erlang'. There are at least a couple of other people who have started it, one in LFE and another in Elixir. A good fit for ANNs.
I agree, it's essentially an inline assembler, but the fact that it integrates so well with SBCL tells us that the SBCL team really did an awesome job here.
Yes, I've used it, and it's quite nice. But it's not quite the same. In Elixir (and of course Erlang), pattern matching is pervasive. You use it when defining functions by defining multiple heads, each with a different pattern. You can also use it when binding variables. For example:
> Where is a good dynamic language that is always online and runs in <8K of flash/RAM?
Forth immediately comes to mind with those constraints. 8K is an order of magnitude less storage than what the IBM 704 that lisp was first implemented on around 1960 had.
Forth is very cool and I agree with what you said, but it's arguable how useful it can be, although OP didn't specify that he cared about that (he only hinted at what would be a better replacement). I think there are a lot of things you'd have to implement from scratch, or attempt to find a collection of words for, which might not work on your Forth. So if you're Chuck Moore then you're good. If not, I'm not sure the extreme advantages (tiny size, REPL, ~speed, high ceiling/low floor) make up for the fact that you're essentially working in a vacuum.
You can squeeze a FORTH kernel into 2K, so 8K really is quite a luxurious size. I actually wrote FORTH code very early in my career and wrote lots of things you might not imagine doing. For example I wrote a 3D star field animation system for a planetarium in FORTH in the late 80's.
I think programmers (especially these days) overestimate how much advantage they get from reusing other packages. Because it becomes so much more difficult to refactor your code when you can't change some interfaces, you may end up with more (or at least more complicated) code than if you wrote something that is tailor made. In languages where you have a lot of facilities available, it is one of the hardest decisions to get right. When you are forced to write everything yourself, then it's easier to get the choice right ;-)
At work we use Clojure, and I find it much better than Ruby, but not because it has more features. Actually because it has fewer features, namely fewer "magic" features which were meant as a convenience, but really end up just making almost all Ruby code very convoluted, e.g. having tons of unnecessary implicit indirection. That plus immutability makes Clojure code much easier to keep clean over a long period of time. And being able to edit s-expressions is actually easier and quicker for me than editing line-based code now that my muscles have memorized paredit.
This is the biggest appeal for me as well. I find that Clojure hits the sweet spot between being simple and flexible.
It has small and consistent syntax coupled with pervasive immutability. I find that when I work with Clojure, I'm rarely thinking about the language itself.
Since the core language is very small, it's un-opinionated. This makes it easy to extend it in the way you need for your particular domain.
Many languages today have the same features and allow you to do everything Clojure does and more, however most of these languages are also far more complex.
I've come to realize that language complexity has a huge impact on productivity and code quality. It's incidental mental overhead that distracts you from the problem you're solving. The more syntax and rules you have the harder it becomes to tell whether the code is doing what you think it's doing.
We also use Clojure at work alongside Ruby, and I'm continually pleasantly surprised to find the benefits of using an immutable-by-default language. I like the functional programming aspects of programming in the large and small, but those advantages pale in comparison to the idea that data should be immutable except when absolutely necessary.
I dabble in Elixir in my spare time, and that language appears to yield similar benefits. Again, immutability is the key, not necessarily the functional focus. While immutability is more difficult to implement in OO languages, I would be curious to see what that looks like.
This is more related to a mindset than a list of features. By the way, "modern" and "new" have different meanings[0]. Even if a good deal of features from Lisp are incorporated in newer languages, what makes Lisp interesting is how everything is designed to bring a cohesive dynamic environment. Trace, debug, macroexpand, change the readtable, change the pretty-printer, change evaluation rules, types, conditions and restarts, special variables, around/before/after CLOS methods, the MOP, etc. all those little things contribute in making programs that can evolve, not in a hacky way but as part of the language philosophy.
I also like the idea that programs should strive to be correct first and fast second, while effectively providing ways to profile and optimize code, tune or deactivate the GC, etc.
Red is a new "full stack" language that is pretty nifty. Think of it as imperative, but still homoiconic like lisp (or Rebol). It comes with a complete binary interpreter that is less than a MB. It makes heavy use of DSLs. In fact, the language is written in a DSL called Red/System, which is a native AOT-compiled language similar to C, but with Red syntax. Red programs can use pre-compiled binaries from Red/System, use the interpreter for parts of the program without types, and a JIT for the parts where you declare types. All in all it is pretty awesome that it has a cross-compiler targeting Mac, Linux, Windows, Android, FreeBSD, and several other platforms.
Edit: It is still in a "beta"-ish phase, but the team is making rapid progress. Although I don't think the systems language itself will replace C, you get one language that can in theory do everything well (web pages, to AI, to embedded robots, to parsing; see the Parse dialect, which is like regular expressions, but readable).
It was never about the features -- it's about their integration. The fact that you can get 20 ad-hoc features piled on a language doesn't mean they come as naturally as they do in LISP.
And "AST parsing and manipulation", "macros" and "homoiconicity" are very far from being "normal features of any modern language", except if by modern you mean "any language that has them". Most languages currently in vogue (by which I mean extensive commercial/IT use and made in the past 10-20 years, not COBOL or C++) don't have those.
>Lisp STILL has the same problem on small processors.
So? We have languages at all levels of the CPU-power spectrum, plus we use Javascript and Java and Python on small processors all the time, and Lisp is either very near (e.g. to Java) in speed or flies all over them.
Except if you mean really restricted embedded processors, but then again, nobody said LISP is the best option for them.
> Lisp just isn't that much better than our current languages. Sorry.
No need for apologies -- it weakens your argument and makes you sound unsure.
Lisp is that much better than most languages. The isomorphic syntax used to represent both data structures and program code is an oft-touted feature that few languages lay claim to (so-called homoiconicity). The reason this concept is wonderful is that it allows us to use syntactic substitution with the same rigour as Leibniz and Tony Hoare. We can take any expression, substitute parts of it with other expressions and give it a symbol. And because the rules of substitution in Lisp are well defined it works all the way down.
Where you run into trouble is with the dynamic environment. CL decided it was enough to give programmers a function to generate unique symbols. Others such as Scheme insisted on a hygienic macro system. The difference is that one requires the programmer to be more careful in order not to restrict the forms one can write.
This is why macros are cool.
Other features I miss (that aren't unique to Lisp):
Feature phones Nokia released, and those still being sold under its brand, are made with lisp. Of course it's not the whole platform, mainly the UI and some glue components, but essentially all the major work was written in lisp with some extensions.
>some of those features have gone into the dustbin for good reasons
I'm curious what features you're thinking of. I know PHP put lexical scope into the dustbin, but languages since then haven't. Looks like the wrong decision in retrospect. Python put multi-statement lambdas in the dustbin but I don't think there's consensus that that was for good reasons.