Hmm... gonna get flamed here no matter what I say, so I'll try to be a bit delicate, but I'm not going to put too much effort into delicacy.
I understand loving CL. CL was, I think, the first programming language I really loved passionately, for itself. I loved programming in C passionately before that, but in retrospect that was more a matter of having gazed from afar at the machine for so long, and suddenly realizing that C would bring me much closer to her than BASIC had. C was just an intermediary, though I did not understand that at the time. We have good relations to this day, but I'm afraid I do not love her, and never really did. I just used her to get close to the machine.
I still love the machine, but, as they say, familiarity breeds contempt, and it eventually became apparent that we would be better off at some remove from each other. No judgment; it's just that I happen to be fairly abstract, and the machine likes to pretend to be concrete (on some level I suppose she is, but I'm pretty sure we never dug that deep in our relationship). That was, perhaps not so coincidentally, right around the time I met CL again.
I'd flirted with CL in University AI classes, even worked with her a bit in a computer vision lab, and I'd always thought she was attractive, if a little off-beat. But then I fell for her, obsessively, and for a while she was all I could think about.
I cherish our time together. I was such a bumpkin when we met, and she challenged me at every turn. "Why," she'd ask, "must a byte have eight bits?" And I'd be forced to admit that she was right: a truly inclusive computing culture ought to accept bytes of all different sizes. I won't get into our discussions of filesystems here.
And I was introduced to her friends. What a lively bunch they were, and what a smart bunch. I remember I once asked her friend Erik for the time and... well, actually he suggested that I might want to start carrying a watch or just go home and kill myself, but then he explained some things about time that I carry with me to this day. He was the most provocative of CL's friends, but many of her other friends, brilliant as they are, seem to be monotonically increasing in age and quietly disappearing.
The great thing about CL is that it has a standard that is hard to change. The worst thing about CL is that it has a standard that is hard to change. For many years the former overcame the latter. I'm inclined to think that that is no longer the case.
CL allows us to write code the way we think about a problem--and then bring life to that way of framing the problem. We can come up with an ideal way of describing a solution and then make a language work that way. I say "a language" because the target of our code might be C or JavaScript (these days that is more often the case for me than targeting CL itself; cf. 4500 recent lines of Lisp that turn into 8000 lines of terse JavaScript).
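That Lisp-to-JavaScript arrangement can be sketched in miniature. This is my own toy translator for illustration only, assuming nothing about the commenter's actual 4500-line system: a few lines of Lisp that walk an s-expression and emit JavaScript source text.

```lisp
;; Toy translator from a tiny arithmetic s-expression language to
;; JavaScript source text. Entirely hypothetical; not the system
;; mentioned above.
(defun sexp->js (form)
  "Translate FORM -- a number, a symbol, or an (op arg...) list --
into a string of JavaScript, rendering operators as infix."
  (cond ((numberp form) (princ-to-string form))
        ((symbolp form) (string-downcase (symbol-name form)))
        (t (let ((op (string (first form)))
                 (args (mapcar #'sexp->js (rest form))))
             ;; Join the translated arguments with the operator
             ;; in infix position, fully parenthesized.
             (format nil "(~a)"
                     (reduce (lambda (a b)
                               (concatenate 'string a " " op " " b))
                             args))))))

;; (sexp->js '(+ 1 (* 2 x)))  ;; => "(1 + (2 * x))"
```

A real version would of course need special forms, statements, and name mangling, but the shape is the same: code as data in, text in another language out.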
Our ability to reason correctly about systems is limited by how complex they are. And I posit that complexity in code is best measured by the number of symbols (because lines can be arbitrarily long, and longer symbol names can actually be helpful). So a system that reduces the number of symbols necessary to express a solution increases the size of the solution about which we can successfully reason. Just as computers are a "bicycle for the mind", homoiconicity plus macros (of which I posit CL is still the best practical implementation) is a "bicycle for the programmer's mind".
Lisp provides an optimal solution for thinking of programs as nested sequences of arbitrary symbols. Sequences that can be transformed (and must be, for a computer to evaluate them, unless we hand-write machine code!). Common Lisp provides an optimal set of built-in operators for composing operations and transformations on its fundamental data types (atoms/cells/lists/trees). Other languages might provide better implementations of particular paradigms or whatever, but CL is the best language for implementing macros. Other Lisps make macros "safer" and miss the point.
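To make the "sequences that can be transformed" point concrete, here is a minimal macro sketch (`with-timing` is my own illustration, not a standard operator): the macro body is ordinary list-building code that runs before evaluation, turning one nested sequence of symbols into another.

```lisp
;; A minimal macro sketch: backquote builds the replacement list, and
;; GENSYM manufactures a fresh symbol so the expansion cannot capture
;; any name used inside BODY.
(defmacro with-timing (&body body)
  "Evaluate BODY, print the elapsed internal time units, and
return BODY's value."
  (let ((start (gensym "START")))
    `(let ((,start (get-internal-real-time)))
       (prog1 (progn ,@body)
         (format t "~&elapsed: ~a units~%"
                 (- (get-internal-real-time) ,start))))))

;; (with-timing (+ 1 2)) prints an elapsed-time line and returns 3.
```

Nothing here is special-purpose machinery: the transformation is written with the same list operators used on any other data, which is the homoiconicity point above.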
As Vladimir Sedach wrote earlier on Hacker News[1]:
"The entire point of programming is automation. The question that immediately comes to mind after you learn this fact is - why not program a computer to program itself? Macros are a simple mechanism for generating code, in other words, automating programming. Unless your system includes a better mechanism for automating programming (so far, I have not seen any such mechanisms), _not_ having macros means that you basically don't understand _why_ you are writing code.
This is why it is not surprising that most software sucks - a lot of programmers only have a very shallow understanding of why they are programming. Even many hackers just hack because it's fun. So is masturbation.
This is also the reason why functional programming languages ignore macros. The people behind them are not interested in programming automation. Milner created ML to help automate proofs. The Haskell gang is primarily interested in advancing applied type theory.
Which brings me to my last point: as you probably know, the reputation of the functional programming people as intelligent is not baseless. You don't need macros if you know what you are doing (your domain), and your system is already targeted at your domain. Adding macros to ML will have no impact on its usefulness for building theorem provers. You can't make APL or Matlab better languages for working with arrays by adding macros. But as soon as you need to express new domain concepts in a language that does not natively support them, macros become essential to maintaining good, concise code. This IMO is the largest missing piece in most projects based around domain-driven design."
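As a concrete (and entirely hypothetical, my own invention) instance of that last paragraph: suppose the domain deals in physical lengths and the language has no notion of units. A macro can fold the unit bookkeeping away at expansion time, so the domain concept reads naturally in source but costs nothing at runtime.

```lisp
;; Hypothetical illustration of lifting a domain concept (units of
;; length) into the language. The rewrite happens at macroexpansion
;; time; the compiled code is plain arithmetic.
(defparameter *unit-scale*
  '((m . 1) (cm . 1/100) (mm . 1/1000))
  "Scale factors from each unit symbol to meters.")

(defmacro in-meters (form)
  "Rewrite unit forms like (cm 50) inside FORM into plain arithmetic
expressed in meters."
  (labels ((walk (f)
             (cond ((atom f) f)
                   ((assoc (first f) *unit-scale*)
                    `(* ,(cdr (assoc (first f) *unit-scale*))
                        ,(walk (second f))))
                   (t (cons (first f) (mapcar #'walk (rest f)))))))
    (walk form)))

;; (in-meters (+ (m 3) (cm 50)))  ;; => 7/2
```

The point is not units as such but the pattern: a concept the language does not natively support becomes a first-class notation via a twenty-line macro.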
Unfortunately, the people who treat programming only as a way to pay the bills - with a six-figure salary - which would be the vast majority of professional programmers working today, do not want to understand _why_ they are writing code. The commoditization of software engineering that companies like Google [1] enthusiastically support and promote is also directly responsible for the obliteration of the entire field.
In a world where geniuses like Alan Kay are almost unheard of and Tim Berners-Lee ends up receiving the Turing award (next: Stroustrup/Rob Pike), there really isn't a lot of hope for languages like Common Lisp to proliferate. They're simply too meta, and they require a lot more from you than just settling for whatever makes you feel good about yourself in an immediate-rewards sense.
I think you're massively overstating your case when you say "optimal." There is no evidence that anything CL provides is optimal according to any rigorous metric. I could very easily claim Scheme is "more optimal", since it's a much more minimal implementation with the same capabilities. I also think you are mistaken to equate fewer symbols with easier reasoning; e.g. I find it much easier to reason about complex programs when they are written in a strong static type system, which CL lacks. The type system lets me leverage the computer to help me reason about the program, by type checking.
Then the computer is doing the reasoning, not (just) you. Static typing will certainly help you reason about a larger system, I do not dispute that. I am talking about increasing the capability of the largest symbol system you operate on in your mind without relying on the computer. If the meaning-density of the symbols is higher, your effective intelligence goes up.
Some people are just very much drawn to ML and/or Lisp. For some people it simply maps very nicely onto how they imagine a solution to a problem.
I don't think it happens much more than in other languages; it's just that there is a culture within those communities that promotes "hey wow, I just did something cool. Look".
As a lisp guy I stare in envy at the cool things people do in ML.
So, pretty much, why CL?