
R itself could be considered a test of this hypothesis, too. It’s been said that elegant, powerful Lisp would be more widely adopted if it weren’t for all those gosh-darned parentheses.

Well, at its core R is a Lisp (specifically, a Scheme) but with a more traditional syntax (infix operators, conventional function-call notation, etc.). And it’s fair to say the adoption of R has indeed been more widespread than that of Lisp.
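
For anyone curious what “a Scheme under the hood” looks like in practice, here is a minimal sketch (nothing here is specific to any library, just base R): every R expression parses to a call object, essentially a list whose head is the function, and closures capture their environment lexically, much as in Scheme.

    # Ordinary infix syntax parses to an underlying call ("s-expression")
    e <- quote(x + y * 2)
    class(e)          # "call"
    as.list(e)        # `+`, x, y * 2 -- the operator is the head of the list
    as.list(e[[3]])   # `*`, y, 2

    # Calls can be built and evaluated like Lisp forms
    e2 <- as.call(list(as.name("sum"), 1, 2, 3))
    eval(e2)          # 6

    # Lexically scoped closures with first-class environments, as in Scheme
    make_counter <- function() {
      n <- 0
      function() { n <<- n + 1; n }
    }
    counter <- make_counter()
    counter(); counter()   # 1, then 2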




I’m not totally convinced that being ‘secretly a lisp’ is what was good about R. I think the easy vectorisation is good, and the consequences of the bizarre lazy function-argument evaluation are good. I don’t know of lisps that do the vectorisation stuff so naturally, and while fexprs are a thing, I think they are possibly too general in the syntax they can accept: the simplicity of lisp syntax lets macros take arbitrary tree-structured input in a way you wouldn’t want for a language with non-lisp syntax (where the head lives outside the list), and I think that flexibility makes the syntax more confusingly non-uniform.
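
To make those two points concrete, here is a rough sketch of both in base R: vectorised operations that work over whole vectors with no loop, and the promise-based lazy argument evaluation that lets a function capture the unevaluated expression it was called with (the mechanism behind things like subset() and the tidyverse’s non-standard evaluation).

    # Vectorisation: arithmetic and comparisons are elementwise
    x <- c(1, 2, 3, 4, 5)
    x^2 + 1          # 2  5 10 17 26
    x[x > 2]         # 3 4 5

    # Lazy evaluation: arguments arrive as unevaluated promises,
    # so a function can capture the expression itself with substitute()
    show_expr <- function(arg) substitute(arg)
    show_expr(height / weight^2)   # returns the call, never evaluates it

    # Which is how base functions like subset() let you refer to columns bare
    df <- data.frame(height = c(1.6, 1.8), weight = c(60, 90))
    subset(df, weight > 70)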


A lot of R's popularity and usefulness has to do with the libraries that are available in it. I would put up with almost any amount of BS from base R to use ggplot and the tidyverse, and ditto for a number of modern stats algorithms. In many cases, Python implementations of the same techniques are either woefully outdated or completely nonexistent.


When I’ve seen attempts at ggplot2 or dplyr in other languages, one issue is bad performance or bugs, but it’s also been a problem that R’s language features seem to allow those libraries to be much more ergonomic than their ports can manage. E.g. I found Julia much less nice to use for those sorts of things, despite it seeming like it ought to be well suited (it makes a reasonably good claim to a lot of CL heritage, for example).
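
As a rough illustration of the ergonomics in question (assuming dplyr and ggplot2 are installed and R >= 4.1 for the native pipe; the columns come from the built-in mtcars dataset): bare column names and captured expressions flow straight through the pipeline, whereas ports to languages without that kind of expression capture usually fall back on strings, symbols, or anonymous functions.

    library(dplyr)
    library(ggplot2)

    mtcars |>
      filter(cyl == 6) |>              # bare column names, no quoting or lambdas
      mutate(kpl = 0.425 * mpg) |>     # miles/gallon -> km/litre
      ggplot(aes(x = wt, y = kpl)) +   # aes() captures expressions, not strings
      geom_point() +
      geom_smooth(method = "lm")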


Plotnine is a pretty good implementation of ggplot2; apart from the necessity to quote variable names, it seemed to work almost perfectly.


I'm not sure I would come to this conclusion. R has some adoption, but it's also really not used as a general-purpose programming language, which most Lisp dialects are.


That has more to do with (poor) performance than with syntax. At its core, the C code that implements R is still very badly optimized and not performant at all.


As someone who loved learning lisp and regrets that the long course of my programming career has never led me to use it in a professional capacity, I just don't buy it when people say that parentheses are the reason people didn't adopt lisp more widely. I would say the main reasons are:

1) The language is so frikkin massive. Common lisp is a huge language with hundreds and hundreds of built-in functions, and the standard came very late in its evolution, so there is a bunch of back-compat cruft and junk that everyone has to live with. The object system is a whole epic journey in itself. You could probably kill or at least seriously injure someone with the impact if they were lying down and you dropped a copy of Guy Steele's excellent book[1] on them from a standing height.

2) The ecosystem is so fragmented. First you have Common Lisp, which isn't very common at all. Then you have all the vendor lisps. Then you have whether they do or don't have CLOS to contend with. Elisp is a lisp but is not Common Lisp and differs in some important ways that I don't quite remember. Then there's Scheme, and Guile Scheme (which isn't quite the same), then Clojure, etc.

3) That meant that the tooling was basically all simultaneously amazing and awful. As an example, my uncle wrote a TCP/IP stack in lisp for the Symbolics lisp machine[2] for a project when he worked at Xerox. He told me in the late 80s about features in the Symbolics debugger that just totally blew my mind and are only now becoming available in IDEs for other languages, like being able to step backwards, alter variables, then step forward again, or jump to any stack frame and just resume execution from there. On the other hand, he had to write the TCP/IP stack himself because they didn't have one. I think that perfectly encapsulates the lisp experience for me around 2000, when I last used it: some things worked amazingly and were way better than anything else (e.g. I remember at the time the things you could do with serialization being just extraordinary compared to other languages), but a bunch of basic stuff was painful, janky or just completely missing.

4) Some of the concepts are very powerful but result in programs that are incredibly hard to understand: macros, continuation passing, multiple dispatch, and so on. This puts a lot of people off because they just hit the learning cliff face-first and give up.

This is part of why Python saw such wide adoption, in my opinion. Not because it was in any sense the best language, but because it was a very easy, practical choice for doing a bunch of things.

[1] https://www.cs.cmu.edu/Groups/AI/html/cltl/cltl2.html . Paul Graham (yes, that Paul Graham) also wrote a good Lisp book, although for me Steele's is the one.

[2] https://en.wikipedia.org/wiki/Symbolics


J is standalone; it doesn't use APL in the background.



