I Miss Lisp (technicat.com)
99 points by DanielRibeiro on Dec 12, 2010 | 66 comments



From the late 80s through the early 2000s, Lisp was my language of choice. The post author went through many of its advantages -- mostly true. I even went so far as to write an entire compiler and environment (a several-man-year endeavor) for two runtime-oriented dialects, GOOL and GOAL, used in eight Naughty Dog Crash Bandicoot and Jak & Daxter games. All 30-60 FPS action games -- 40 million copies sold. All written in Lisp! (more or less)

Eventually, it was Ruby that got me off Lisp. It does most of what made Lisp great, and the support factor is huge. There are clean standard libraries for everything, and the garbage collector is good (the one in ACL was so awful and dated -- I last used it in '07 -- that you pretty much had to turn it off).

I miss the S-expressions and the macros. But you can do most macro-like things with a good block expression in Ruby, and you can even do auto-generated macro-like functions with strings. It just isn't as elegant as Lisp.

The Ruby class system is a bit baroque, but it's actually better in many ways than CLOS, which suffers from being too dated. In Ruby you can do a lot of very cool things with extend and include. It's just a slightly different way of thinking.

The implementation isn't the fastest, but it's good. The Lisp implementations were all clunky and felt like 80s codebases. Oh wait, they are.

Support. The rate of change in Ruby is 1000x that of Lisp. Some new website has an API -- someone writes a Gem immediately. In Lisp you don't even have a really standard networking library!


The ironic thing is that you were doing it during the 90s, which is possibly the worst time to be a Lisp programmer ever and Lisp's "lost decade." Everything that came out of Symbolics got thrown away (RMS turned out to have been 100% right about that), no one was doing free software libraries except for Haible and Steingold (ok, huge exaggeration), and aside from CL-HTTP everyone completely dropped the ball on the whole web thing (AllegroServe only came out in 2000!).

What are you doing right now with Ruby? I remember a couple of years ago at a CRACL meeting the conversation turned to Naughty Dog, and someone (I think it was Chris Baker) was like "Oh yeah, Andy lives in LA. He's writing books now."


The only programming I'm doing at the exact moment is little hacking projects. But I did a whole bunch of backend "cloud" programming in 2006-2008, and I used Ruby for most of it. This included 90% of the backend for Flektor.com (now deceased -- but successful in that it was cool and we sold it to Fox) and all the backend stuff for Monkey Gods (monkeygods.com). At Flektor I actually started the backend in Lisp. At first ACL. Then they wanted thousands per server -- haha. Then we ported to CMUCL. But there were no libraries, and those we had were buggy. Even the MySQL integration was buggy. The last thing you want is a flaky database library.

We switched to Ruby. Sure, ActiveRecord was buggy too, but in a totally different way. The bugs didn't crash it, or put weird data into the tables -- they merely generated sloppy or incorrect queries (this was 2006). I could spot these and work around them. And there were libraries for a LOT of stuff. Fixing bugs in the libraries was easy too.

Ruby also seems to have been designed as a practical language. There is a bit of disdain in the Lisp world (or used to be) for I/O, as if it's dirty. Look at Scheme: it doesn't even have I/O in the spec (or didn't last time I looked). I/O is inherently a side effect, which to the functional thinker is BAD! Well, I have news. Writing real programs is all about I/O.


Interestingly (maybe ironically?) it was reading about your success w/ GOOL and GOAL that turned me on to Lisp! Are you using Ruby for game development?


"I miss lisp" articles are almost as old as Lisp itself.

In fact, I miss the days when nostalgic pieces about Lisp were far more eloquent and well argued. The pinnacle of Lisp yearning was reached when Dick Gabriel wrote A Critique of Common Lisp, and never since has anyone opined about Lisp so rightly.

The decline of sentimental prose about Lisp is largely caused by the standardization of Common Lisp. With its specification at 1500 pages, people never had any time left to learn basic essay writing.

The Lisp community also shares the blame for this deterioration in the expression of melancholy. The comp.lang.lisp newsgroup and its brash, unwelcoming denizens haven't exactly helped people be more wistful for Lisp in their absence.


Hopefully, if we promote the ideas in your post, "Lisp Yearning" will become a recognized literary genre and we will read more of it from young, aspiring authors. Let's just hope Quicklisp and Clojure are not too successful.


c.l.l. is the best newsgroup I've ever used.


...and then some Norwegian guy would flame you so hard you would actually die!


Only if you are stupid enough to flame him back.

And, sadly, he will not flame you anymore.


I couldn't find a date on this essay, but it must have been written around 10 years ago. The open source Common Lisp ecosystem, though still small, is much larger than it was then. (For instance, he doesn't mention SBCL or OpenMCL/Clozure CL in his list of open source implementations, and he claims there aren't any with multithreading, which hasn't been true for years.)

So if he still misses Lisp, he should come back. Common Lisp is still around, and in better shape than it was, and there's also Clojure.


Yes, I remember seeing this at least 6 years ago. People should really date their articles.


Right, SBCL supports both multithreading and networking (and a whole lot of other good things).


I think Clojure addresses many of the author's points. I have greatly enjoyed learning the "Lisp" way to solve problems, while still having the option to use Java's massive ecosystem.

It cannot, however, deal with the biggest (IMO) problem with Lisp: the fact that his list includes "Too Good" and "Too Smart" (and the elitist tone that accompanies them).

It is off-putting to try and read about this apparently amazing language, and have large portions of every (hyperbole) article saying effectively that if people were just { smarter || better || less sheep like } they would use Lisp. Luckily, the Clojure community appears to have avoided this attitude so far.

P.S. I mean no disrespect to the author, and I fully understand that language advocacy often takes a confrontational approach.


"the biggest (IMO) problem with Lisp: the fact that his list includes "Too Good" and "Too Smart" (and the elitist tone that accompanies them).

"It is off-putting to try and read about this apparently amazing language, and have large portions of every (hyperbole) article saying effectively that if people were just { smarter || better || less sheep like } they would use Lisp."

Perhaps you missed the wave of Perl hate that Python and Ruby rode to popularity on. That anti-Perl snobbery doesn't seem to have hurt them any.

Or how about the snobbery of C users against COBOL and FORTRAN? Once again, C got adopted despite (or perhaps even because of) it.

Or the religious wars over static vs dynamic typing, functional vs imperative, OO vs procedural, and which editor or OS is superior.

This kind of elitism is really endemic in the programming field, is by no means restricted to Lisp, and is probably not very relevant to the lack of adoption of Lisp in the general programming community.

Rather, I'd chalk it up to more of a matter of marketing. Lisp is still viewed as more of a (difficult) academic language, rather than a practical one. It's also rather old by now, and doesn't have the shiny/new sparkle that helped Java and Perl (in their day), and Ruby and Python (more recently). Also, how many killer apps (that a typical programmer cares about) have been written in Lisp? Where is Lisp's Ruby on Rails?


Disclaimer: I miss Lisp too, and have noticed myself wishing I could define new syntactic constructs in Python.

That being said, there is something of a difference between hating Perl and saying that Lisp loses because it's too good and you have to be too smart to understand it. In short, Perl fundamentally sucks in ways that impact your life in big ways on a day-to-day basis. Python does have its problems, and it sucks in some respects, but I would claim that it doesn't reach nearly the level of fundamental suckage on an absolute scale that Perl does.

And, moreover, while Lisp fails to suck in certain ways that Python sucks, it ruins your life in other ways that Python doesn't. For example, I use lots of numpy/scipy/matlab/what-have-you style code. I also use lots of Cairo/GTK. In Python, all the stupid crap to take care of doing that in a Python-idiomatic way has been done already; it's known as pygtk and pycairo, and it's available in every package manager I care about.

In Lisp, all that stupid crap has not been done, or if someone has tried, they have not achieved remotely the level of polish of the Python version. So if I want to use Lisp, I have to spend a pretty big quantity of time becoming the maintainer of the equivalents of pygtk, pycairo, etc., and at least last I checked, the amount of crap and slowdown involved there was much greater than the amount of crap and slowdown that resulted from Python's various sucky bits (slowness, lack of threading, lack of macros, shitty lambdas).

And, moreover, Lisp's lack of syntax is a big win in some kinds of code, but in other kinds of code, it's pretty horrible; for example, with Numpy, I can say

myarray[where(other_array > 0)] = foo(bar(5))

or

myarray[:, baz(bum(4)):baz(bum(5))][bum(baz(6)):, :]

and that's going to be several times more succinct than writing my own slicing code and then having to say 'aref' or equivalent for each node in the syntax tree where there's some array indexing going on. In other words, syntax makes the common case fast for some very key common cases.
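
To make that concrete, here is a rough, hypothetical Common Lisp rendering of the first NumPy line above with no slicing syntax (myarray, other-array, foo, and bar are just the placeholders from the example, assumed to be one-dimensional vectors and ordinary functions):

  ;; myarray[where(other_array > 0)] = foo(bar(5))
  (let ((v (foo (bar 5))))
    (loop for i below (length other-array)
          when (plusp (aref other-array i))
            do (setf (aref myarray i) v)))

Every element access goes through AREF, and the fancy-indexing assignment becomes an explicit loop, which is exactly the verbosity I mean.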

So saying that a key part of why Lisp fails is because it's too good and people have to be too smart to understand it is mischaracterizing people who decide to use Python because they don't want to deal with the library problems and lack of common-case syntactic convenience.

(I'm not wedded to being one of the latter people, but I do think that there are some strong points in that direction.)


There is a large distinction between hating on the language and hating on the users. Ruby and Python were both sort of anti-Perl, but it was not usually expressed along the lines of "Perl users are too stupid to write Hello World in our language." It was more like, "Dang, Perl users. How do you manage to get anything done with that rotting fish-head of a language? Check out what we have over here in civilization!"


It was more like, "Dang, Perl users. How do you manage to get anything done with that rotting fish-head of a language? Check out what we have over here in civilization!"

... which is ironic for students of language design who realize that none of (for example) Ruby, Python, and PHP get lexical scoping anything close to correct, that none of them do object systems particularly well, none of them do an extension system particularly well, and only Python comes close to Perl's amount of available and accurate documentation or its robust test coverage of the language and its features.

You can discount a lot of the rest of the criticism as "I didn't bother to learn what makes the language unique before writing a bunch of awful code" and "I don't like sigils but am embarrassed to sound that much of a snob."


If you don't think Ruby's is good, what language does have a good object system? I mean, it essentially plagiarized Smalltalk, which has the best fundamental object system I know of — and added mixins, which complement traditional inheritance very well.

(Yes, its imitation of Smalltalk has a few seams showing — like the fact that blocks are not normally objects — but overall it's a pretty good if imperfect system. I certainly don't know of any language that gets closer.)


Ruby does have an advantage (thanks to its Smalltalk influence) of treating most primitives as objects in a much less clunky form than Python. That allows for many useful abstractions.

CLOS, Smalltalk, Perl 6, and Perl 5 with Moose have better object systems. Mixins suffer similar method resolution flaws to multiple inheritance, and the same problems with unconstrained monkeypatching show themselves with unconstrained duck typing in large systems.


How are the other systems better? I've bumped up into enough rough edges with Ruby to have a pretty good idea of where its seams are showing, but I've had to write a fair amount of code to get there and I don't have nearly the experience with the others to spot how they might work around the same design problems.


CLOS and Perl 6 have multiple dispatch, which improves genericity when used well. All four of the languages I mentioned have better metaprogramming capabilities, whether through runtime introspection in Smalltalk or a well-defined MOP in the other three. Smalltalk and both Perls offer better mechanisms for genericity, abstraction, and safety through traits and roles.
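
For anyone who hasn't run into it, a minimal sketch of what CLOS-style multiple dispatch looks like (the class and method names here are made up purely for illustration):

  (defgeneric collide (a b)
    (:documentation "Dispatches on the classes of BOTH arguments."))

  (defclass ship () ())
  (defclass asteroid () ())

  (defmethod collide ((a ship) (b ship)) :exchange-insurance)
  (defmethod collide ((a ship) (b asteroid)) :take-damage)
  (defmethod collide ((a asteroid) (b asteroid)) :bounce)

  ;; (collide (make-instance 'ship) (make-instance 'asteroid)) => :TAKE-DAMAGE

The applicable method is chosen by the classes of both arguments; a single-dispatch system like Ruby's or Smalltalk's has to simulate that with case statements or double dispatch.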


I mean, the obvious answer is CLOS, but a lot of people don't think that CLOS is an object system.

(I mean that it is very much different from the Smalltalk/Ruby/Java paradigm of single inheritance, encapsulation, and message passing.)


I wonder if people think "CLOS is not an object system" because of its pervasive support for multiple dispatch.


IMHO, the biggest reason why Lisp never won out over imperative programming languages is exactly this attitude. Advocates of imperative programming languages took critiques of those languages seriously, while Lisp advocates wouldn't.

Take the evolution of C, for example. First, you had ALGOL, but it had problems. So someone created CPL. But that had problems, so then we had BCPL. Then B. Then C. That’s five separate languages! People on the imperative side of the language divide were willing to throw stuff out and start from scratch, over and over, until they got it right.

Lisp advocates were not.

Fortunately, the creation of Clojure suggests that a willingness to change is finally making its way into some areas of the greater Lisp community.


You seriously think that C is an improvement over Algol?


Are you really saying that Common Lisp and Scheme are that much more similar to primitive Lisps than C is to Algol-68? That isn't my impression at all. I'd say Clojure is to Scheme as C# is to C, roughly speaking.


Since Lisp actually has changed a lot over the years, I guess what you're saying is that they should have changed the name every so often?


You didn't really follow the history of Lisp, did you? Please at least go to Wikipedia before you write something.


I am not sure that Lisp not being mainstream has very much to do with Lisp at all; it was much more a matter of not being in the right place at the right time.

Would we really be having this conversation if Sun or Microsoft had promoted Lisp?


For the record, the parent is the only non-stupid post in this entire thread.

If I really wanted to read arguments about why Lisp sucks from people who don't program in Lisp, and complaints about issues that were solved half a dozen years ago from people who can't be bothered to check a web page, I'd go read the comp.lang.lisp archives.

The rest of you can go back to trolling about how Lisp sucks and Clojure will solve all its problems even though you've never used Lisp and have no plans to ever use Clojure.


Yeah, if only Microsoft had promoted Lisp the way they had Xenix, or the Zune, then _everybody_ would be using Lisp! Or if Sun had picked up the cause, then Lisp would be used in all those set-top boxes the way Java is today. Oh, wait...

Microsoft actually did take a diversified approach with their language tools division in the 1980s -- ever heard of MuLISP? The problem was that Lisp on MS-DOS or Windows 3 was not really a good fit for most (any?) personal computers at that time. Certainly not when compared with the likes of Turbo Pascal, say.

It seems to me that programming languages get their moment in the sun if they offer some significant technical advantage for that particular time and market. Perl took off because it did seem better for some things at the time, for example, while the latest versions have seemed (to me, at least) like a desperate cry for attention as other dynamic languages have gotten most of the programmer love.

If there's anything the last dozen years of language adoption have shown us, it is that "promotion" is only part of the equation and it can manifest in either a top-down or bottom-up fashion. (See Python or Ruby for examples of the latter.)


Not trying to sound hostile here, but your comparisons seem a little disingenuous. You're equating Lisp with an ancient operating system and a music player? Lisp is a programming language. Microsoft has other products in the category of programming languages. The ones it has pushed — Visual Basic, C# and (to a lesser degree, since it's NIH) C++ — have all been tremendously successful. This is because Microsoft has a monopoly on Windows dev tools and APIs that it can exploit to get developers using things. This monopoly is less useful for pushing Zunes.

And to try and argue that Java is unsuccessful because it isn't used in set-top boxes? I don't even know what to say to that. The fact that Java's successes were in other domains than the one you hand-picked for your comparison does not somehow make them disappear. You may as well say that C was never very popular because no Ruby on Rails apps were ever written in it.

The fact is that Lisp has not had any major backers since the AI craze fizzled out. If a Lisp had received the same kind of support as Java, C#, Objective-C or JavaScript, the language landscape might very well look quite different.


Well, I'm willing to entertain data that contradicts my thesis, but I'm unconvinced by either attacking my motives or distorting what I wrote.

My point with the Microsoft language products of the 1980s was that they were willing to sell a variety of products if people were willing to buy them. They had Microsoft versions of Fortran, C, and Pascal as well. Xenix was their server solution back then, but that didn't really matter given the state of the market. The mention of the Zune was intended to highlight that technical merits are dwarfed by right-place-and-time effects. I knew plenty of people who thought J++ was the wave of the future because, you know, that API lock-in is such a determinant of success.

As to Java, I never said that it was unsuccessful. But Sun kept trying to flog Java as an embedded systems solution and it was an exercise in futility, whether it was the original set-top boxes, Jini, or those incredibly useful Java personalization ID rings like the one I have somewhere in a desk drawer. Having a major player pitching it still didn't make it popular in embedded markets.

Woulda-coulda-shoulda arguments about Lisp only lacking major backers are arguments based on a lack of data. If major backers are so key, then why were Python and Ruby so successful?


Jini is a set of distributed system building tools, it had nothing to do with embedded systems. For what it's worth the core ideas of Jini are actually really great and robust, the problem is that it's not J2EE. No one wants to rewrite their shit for Jini, they'd rather try to throw the mess at things like Terracotta or bigger servers (for what it's worth, I've never seen a Java application that wasn't a piece of shit, so I can't really blame anyone).

There was some research into applying the services model to mobile and wireless systems, which makes a lot of sense, but it remained research, as neither the market nor the underlying routing technology had yet emerged.

Also, Java has been immensely successful in smart cards and cellphones. You are seriously wrong about Java not being used in embedded systems.


Jini has nothing to do with embedded systems? Really? Well, de facto, sure. But how did Sun originally pitch it? Let's do a little search and, oh, here's Bill Venners talking about the same topic in 2006: <http://www.artima.com/weblogs/viewpost.jsp?thread=150666>..., the Summary starts with "Sun's original marketing message that positioned Jini as a technology for devices backfired in 1999..."

Perhaps we have different notions of embedded systems. I'm thinking of self-contained computers/microcontrollers that are part of a larger system/product with a significant incentive to keep hardware costs to a minimum. As in, use the cheapest processor and the least RAM you can. Now that covers a large range, from 8-bit microcontrollers to high-end systems running real-time OSs. The last time I was seriously following that field, however, the hands-down favorite language was C. I would be greatly surprised if Java had even risen in usage to rival Forth in the embedded domain. If you have some contrasting data to provide, feel free.

I'll give you partial credit for thinking of J2ME in mobile phones. But there's still a significant difference between programming games in J2ME or Flash for a phone and using Java to program the phones themselves (i.e., the underlying embedded system). Why do you think Nokia's recommendation for application builds was either gcc or, for those building their own phone ROMs, the proprietary ARM compiler?


"Jini has nothing to do with embedded systems? Really? Well, de facto, sure. But how did Sun originally pitch it? Let's do a little search and, oh, here's Bill Venners talking about the same topic in 2006: <http://www.artima.com/weblogs/viewpost.jsp?thread=150666>..., the Summary starts with "Sun's original marketing message that positioned Jini as a technology for devices backfired in 1999...""

That link is giving me a Java stack trace, but let's put the quote in context from people who actually worked on promoting Jini:

"Probably the biggest misconception is that it is concerned primarily with devices. This was, unfortunately, the original marketing message used for Jini technology, so this misconception is to a large extent our own fault. It was one of those cases where an illustration of what Jini technology could do -- attach a device and it's found and used, detach the device and it disappears -- was mistakenly thought of as all that the technology could do."

http://java.sun.com/developer/technicalArticles/Interviews/w...

"For an example of the power of a good story, consider the one developed for the Jini project. As described earlier, Jini technology is a thin layer built on top of Java that allows applications to be written as remote services that can be dynamically deployed and located over a network. The obvious story is that Jini is another middleware framework for distributed applications. But this story has a technology focus that would have severely limited the spread of the Jini message--indeed the term middleware causes even developers to yawn. Instead, the Jini marketing team built a story around what Jini technology would mean to users; that story generated lots of excitement in developers, the press, and the marketplace."

http://www.dreamsongs.com/IHE/IHE-67.html

I don't see how that's a message pitching it as an embedded systems solution. Something like network plug-and-play/Apple Bonjour, sure.

"I'll give you partial credit for thinking of J2ME in mobile phones. But there's still a significant difference between programming games in J2ME or Flash for a phone and using Jave to program the phones themselves (ie, the underlying embedded system). Why do you think Nokia's recommendation for application builds was either gcc or, for those building their own phone ROMs, the proprietary ARM compiler?"

Thank you for completely ignoring my mention of smart cards. That's about as embedded as it gets, and Java Card is one of the leading technologies there.

The Nokia example is horrible. Symbian is dead despite having the largest market share - most apps running on Symbian are J2ME. The iPhone might be a better counter-example. But then you have Android, which is a de-facto JVM.

So you have the two biggest categories of consumer devices, cellphones and smart cards, and J2ME runs on over 80% of the former, and I don't know how many smart cards use Java Card, but it seems to be a very significant percentage.

Saying Java hasn't been successful in embedded systems is completely false.


OK, I'll grant you the smart card example. I have no experience with those guys; there are what, a half-dozen vendors? For obvious reasons they don't talk about the details of their software much.

The nub of our disagreement: the definition of an embedded system. Per Wikipedia, "An embedded system is a computer system designed to perform one or a few dedicated functions often with real-time computing constraints."

So a cheap MP3 player counts as an embedded system, because playing audio is what its hardware and software is designed for. A personal computer, in whatever physical form, that runs a variety of applications, including audio players, is not an embedded system. And therefore, no, I don't count smart phone applications as embedded systems software. If you're writing device drivers that are burned into a smart phone's ROM, on the other hand, then I'd consider it embedded software.

When you find hardware running Android that's designed to be dedicated to a single task, such as, say, a climate control system, then you can count it as an embedded system. A tablet PC or an application running on it that sends commands to your existing climate control system, not so much.


"There are free implementations of Common Lisp (GCL, CMU Common Lisp, CLISP), but they came late and incomplete. "

SBCL is open source and free and complete and performs incredibly well with multi-threading etc. I wonder how old this article is?


This is only true for SBCL on Linux, isn't it?


Besides Linux, SBCL supports multithreading on FreeBSD, Mac OS X, and Windows.


Last time I tried SBCL on FreeBSD, the binary package didn't have threads enabled, and when I compiled it from ports, the configuration dialog had me enable the threading manually and said it was "experimental".



I think the author misses the biggest reason why Lisp never caught on: too many developers are simply averse to s-expressions. Maybe Clojure, with its slightly richer syntax, will solve this.


Other languages have absorbed just about all the other key Lisp ideas. Garbage collection, first-class functions, dynamic typing, etc. are all uncontroversial now. I don't think it's an accident that s-exprs have been conspicuously unpopular in comparison.

I actually find the Clojure syntax harder to parse. Nested mixtures of }, ], and ) are very difficult to pick apart visually.


"Nested mixtures of }, ], and ) are very difficult to pick apart visually."

And nested ), ), and ) are easier to pick apart visually?

At least with }, ], and ) you know any given } matches a { and not a ( or a [. Whereas with all )'s, it's a lot harder (for me) to know which ( in a mountain of ('s it's going to match.

A good editor and proper indenting will, of course, mitigate some of the pain. But I'd MUCH rather have a mix of {'s, ['s, and ('s, than all ('s.


At least with }, ], and ) you know any given } matches a { and not a ( or a [. Whereas with all )'s, it's a lot harder (for me) to know which ( in a mountain of ('s it's going to match.

I have found the opposite. It's a lot easier to close with a bunch of )))s at the end of a line than to flip around with the )]}]))} line noise. I let my code's structure give me meaning. I am less concerned with matching things and more concerned with readable code arrangement. It is easier for me to read English-like words than a bunch of symbols. As a result, the ))s are almost equivalent to whitespace, with the exception that they provide useful and common structure.


paredit.


And nested ), ), and ) are easier to pick apart visually?

You don't have to pick them out in CL, just balance them. Much easier (IMO).


You don't have to pick them out or balance them in any lisp. That's what paredit is for.


So easy that a decent text editor can do the job.


You can't introduce s-exprs into an ordinary language - it would cease to be an ordinary language and become a Lisp dialect instead.


Macros.


What about them?


cageface missed the major reason why Lisp in general is powerful: prefix notation is really the only clean way to implement macros.

The rest is just icing on the cake.


S-exprs == macros


Those two things are not equivalent at all. I don't think you understand what you are talking about.

S-expressions define a class of syntax styles. By themselves, they have no semantic characteristics at all. Macros are a way in which the compiler rewrites the syntax tree based on user programs. Do you see the difference?
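
A tiny, hypothetical sketch of the difference in Common Lisp (UNLESS* is a made-up name so it doesn't shadow the built-in):

  ;; This quoted form is just an s-expression -- plain data, no semantics:
  ;;   '(unless* (null xs) (process xs))

  ;; A macro is what tells the compiler how to rewrite that data into code:
  (defmacro unless* (test &body body)
    `(if (not ,test)
         (progn ,@body)))

  ;; (macroexpand-1 '(unless* (null xs) (process xs)))
  ;; => (IF (NOT (NULL XS)) (PROGN (PROCESS XS)))

The s-expressions are the representation; the macro is the rewrite rule that operates on that representation.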


The problem with this argument is that this is one of the key reasons most Lispers like Lisp, and is where the language draws many of its strengths. You're essentially writing an AST, which gives you lots of flexibility and makes things like advanced macros and DSL programming viable solutions to problems. A dislike of s-exps may fall in the "too smart" problem category.


http://weblog.raganwald.com/2007/01/what-ive-learned-from-sa...

The whole "nobody likes s-expressions" is as well-worn and pointless a cliche as "I miss Lisp boo hoo."


Also wanted to plug "Shen" here, an offshoot of the Lisp-based Qi language. Shen is VM-agnostic, so it will run on Python, Clojure, etc. JavaScript is its first target.

Here is the appeal by its creator - Mark Tarver (and Carl Shapiro)

http://www.lambdassociates.org/Shen/appeal.htm or http://news.ycombinator.com/item?id=1921347

The qi syntax was previously posted on HN : http://www.lambdassociates.org/Book/page000.htm


It absolutely needs to be admitted that it is easy to make a well-intentioned unmaintainable mess in Lisp and other highly dynamic languages. Brilliant programmer A will add a lot of layers and abstractions that make sense in one context, which is great until brilliant programmer B does so in a different context. This all seems to break down somewhere when the number of programmers exceeds 1.


I've worked on several large Lisp systems, and have never had this problem.

You can make a mess in any language. I will grant that Lisp offers more ways to do that than most languages, but so does C++ (template metaprogramming, anyone?).


Template metaprogramming is awesome. All of the C++ messes I've seen are related to ridiculous OO crap.


Well, my point is that you can make a mess using template metaprogramming. Maybe that doesn't actually happen too often, because anyone smart enough to use TMP at all is smart enough not to make a mess :)

But you're right, the actual messes I've seen in C++ have not involved TMP. Come to think of it, most of them have been around "const" misuse.


What does "add a lot of layers and abstractions that make sense in one context" mean? Is that like "entities and/or actions placed or performed in a certain manner for a particular purpose, or to carry out a certain task"?

Everything in a high-level programming language is an "abstraction". And anything outside the grammar of the language is a convention.


Exactly what I feel. I miss the integrated debugger and trace facilities the most. You can see exactly what your code is doing in Lisp. What happened 10 years ago was the dot-com craze, and the demand for Java programmers went through the roof. I am now trying to get back into Lisp, maybe through Clojure too.
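
(For anyone who hasn't used it: the standard TRACE facility really is a one-liner, though the exact output format varies by implementation. A minimal sketch:)

  (defun fib (n)
    (if (< n 2) n (+ (fib (- n 1)) (fib (- n 2)))))

  (trace fib)    ; from here on, every call to FIB is logged
  (fib 3)
  ;; prints a nested call/return trace along these lines:
  ;;   0: (FIB 3)
  ;;     1: (FIB 2)
  ;;       2: (FIB 1)
  ;;       ...
  ;;   0: FIB returned 2
  (untrace fib)  ; and this turns it back off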


I wish they would fix this in SBCL. The stepper is such a pain to set up and use. Smalltalk has really spoilt me in this regard.



