The Nature of Lisp (defmacro.org)
202 points by llambda on Nov 9, 2012 | 77 comments



Part of the problem is that lisp evangelism sets itself up to fail. An instantaneous blinding moment of enlightenment, would you like fries with that? Haven't they heard that you shouldn't start a joke with "This is the most hilarious thing ever"?

I've been doing lisp for several years now. I've built several interpreters. I've never had the enlightenment he describes. The minor epiphanies have been on par with oo design and unit tests. I've travelled far over the months, but it's closer to grok than zen.


Enlightenment epiphanies result in proselytizing. Can't be helped; it's like how tapping your knee with a rubber mallet makes your leg kick out.

The 'secret', or the thing that most people don't get early on when programming, is that code is data and data is code. A binary tree is data that is carefully surrounded by the semantics of the data's relationship with its peers. Reading the structure reads out the data in sorted order. Lisp just makes it painfully clear that there is no distinction between state and semantics as far as computers are concerned, and it allows you to move the 'computation' between data structures and algorithm at any point.
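A tiny Python sketch of that point (my own illustration, not anything from the article): the "sortedness" of a binary search tree lives in its shape, and an in-order walk just reads it out.

    # A binary search tree as bare data: (left, value, right) tuples.
    # The in-order walk reads the data out already sorted; the
    # "computation" is latent in the structure itself.
    def in_order(node):
        if node is None:
            return []
        left, value, right = node
        return in_order(left) + [value] + in_order(right)

    tree = ((None, 1, None), 2, ((None, 3, None), 4, None))
    print(in_order(tree))  # => [1, 2, 3, 4]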

A grad student at USC explained it well when he described it like learning your third or fourth spoken language: suddenly your brain "flips" from having three or four different ways of naming 'milk' into a single concept of milk with an infinite number of ways to identify it. The relationship between the root concept and the expression of that concept changes precedence in your thought process.

Once you have made that switch you can write code in any computer language.


Very nicely put. imho, it could serve as the tl;dr for the op, which was also excellent.


Nice to hear I'm not the only one. I've been using Lisp (mainly Common Lisp) for quite a few years now but never had that flash of enlightenment either. Discovering Lisp always seemed more like coming home: "Ah, this is what I always thought programming was supposed to be like!"

No fighting with the compiler or being limited by what the PL designer thought you should do, just a pretty direct path from thought to code.


Reading the related article on writing a Lisp interpreter in Haskell (http://news.ycombinator.com/item?id=4764088) reminded me of my second blinding moment of enlightenment: understanding vau expressions. Things that can't be implemented as functions are typically things that require controlling the evaluation of arguments (conditionals, assignment, short-circuiting boolean operators, etc.), and additional language features (built-in special forms or macros for writing your own) are included to handle those. But if you have something that allows you to control the evaluation of arguments, simply choosing to evaluate all your arguments gives the equivalent of a function. Implement that thing, and your compiler/interpreter no longer needs to know about the difference between functions and macros and built-in forms; they're all the same thing!

There's not a lot of practical use for that kind of thing that I am aware of (implementing run-time macros is one, being able to pass short-circuiting boolean operators to map, reduce, etc. is another), but I strongly suspect that's just because we don't have 30 years of collective experience figuring out all of the great things about vau expressions like we have with Lisp and anonymous functions. The only language (discounting toy projects) I know of that actually implements them is Kernel (http://web.cs.wpi.edu/~jshutt/kernel.html).
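If it helps, the core idea fits in a few lines of Python (a toy sketch, nothing like Kernel's actual semantics):

    # Every combiner receives its operands unevaluated, plus the
    # caller's environment.
    def evaluate(expr, env):
        if isinstance(expr, str):          # a symbol: look it up
            return env[expr]
        if not isinstance(expr, list):     # a literal: self-evaluating
            return expr
        combiner = evaluate(expr[0], env)  # every form is a combination
        return combiner(expr[1:], env)     # operands passed unevaluated

    def applicative(fn):
        # An ordinary "function" is a combiner that evaluates everything.
        return lambda rands, env: fn(*[evaluate(r, env) for r in rands])

    def _if(rands, env):
        # The conditional needs no special status in the evaluator:
        # it's an ordinary combiner that decides what to evaluate.
        test, then, alt = rands
        return evaluate(then if evaluate(test, env) else alt, env)

    env = {"if": _if, "+": applicative(lambda a, b: a + b), "x": True}
    print(evaluate(["if", "x", ["+", 2, 2], "never-touched"], env))  # 4

Note that "if" is just another binding in the environment, and applicative() recovers plain functions by choosing to evaluate every operand first.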


A language that is lazy by default lets you control the evaluation of arguments. Ordinarily, they're not evaluated, and if you force them they are.

However, macros are not just about whether to evaluate -- but about exposing the internal syntactic structure of the arguments.

In Haskell, using laziness you can implement control flow, short-circuiting, etc. If you want functions that work with the syntactic structure of their arguments, you need heavier machinery:

* Thick DSLs: Define explicit AST types and have a DSL that explicitly constructs those ASTs.

* Template Haskell (the arguments' syntax has to be in ordinary Haskell)

* Quasiquotes (Need to parse strings)

I think the need for exposed syntax is relatively rare (e.g: a function that evaluates and shows a trace of the evaluation). In those cases, I think explicit AST types work pretty well, as Haskell has extremely light-weight syntax for constructing user-defined data types.
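For a sense of what the explicit-AST route looks like outside Haskell, here's a rough Python sketch (names are mine, purely illustrative): the syntax is plain data you can both inspect and run.

    from collections import namedtuple

    # Two explicit AST node types for a toy expression language.
    Lit = namedtuple("Lit", "value")
    Add = namedtuple("Add", "left right")

    def show(e):                     # traverse the syntax...
        if isinstance(e, Lit):
            return str(e.value)
        return "(%s + %s)" % (show(e.left), show(e.right))

    def run(e):                      # ...or evaluate it
        if isinstance(e, Lit):
            return e.value
        return run(e.left) + run(e.right)

    expr = Add(Lit(2), Add(Lit(3), Lit(4)))
    print(show(expr), "=", run(expr))   # (2 + (3 + 4)) = 9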


Without access to vocal inflection, I'm not sure if you're intending to argue or expand. So, I'm gonna go with continuing to expand on the point.

Simple laziness does not allow you the same level of control over evaluation as vau expressions do. A vau expression can choose to evaluate its arguments exactly once (like call-by-value), exactly as many times as they're used (like call-by-name), only if they are used (like laziness), as many times as you feel like, in a different environment than the calling context, or not at all, and it can make that decision independently for every argument.

In Kernel's implementation at least, unevaluated operands are AST types that can be poked and modified, not opaque values like lazily-evaluated operands. As a result, vau expressions can be used to implement macros, both the hygienic and non-hygienic variety, and the language need not define quoting or quasiquoting, because those features can also be implemented with vau expressions.

Vau expressions seem to play havoc with static analysis, though, so there are good arguments for actually having some of those things as built-in language features rather than just building everything as a standard library.
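To spell out that menu of strategies, here's a rough sketch with explicit thunks in Python (a toy illustration of the strategies, not Kernel's actual mechanism):

    import itertools

    # Each strategy wraps an unevaluated argument (a thunk) differently:
    def call_by_value(thunk):
        v = thunk()                  # evaluate exactly once, up front
        return lambda: v

    def call_by_name(thunk):
        return thunk                 # re-evaluate on every use

    def call_by_need(thunk):         # lazy: at most once, on demand
        memo = []
        def force():
            if not memo:
                memo.append(thunk())
            return memo[0]
        return force

    counter = itertools.count()
    a = call_by_need(lambda: next(counter))
    print(a(), a())   # => 0 0  (evaluated once, result reused)
    b = call_by_name(lambda: next(counter))
    print(b(), b())   # => 1 2  (evaluated at every use)

A vau expression gets to pick among these independently for each operand, and "not at all" is just never forcing the thunk.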


I was expanding (with a slight correction about macros doing more than just controlling evaluation).

Haskell-style laziness comes with purity, where it does not matter much whether something is evaluated once or many times. It does matter whether it is evaluated zero times or at least once, though (due to non-termination and exceptions).

The opacity of values is what I meant by macros also exposing the syntax as opposed to just controlling evaluation.


It would seem to me that some variety of Lisp would be the ideal candidate as a sort of runs-everywhere language, a thin portable base language that runs on top of different runtimes, offering easy integration with whichever it is running on.

Basically, something like a minimalist Clojure but not just for Java. It would be able to run atop the CLR, JavaScript or the Objective-C runtime as well. The interface with the host platform may be different, as long as the core language works everywhere. Ideally the core would be tiny.


So something like Clojure, but that runs on the CLR and Javascript as well?

CLR - https://github.com/richhickey/clojure-clr

JS - https://github.com/clojure/clojurescript

Python - https://github.com/halgari/clojure-py

Lua - https://github.com/raph-amiard/clojurescript-lua

C - https://github.com/schani/clojurec

(Sorry, couldn't resist.)



Forth.

Constructing the basic machine is trivial, then the rest just comes along with it.


Though Forth is only really elegant when implemented in assembly. Forth requires generating code on the fly. For something like the JVM, this means implementing your own bytecode interpreter or hijacking the class loader. Neither solution comes close to the simplicity of writing some machine code into memory and jumping into it.


You more or less just described Kernel Lisp - http://www.lambdassociates.org/blog/nextlisp(3).htm.


Clojurescript compiles seamlessly to javascript, which has become almost universal. It provides a beautiful, battle-ready lisp.


scheme


Very good article, though I doubt it'll convince the usual mass of unbelievers. (I love Lisp, for the record, though my primary exposure has been through Emacs Lisp - so shoot me).

A really great book that helps you appreciate the concepts in Lisp, without really talking about Lisp directly too much, is "Patterns of Software" by Peter Gabriel. http://amzn.to/TxDKGG

I found it to be a very enlightening read. Definitely a book you have to sink into with plenty of time and quiet.


And a good book for folks who already know how to program and want to learn Lisp in some depth is "Practical Common Lisp" by Peter Seibel. http://www.gigamonkeys.com/book/


Richard Gabriel. Though Peter Gabriel would be good too :)


Haha, oops! Too late to edit. Good catch. : )


It would be weird to have Peter Gabriel write about software. My favorite book of his is http://books.google.com/books/about/Introduction_to_Algebrai...


Patterns of Software is available as a free PDF directly from Gabriel's website: http://dreamsongs.net/Files/PatternsOfSoftware.pdf


For whatever it's worth, this article actually convinced me to take the plunge when I first encountered it.


>>the concepts in Lisp, without really talking about Lisp directly too much

A book with the same quality is the "ANTLR definitive guide".

When I read it I was like: "Ha! It sounds like LISP!", "Ha! It sounds like FORTH!", "Ha! It sounds like Prolog!"

And Terence Parr mentioned none of them.

Amazing!


Please don't use link shorteners, especially ones which don't allow me to edit your link to remove the kickback if that's what I want to do.


...a bit offtopic, but I was wondering while reading the example of using C itself as the C preprocessor language: why don't languages provide the ability to do this kind of thing automagically, I mean marking some code to be executed at compile time and act as a code generation feature? (I know, it's easy enough to write a small preprocessor that does it, and it's just primitive string based macros, but having a standard way to do it baked into the build tools or the interpreter for an interpreted language seems ...neat ...even cool if some more "magic sauce" would be added to it to make these "macros" hygienic :) ).


> why don't languages provide the ability to do this kind of thing automagically, I mean marking some code to be executed at compile time and act as a code generation feature?

There are certainly already languages that do this type of thing. Haskell has Template Haskell, which lets you execute Haskell code at compile time to generate code. I'm pretty sure multiple MLs also have similar meta-programming features.

It works rather nicely, actually.


Scala 2.10 will have it too.

There is a distinction to be made. In non-homoiconic languages writing macros takes a lot of effort, while in Lisp it's very natural.

On the other hand I don't feel that's an advantage for Lisp, because macros are not composable as functions are and you have to really grok Lisp in order to write macros effectively and also recognize instances where they are appropriate.


I don't think writing Template Haskell macros takes a lot of effort. It is probably harder than Lisp macros, but the main effort is studying the TH API once.


To understand Lisp is to understand interpreters. With that understanding you can create domain-specific languages, which is extremely powerful.

But I wouldn't recommend using Lisp itself... macros in particular are unhygienic.


Common Lisp != Lisp. You mean Common Lisp; Lisp is the family of languages (which also includes the Scheme sub-family, Racket, Clojure, Arc, Kernel…).


You are right, but I disagree. I almost always call "Common Lisp" "Lisp." Scheme is Scheme, Clojure is Clojure, etc etc. I don't care about the family vs language distinction. I think it hurts Common Lisp's adoption. I'd sooner call Common Lisp, Scheme, Clojure, etc part of the "Lisp family" instead of just "Lisp," and leave "Lisp" to mean "Common Lisp."


It's disingenuous to call Scheme and Racket Lisps. They have parentheses and first-class functions, but the similarities stop there.

So yes, I equate Lisp with Common Lisp. I didn't read the entire article (far too long) but he does mention 'defmacro', which is in Common Lisp.


So, homoiconicity is a trifling, meaningless similarity?

"Sure, it may be homoiconic, use prefix notation, have first-class functions (in additional to all the other usual functional paradigms that aren't unique to lisps) but it's not a lisp."

Big ok to that one. This must be pedantry of the highest caliber, not ignorance.


They are not homoiconic. The underlying data structure for many Schemes, and Racket, is not a list. It is a syntax object. Of course you can still do metaprogramming with syntax objects, but I wouldn't call it the same thing.


You should be careful here, and not lump together "many Schemes" and "Racket" (or other specific Scheme implementations). The thing is that Scheme standards have traditionally avoided tying the language to a macro system that requires some specific representation for syntax: giving you only the simple rewrite-rules system means that you don't actually need to know that representation.

In Racket, OTOH, there are definitely syntax objects with enough functionality to write code that handles them, and I suspect that you know that. The question is whether this should be considered "homoiconic" or not, but this is kind of a subjective issue, since at an extreme I can say that all languages that have strings are homoiconic. Perhaps you need more from the language to make it so: maybe eval, or maybe actually requiring compile-time procedural macros?

In any case, Racket has all of the features that CL does, so it is arguably at least "as homoiconic" as CL. But in fact it has more than just s-expressions: its syntax objects are basically sexprs plus a bunch of stuff like source location and lexical context, so they represent more than what lists in CL do. Should I then conclude that Racket is more homoiconic than CL?

And this is not a tongue-in-cheek argument: many CL implementations are aware of the limits of sexprs as a good representation for code, and add things like source location via a backdoor, such as a hash table that maps pair objects to additional properties. Racket does that in its basic syntax representation, so IMO it's fine to indeed consider it more homoiconic. The same goes for the addition of lexical context information: that's something that is not only included in the Racket syntax object, it's something that you just cannot get in CL. So if homoiconicity means being able to have a high-level representation of code (unlike raw strings), then this is another point where Racket wins the pissing contest.

Finally, it's not that all "many Schemes" are limited as described above: there are many that have their own macro systems with similar syntax values, and that includes Schemes that follow R6RS, since it dictates syntax-case, which comes with them. It just happens that Racket has traditionally been running at the front lines, so it's more advanced.


It's not really necessary to second you, but I'd like to add that "code as data" is more real in Racket than in CL, since code is not just the AST; it's also (as you point out) location and, more importantly, context. In this setting Racket's syntax objects are more "code as data" than CL's "code as sexp" will ever be.


Right. Perhaps a better way to summarize this is that:

* Lisp made the first giant step of having code representable as data for meta-programming, and chose sexprs to do so

* Common Lisp came later, and made the important step of requiring this representation, which means that in every CL implementation you're required to have the code as data aspect

* But the flip side of this is that CL hard-wires just sexprs, it forbids an extended type, which means that you can't get anything more than sexprs (without resorting to "extra properties" hash table tricks)

* Meanwhile, Scheme (R5 and others that have only `syntax-rules') took a step back by specifying only rewrite rules which can be implemented in any way an implementation chooses

* Some Scheme implementations did use sexprs, but since they needed to encode more information (lexical context) they extended them into syntax values (note that some Scheme low-level macro systems try to present users with a simplified interface where user code sees just the sexprs)

* Later on, Racket took further steps and enriched its syntax values with "more stuff"

* R6RS got closer to this too, by adopting the syntax-case system (but some people had issues with "wrapping" symbols, since you can't do that with the hash table trick)

* And finally, R7RS (the "small" version) is going to take a step back into the R5RS days. (And in the "big" language it looks like they'll adopt one of these systems that try to keep the sexpr illusion.)


Homoiconicity doesn't refer to lists, but to syntax being represented in the data structures of the language.


I am admittedly still a Lisp (et al.) rookie, but isn't the entire point of Scheme that it introduces hygienic macros? Or are you referring to some other (perhaps sarcastic) notion of macro hygiene?


That's one design characteristic of Scheme, but I wouldn't call it "the entire point".


You can do cool stuff with unhygienic macros, however, like anaphoric macros. Interested readers should check out On Lisp by Paul Graham, as well as Let Over Lambda by Doug Hoyte.



[Most of] Let Over Lambda as free HTML: http://letoverlambda.com/textmode.cl/guest/toc

Graham's On Lisp as free PDF: http://lib.store.yahoo.net/lib/paulgraham/onlisp.pdf



Also, this is a very interesting article on why lisp is unsuccessful in the "real world": http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

I like the original article a lot, but what it failed to do for me is convince me why someone like me, a typical programmer, would want to choose Lisp over Python/Ruby/etc to solve a real world problem. Both Ruby and Python have powerful meta-programming abilities built into them. Lisp should be compared with these, not with C.

I still think that functional programming is extremely interesting (I'm in the long process of learning Haskell myself) and is useful in certain real-world cases, but I was not convinced by this article. All the problems there are easily solved in modern and dynamic languages.


I found that a rather good introduction to code as data, but I am not sure whether I am supposed to have been hit by the enlightenment he describes… :-)


Yes, for me, the first time I read about the Lisp syntax I was thinking:

"oh cool, it makes (+ 2 2) exactly equivalent to the syntactic tree

       +
      / \
     2   2
"

But I don't find it particularly enlightening and I still don't see what cool stuff you can do with macros that you can't do elsewhere.


Syntactic abstraction usually requires a language change. For example, Python's "with" statement.

In languages with macros, you don't need to wait for anyone to change the language because the entirety of the language is constructed from a few special operators, and you have the ability to continue constructing.

Like Python, Clojure has a "with-open" macro. If Rich hadn't already added it, you could build it yourself:

    ; Very simplified version
    (defmacro with-open [bindings & body]
     `(let ~bindings
        (try
          ~@body
          (finally (.close ~(first bindings))))))
Thus this:

    (with-open [in (reader "/usr/share/dict/words")]
      (count (line-seq in)))
Expands into this:

    (clojure.core/let [in (reader "/usr/share/dict/words")]
      (try
        (count (line-seq in))
        (finally (.close in))))


In Haskell, there's the "bracket" function, which generalizes the "with" concept.

  withFile fileName mode = bracket (openFile fileName mode) hClose
Then:

  withFile "/usr/share/dict/words" ReadMode $ \h -> do
    contents <- hGetContents h
    count (lines contents)
No need for macros for this. Just passing anonymous code blocks easily.

Interestingly, the type of withFile, after it's given the filename and filemode args, is:

  (Handle -> IO a) -> IO a
Which is the type of a CPS'd computation. CPS'd computations are called the Cont monad in Haskell, which is defined as:

  data Cont r a = Cont ((a -> r) -> r)
So the above type of withFile can be written as:

  Cont (IO a) Handle
And if we have, for example, multiple resources we're bracketing over, we can represent them as multiple Cont values. Then we can monadically compose them, which is equivalent to Python's "nested" function (except we also have type safety).
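(For reference, the Python counterpart of composing several bracketed resources is contextlib's ExitStack, the successor to the old nested; a rough sketch, with placeholder file names:)

    from contextlib import ExitStack

    # Compose any number of bracketed resources; all are released
    # in reverse order on exit, even if an exception is raised.
    with ExitStack() as stack:
        files = [stack.enter_context(open(name))
                 for name in ("a.txt", "b.txt")]
        # ... use files ...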


My comment intended to show how macros can allow one to create syntactic abstraction. That one can accomplish X without creating new syntactic abstraction, or that some language already has syntactic abstraction for X, is wholly irrelevant.


Well, I'm always looking for problems that are uniquely well solved by macros.

Lisps pay a dear price to have macros. If their capabilities are covered otherwise (in ways that are not as costly), why have them?


Macros extend the power of the language way beyond its core primitives. For instance, I wrote a macro, TEMPORARY-ASSIGN. I use it like this:

  (TEMPORARY-ASSIGN ((traversing obj) true)
     ... do stuff ...)
In Python, the equivalent code would be:

   old_trav = obj.traversing
   obj.traversing = True
   try:
      ... do stuff...
   finally:
      obj.traversing = old_trav
There's no way to abstract out that pattern in Python. Every time you want to temporarily assign a field or variable, you're stuck writing the above code. Another example:

   (defun foo (x y) ...)
is how you define a function in Common Lisp. I wrote a macro, DEFUN-CACHE

   (defun-cache foo (x y) ...)
which is the cached version. In Python, you can do the same with decorators, but that's one more tacked-on feature. Lisp programmers have been writing defun-cache for 40 years.
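(For comparison, the Python decorator version is only a few lines as well; a rough sketch of mine, not anyone's canonical implementation:)

    import functools

    def cached(fn):
        # Memoize fn on its positional arguments.
        memo = {}
        @functools.wraps(fn)
        def wrapper(*args):
            if args not in memo:
                memo[args] = fn(*args)
            return memo[args]
        return wrapper

    @cached
    def foo(x, y):
        print("computing...")   # appears only on a cache miss
        return x + y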

If you want to learn more, Paul Graham's On Lisp is the definitive book on the topic. You can download it for free http://www.paulgraham.com/onlisp.html, and it's very readable, even if you're not a Lisper.


> There's no way to abstract out that pattern in Python.

I'm sure there are macros that can't be abstracted out in Python but this isn't one of them:

    from contextlib import contextmanager
    
    @contextmanager
    def temp_assign(obj, attr, val):
        old_val = getattr(obj, attr)
        setattr(obj, attr, val)
        try:
            yield
        finally:
            # restore even if the body raises, like the macro's finally
            setattr(obj, attr, old_val)
    
    class X:
        pass
    
    x = X()
    x.a = 1
    with temp_assign(x, "a", 2):
       print x.a # prints 2
    print x.a # prints 1


I think it's an extraordinary strength of Python that I hadn't seen your code when writing mine but that other than two variable names they're identical. Leaving my comment up for demonstration of this.


Ha, awesome! I went for an exact transliteration although if I were to use this idea for real I would probably do the assignment explicitly in the body. I think this looks a bit more pythonic:

    @contextmanager
    def restoring(obj, attr):
        old_val = getattr(obj, attr)
        try:
            yield
        finally:
            setattr(obj, attr, old_val)
    
    x.a = 1    
    with restoring(x, "a"):
       print x.a
       x.a = 2
       print x.a
    print x.a


Cool, I didn't know that could be done. Can it work for global and local variables as well?


You can do anything you want with a context manager, it's just Python. IIRC, they were first added to the language to get rid of boilerplate while acquiring/releasing locks to make multi-threading easier.
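For module-level globals, for instance, something like this works (a sketch of mine; true locals are trickier, since they aren't attributes you can reach from outside):

    from contextlib import contextmanager
    import math

    @contextmanager
    def patched(module, name, value):
        old = getattr(module, name)
        setattr(module, name, value)
        try:
            yield
        finally:
            setattr(module, name, old)

    with patched(math, "pi", 3):
        print(math.pi)   # 3
    print(math.pi)       # back to 3.14159...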


I think I don't understand what your macro does, because the Python code is easily abstractable with a context manager:

  from contextlib import contextmanager

  @contextmanager
  def traverse(obj, attr, val):
    cached = getattr(obj, attr)
    setattr(obj, attr, val)
    try:
      yield
    finally:
      setattr(obj, attr, cached)

  with traverse(obj, "traversing", True):
    # ... do stuff ...
Could you explain what your example does that I've missed?


You're right that his example is possible in Python, but that's only because with happens to be part of the language. If it were not, you couldn't write my_with in Python code alone.

Meaning that the next time you need a feature that doesn't exist in Python, you can't add it.

You should cut these examples some slack; in reality, it's going to be hard to come up with a five line Python example that's ugly, because Python is quite a nice language, and most rough edges have been sanded down over the last 20 years.

That doesn't mean the techniques aren't useful in real world programs, like when you need to build a DSL - just that they're hard to explain in a dozen line HN comment.


If you've ever used C#, imagine being able to implement LINQ in pure C# (as in, it's not part of the language, but the language itself gives you the ability to add it with the exact same syntax as it has as part of the language). That is what macros give you. You can extend the language's syntax to your liking.

If you haven't used LINQ, then I'd have a hard time thinking of another example, since most languages have fairly uniform syntax.


Although LINQ is basically pure syntactic sugar. I actually prefer the method syntax, as it's consistent with the rest of the language. Extension methods are all that is needed for it.

http://msdn.microsoft.com/en-us/library/bb397947.aspx


Seems like Slava Akhmechet's worldwide celebration.


Total ignoramus here: Does this "profound enlightenment" actually lead to profound execution?


As I've written in another comment, I have never experienced the profound enlightenment, but what do you mean by "profound execution"?

I can tell you that Common Lisp gets me the quickest results going from idea to prototype; it is a very practical language and doesn't get in the way. However, a large part of this is experience. It was the most fun language for me to learn and apply to projects, though.


This is my third time reading this article; this time I stopped reading after a few paragraphs, but still skimmed it to refresh some things in memory. This is a very good article, and one I would recommend to anyone to read, were it not for its length; these days I guess half of the responses would be "tl;dr", sadly.

It's one of the articles that convinced me to take a look at Lisp a few years back, among others, which caused me to learn Scheme rather than Common Lisp or Emacs Lisp (I think Clojure was not around yet then). I invested half a year to learn PLT Scheme/Racket and felt enlightened quite a few times along the way. First-class continuations were the most mind-blowing thing and I spent a few weeks trying to understand them. To prove to myself that I knew what call/cc (or rather, its delimited brethren) was all about, I wrote Python-style generators using them, and this was one of the most rewarding experiences in programming for me.

Then I moved on, to Erlang IIRC, which was much easier to understand and use after being exposed to Scheme. In the following years I learned many more languages, all the while aiming for "purity" of the concepts and knowing full well that I wouldn't be able to use any of them in the real world. Many programmers would call Smalltalk a toy language, at best, but I had a great time learning it and expanding my views on OOP, for example. I thought that the compromises that widely used languages make cause these languages to represent only a piece of what is possible, even if they are called "multi-paradigm", and I wanted to explore more.

All this time I was writing commercial software in Python; I can't say if the other languages I learned made me a better programmer, from the business perspective at least, but some really helped me expand my understanding of what I do. Forth and Lisp and Smalltalk did this, and I was perfectly happy to stop using any of them after gaining some "enlightenment". They were not practical, not made for the real world; they were there just to prove and demonstrate some kind of point, some perspective.

This past week I couldn't work due to health problems and suddenly, after a few years of almost continuous work, I found myself bored. I thought, hell, why not? and went to reimplement a tiny bit of what I was working on earlier. I did this using Racket, my first "esoteric" language, so I had quite a few things to relearn (a good thing, too, because the language had evolved in the meantime), but I finally did it (8 hours or so, in one go... they tell me it's not healthy to do this when you're ill, but it was fun).

And it worked. And it looked great. It was much shorter, more elegant, and more performant than the Python version. Certainly, half (or more) of this improvement came from implementing the same thing a second time; but still, I was amazed at how easy and fun it was.

So the next day I decided to create another piece of code in Racket, this time a fresh one, whose output would go straight into the larger system at work. It's good I had a task at hand that could be broken into pieces that small. And again, it worked; I did it in about the same time it would take me in Python, despite the lack of "concurrent.futures" or even a thread-safe queue in Racket. I didn't use continuations or any other obscure features; just higher-order functions, a few macros here and there to simplify error handling and such, and some conveniences over pairs and lists.

I'm not sure what I should think about this situation. It's not a "proof of suitability" for the real world, of course; I'd need to write much more code to even begin to be able to claim that Racket is OK to use at work. But on the other hand I felt bad for ignoring a really good language and environment for such a long time. I should have been trying to use it more and more often, and I didn't because I thought it wasn't meant for that.

But above all, it was fun. Not because I was learning new stuff, like the first time, but because the language made it fun. And, what's almost as important, it worked; I have code that does what it should be doing.

Well, I plan to try using Racket much more often from now on... Maybe someone else will give some Lisp a chance after reading this :)


I passionately hate XML so this could not possibly resonate with me.

I never had the enlightenment he talks about. Actually I think that learning Lisp/Scheme might have made me a bit of a worse programmer in a way. It made me "dread" repetitive code so much that I almost could not do anything with any language that's not highly dynamic.

Anyways.

I had 2 epiphanies with lisp.

1. Macros. Very powerful concept, but in practice difficult to use properly in your code. It's too difficult to reason about what's going on, like, say, if you're maintaining or modifying a set of macros. I think it's useful not as a construct that you would often use in your own code, but as a construct that's very useful for making libraries.

2. Continuations. This is not really related to lisp itself, and can be done in other languages, like javascript[0]. Understanding a continuation as an even higher-level construct than closures... and the fact that Scheme had it built in was very mind-blowing for me.

It makes sense though that a lisp language must have it built-in. It's a concept that's very fundamental to the theory of computation, but in most programming languages it's not explicit at all.

Before continuations, I thought no lisp language could ever have equivalents of "break", "return", or "continue". After understanding continuations, I see that these constructs can be built using continuations as a basic building block.

So this to me suggests that the concept of "continuation" is a very basic and fundamental concept that all students of Computer Science should be familiar with. Unfortunately I was never taught about it in University.
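To make the "building block" claim concrete: even without first-class continuations, an escape (upward-only) continuation, which is all that break and return really need, can be faked in Python with an exception. A rough sketch (names are mine):

    class _Escape(Exception):
        def __init__(self, value):
            self.value = value

    def call_with_escape(f):
        # Call f with a one-shot escape continuation k.
        def k(value=None):
            raise _Escape(value)
        try:
            return f(k)
        except _Escape as e:
            return e.value

    # "break with a value", as a library function instead of syntax:
    def find_first(pred, xs):
        def body(k):
            for x in xs:
                if pred(x):
                    k(x)          # jump out with the answer
            return None
        return call_with_escape(body)

    print(find_first(lambda n: n % 7 == 0, range(1, 100)))  # => 7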

[0]: https://github.com/laverdet/node-fibers


That "in practice" makes it sound like macros are so hard to understand that they're not worth using in real applications, which is definitely not true. Between the facts that (a) one uses them in deliberately restricted ways, (b) one gets increasingly familiar with them, and (c) they are are, token for token, way more powerful than ordinary code, macros end up being used a lot.


Certainly having experience with Lisp/Scheme will make it easier to deal with macros.

As a "newbie" to Scheme (well, actually what I played with was Arc, but I think it belongs to the Scheme family), I was able to write a few macros, and seeing them work in action was very nice indeed.

The tricky part is maintaining the macros or changing their behavior. It's like the saying goes: debugging is twice as hard as writing code, so if you write code as cleverly as you can, you are by definition not smart enough to debug it.


Interestingly, Oleg Kiselyov recently argued for not including full, undelimited continuations in new languages - http://lambda-the-ultimate.org/node/4586


That's not a problem though: delimited continuations are more expressive than undelimited ones anyway and often are a more natural way to solve programming problems.


Re: macros, this is why tool support for macros is important. Many good Lisps come with macro debuggers that let you reason about the macro expansion. A good example is Racket's macro stepper: http://www.ccs.neu.edu/racket/pubs/cf-sp09.pdf


1. Macros. The Common Lisp version has these problems, but the Racket guys have hygienic macros, which are much better. I have probably not fully understood them, but the most important thing is imho to use lexical binding even inside macros.

2. Continuations. Higher level than closures? The real understanding of continuations comes from the implementation, imho. You implement your stack frames as a garbage-collected tree data structure, instead of the C way of a memory blob.


Replacing XML with YAML would make it much clearer and much shorter.

The concepts of bindings (of symbols to values) and of lexical scope (frames of the environment) must be described.

DSLs must be introduced to show how a list structure and uniform function application syntax glue everything together.

The much better advice: read SICP, for Christ's sake.) The people who wrote it spent much more time thinking about which ideas to illustrate, in what order, and why.

Then watch the Lectures, to feel the bliss.)

The true peace of mind comes after you finish reading On Lisp and then the contents of arc3.tar

Before that, it is still just like being blinded and puzzled by a sudden flash of premature enlightenment.)


Graham's On Lisp as free PDF: http://lib.store.yahoo.net/lib/paulgraham/onlisp.pdf

Structure and Interpretation of Computer Programs as free HTML: http://mitpress.mit.edu/sicp/full-text/book/book.html



Yes, XML brings back the terrible XSL experience; YAML would be pythonesque.

1. To beginners, we need to explain why code/data unity opens up broad possibilities and why separation is simplistic. Otherwise some people will claim that the best LISP DSL you'll write will end up separating your data from your code, and tell you the virtues of the von Neumann architecture.

2. Also, the separation between macro-expansion time and run time for non-interpreted LISP seems to be a restriction: if all macros are to be defined at design time and expanded at macro-expansion time, the advantages of macros over languages without them seem limited. Namely, macros seem to be a way to modularize code by generalizing, and simpler languages may do that with text editing and module/source-code organization features.



