> However, the point of all this discussion is that Lisp doesn't provide good enough language servers for modern editors,

They have used the "language server" model for a long time, since before Microsoft's LSP existed. Thus the need isn't really there, unless one wants to develop with Microsoft products (or similar) and get the needed extensions for Lisp into those Microsoft-driven standards. If there were a real need and a real benefit, there would be better adoption on the Lisp side.


> That's exactly what I get out of the IDEs I've used for decades. This whole argument is so weird, disingenuous, and full of vague strawmen. Just because I don't see the value in investing the time that you have doesn't mean that I'm for giving up control or whatever you were trying frame me as.

I have also used IDEs all the time. Makes sense to me.

> Anyway, I'm done here. This site is full of toxic people who are offended that someone doesn't choose the same editor and habits that they do, and any dissent is to be punished.

Yeah, that's weird. You can use all kinds of editors you like. In the case of Lisp, there are arguments that good support for editing Lisp and for interaction with a Lisp runtime is useful (since Lisp is often used interactively). But several IDEs and editors can do that.

Just wanted to give you the impression that there are others who like IDEs and similar tools. ;-)


That has been done in the past and was never widely supported. But the landscape is now slightly different. One approach might be to use ASDF (the most popular system management tool), which would then need to be upgraded to use versions (and ideally be integrated into a repository mechanism -> Git or similar... Also, what are versions in a decentralized world?). ASDF operations would then need to be able to deal with version numbers.

A Lisp image would then know/track which versions of what code it has loaded/compiled/... Version information would be stored with the other source files.
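
As a rough sketch of where that could start: ASDF's defsystem already accepts a :version attribute and minimum-version dependencies, which such a mechanism could build on (MY-SYSTEM and its components here are made up):

  ;; A minimal sketch, assuming a hypothetical MY-SYSTEM.
  ;; ASDF already records :VERSION and can express minimum-version
  ;; dependencies; full version tracking would need more than this.
  (asdf:defsystem "my-system"
    :version "1.2.3"
    :depends-on ((:version "alexandria" "1.4"))
    :components ((:file "packages")
                 (:file "main" :depends-on ("packages"))))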


> That has been done in the past

Could you share some examples of how the old systems did this?


Can't say what Interlisp did, but here is how the MIT Lisp Machine, specifically the commercial variant Genera, handles it.

Files are versioned. The Lisp Machine file system has versions (same for DEC VMS and a few other file systems). The file version is attached to the name&type. Thus editing a file and saving it creates a new file with the same name&type and an updated version number. I can edit/load/compile/open/... a file by specifying its version number. If no version number is specified, the newest file version is used.
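
(As an aside, standard Common Lisp pathnames still carry a :version component for exactly such versioned file systems; a tiny sketch, with a made-up file name:)

  ;; :NEWEST selects the latest version on a versioned file system;
  ;; most modern file systems simply ignore the component.
  (make-pathname :name "foobar" :type "lisp" :version :newest)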

A site has a shared namespace server with a system registry, and several file servers.

Code, documentation, etc. is defined as a system, similar to what ASDF does. A system has major and minor version numbers and other attributes (like being experimental or being released). Each time one does a full compile on a system, its major version gets updated and it tracks which file versions it used for that compilation. Whenever one creates a patch (a change of the sources) to a system, the minor version gets updated. After 12 full compiles and 30 patches to the latest version we have system FOOBAR 12.30. This data is recorded in the system directory, thus outside of an image (called a World in Genera) in the file system.

Now I can use that system FOOBAR in another Lisp and "Load System FOOBAR :version Newest" -> This will load system FOOBAR 12.30 into its runtime. Then new patches may appear in the system. I can then in a Lisp say "Load Patches FOOBAR" and it will look into the system directory for new patches to the current loaded major version and load those patches one by one. This changes the running Lisp. It then knows that it has loaded, say, FOOBAR 12.45, what files it has loaded and in which files the respective code (functions, ...) are located.

If I want to ship a system to a customer or someone else, I might tag it as released and create a distribution (with or without sources and documentation). The distribution is a file (possibly also a tape, a bunch of floppy disks, a CD-ROM, or similar). A distribution can contain one or more systems and their files (in several versions). On another Lisp I can later restore the distribution or parts of it. The restored systems can then be loaded at a specific version. For example I can load a released system version and then load patches for it.

A saved image knows which versions of what systems it includes. It also knows where the sources of each function are. It may also have corresponding documentation versions loaded.

Later a source versioning system (called VC) was added, but I haven't used that.


There is already confusion. Things are different and the same words (executable, native, image, ...) mean slightly different things.

In the CL world it is usually relatively easy to create an executable. For example in SBCL I would just run SBCL and then save an image and say that it should be an executable. That's basically it. The resulting executable

* is already natively compiled, since SBCL always compiles to native code -> all code is AOT-compiled to machine code

* includes all code and data, thus nothing special needs to be done to the code -> the code runs without changes

* includes all the SBCL tools (compiler, REPL, disassembler, code loader, ...), thus it can be used for development -> the code can still be "dynamic" for further changes

* starts fast

Thus I don't need a special VM or special tool to create an executable and/or AOT-compiled code. It's built into SBCL.
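
Roughly, it looks like this (MAIN and the file name "hello" are made up for the sketch):

  ;; A minimal sketch: build a standalone executable from a running SBCL.
  ;; MAIN is a hypothetical entry point; SAVE-LISP-AND-DIE never returns.
  (defun main ()
    (format t "Hello from a saved image~%"))

  (sb-ext:save-lisp-and-die "hello"
                            :executable t
                            :toplevel #'main)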

The first drawback: the resulting executable is as large as the original SBCL was plus any additional code.

But for many use cases that's what we want: a fast starting Lisp, which includes everything precompiled.

Now it gets messy:

In the real world (TM) things might be more complicated:

* we want the executable to be smaller

* we want to get rid of debug information

* we need to include libraries written in other languages

* we want faster / more efficient execution at runtime

* we need to deliver the Lisp code&data as a shared library

* we need an executable with tuned garbage collector or without GC

* the delivery structure can be more complex (-> macOS application bundles for multiple architectures)

* we want to deliver for platforms which impose restrictions (-> iOS/Apple for example doesn't let us include a native code compiler in the executable, if we want to ship it via the Appstore)

* we want the code&data be delivered for an embedded application

In the CL world that's usually called delivery -> creating a delivered application that can be shipped to the customer (whoever that is).

This was (and is) typically what commercial CL implementations (nowadays Allegro CL and LispWorks) have extensive tooling for. A delivered LispWorks application may start at around 7MB, depending on the platform. But there are also special capabilities in ECL (Embeddable Common Lisp). Additionally there were (and still are) specialized CL implementations, embedded in applications or used as special-purpose compilers. For example some of the PTC Creo CAD systems use their own CL implementation (based on an ancestor implementation of ECL), run several million lines of Lisp code and expose it to the user as an extension language.


> not to include type declarations because they can lead to a messy middle ground?

What? Type declarations in CL (which came from prior Lisp dialects) were added so that optimizing Lisp compilers can use them to create fast machine code on typical CPUs (various CISC and RISC processors). Several optimizing compilers have been written taking advantage of that feature. The compiler of SBCL would be an example. SBCL (and CMUCL before it) also uses type declarations as assertions. So both the SBCL runtime and the SBCL compiler use type declarations.
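
A tiny sketch of what that looks like in SBCL (ADD3 is a made-up example):

  ;; The FIXNUM declaration lets SBCL generate fixnum arithmetic;
  ;; at (SAFETY 0) it is trusted, otherwise it is also checked at
  ;; runtime (i.e. used as an assertion).
  (defun add3 (x)
    (declare (fixnum x)
             (optimize (speed 3) (safety 1)))
    (the fixnum (+ x 3)))

  ;; (disassemble #'add3) shows the generated machine code.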

> why then it can't be 'hosted' like Clojure?

ABCL does not exist?

https://abcl.org


> ABCL does not exist?

I've only played with Clojure (not used it professionally, I'm working with Scala) but Clojure interop with Java is way better than what I can see here: https://abcl.org/doc/abcl-user.html The way it's integrated with the host platform makes it better for most use cases IMHO.


> The way it's integrated with the host platform makes it better for most use cases IMHO.

That may be. ABCL runs on the host platform and can reuse it, but it aims to be a full implementation of Common Lisp, not a blend of a subset of Lisp plus the host runtime. For example one would expect the full Common Lisp numerics.

One of its purposes is to be able to run portable Common Lisp code on the JVM. Like Maxima or like bootstrapping the SBCL system.

There is a bit more about the interop in the repository and in the manual:

https://abcl.org/releases/1.9.2/abcl-1.9.2.pdf
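
To give a rough flavor of that interop, a minimal sketch using ABCL's JAVA package (see the manual for the actual API surface):

  ;; Create a Java object and call methods on it via JNEW/JCALL.
  (let ((list (java:jnew "java.util.ArrayList")))
    (java:jcall "add" list "hello")
    (java:jcall "size" list))  ; => 1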


I didn't say "type declarations can lead to a messy middle ground in Common Lisp" - obviously they exist there for a reason, but, maybe they DON'T exist in Clojure, also for good reasons, no?

ABCL does exist, sure, and there's also LCL for Lua. Yet 8 out of 10 developers today, for whatever reasons, would probably use Fennel to write Lispy code targeting Lua, and probably more devs would choose Clojure (not ABCL) to target the JVM. That doesn't make either Fennel or Clojure "far superior" to Common Lisp, and vice versa.


> I didn't say "type declarations can lead to a messy middle ground in Common Lisp" - obviously they exist there for a reason, but, maybe they DON'T exist in Clojure, also for good reasons, no?

What were those reasons?

> ABCL does exist, sure,

Would that count as a hosted implementation?


> Lots of us hated doing assembly language programming but had no real alternative.

I kind of fail to see Lisp as an alternative to assembler on mid 80s micros.

Though, there were several cheap Lisps for PCs...


The bank-switched memory architectures were basically unused in mid-80s micros (C128, CoCo3, etc.).

Lots of utility software, spell checkers and the like, still existed. These would be trivial to implement in Lisp but are really annoying in assembler.

Lisp would have been really good relative to the BASIC interpreters of the time--especially since you could have tokenized the atoms. It also would have freed people from line numbers. Linked lists work well on these kinds of machines. 64K is solid for a Lisp if you own the whole machine. You can run a GC over a 16K bank of memory in about 50 milliseconds or so on those architectures.

Had one of the Lisperati evangelized Lisp on micros, the world would look very different. Alas, they were off charging a gazillion bucks to government contracts.

However, to be fair, only Hejlsberg had the correct insights from putting Pascal on the Nascom.


> Lisp would have been really good relative to BASIC interpreters at the time

I see no evidence for that. Lisp was a pain on tiny machines with bad user interface.

> 64K is solid for a Lisp if you own the whole machine.

I had a Lisp on an Apple II. It was a useless toy. I was using UCSD Pascal and Modula 2 on it. Much better.

I had Cambridge Lisp on an Atari with 68k CPU. It was next to unusable due to frequent crashes on calling FFI functions.

The first good Lisp implementation I got was MacScheme on the Mac and then the breakthrough was Macintosh Common Lisp from Coral Software.

> Had one of the Lisperati evangelized Lisp on micros

There were articles, for example in Byte magazine. Lisp simply was a bad fit for tiny machines. Lisp wasn't very efficient in small memory. Maybe with lots of work one could have implemented a tiny Lisp in assembler. But who would have paid for it? People need to eat. The tiny Lisp for the Apple II was not usable, due to the lack of a useful programming environment.

> Alas, they were off charging a gazillion bucks to government contracts.

At least there were people willing to pay for it.


> There were articles for example in the Byte magazine.

And they were stupid. Even "good" Lisp references didn't cover the important things like hashes and arrays. Everybody covered the recursive crap over and over and over ad nauseam while people who actually used Lisp almost always sidestepped those parts of the language.

> I had a Lisp on an Apple II. It was a useless toy. I was using UCSD Pascal and Modula 2 on it. Much better.

And yet UCSD Pascal was using a P-machine. So, the problem was the implementation and not the concept. Which was exactly my point.

> At least there were people willing to pay for it.

Temporarily. But then it died when the big money went away and left Lisp all but dead. All the while all the people using languages on those "toys" kept right on going.


> And yet UCSD Pascal was using a P-machine. So, the problem was the implementation and not the concept. Which was exactly my point.

My point is that implementations don't come from nothing. You can't just demand that they be there. They have to be invented/implemented/improved/... Companies at that time did not invest any money in micro implementations of Lisp. I also believe there was a reason for that: it would have been mostly useless.

> Temporarily. But then it died when the big money went away and left Lisp all but dead. All the while all the people using languages on those "toys" kept right on going.

Lots of toys, and the languages for them, died.


> The OP has a good criticism of why this is a bad idea. It's an old idea, mostly from LISP land, where early systems saved the whole LISP environment state. Source control? What's that?

Symbolics Genera can save (incremental and complete) images (-> "Worlds"). The image tracks all the sources loaded into it. The sources/files/docs/... of the software are stored on a central (or local) file server.

I can for example start an initial world and load it with all the wanted software in the various versions I want. Maybe I save a new world from that.

I can also start a pre-loaded world and incrementally update the software: write patches, create new minor/major versions, load patches and updates from the central server, install updates from distributions, ... Maybe save new worlds.

The "System Construction Tool" tracks what code is loaded in what version from where.


actually, according to the LOOP syntax, the REPEAT clause has to follow the FOR clause...


Just a silly example, but it does work on SBCL at least.


A bunch of things work in implementations but are not standard-conforming.
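
To illustrate the clause-order point (both forms are just toy examples):

  ;; Per the ANSI LOOP grammar, variable clauses (FOR/WITH) precede
  ;; main clauses such as REPEAT.
  (loop for i from 0 repeat 3 collect i)  ; conforming => (0 1 2)
  (loop repeat 3 for i from 0 collect i)  ; accepted by SBCL, not conforming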


> if he was using Clojure he wouldn't be having the problems with nconc that he talks about"

Yeah, one would write the implementation in Java.

Common Lisp (and Lisp in general) often aspires to be written in itself, efficiently. Thus it has all the operations that a hosted language would otherwise get from the imperative/mutable/object-oriented language underneath. That's why CL implementations may have type declarations, type inference, various optimizations, stack allocation, TCO and other features - directly in the language implementation. See for example the SBCL manual. https://sbcl.org/manual/index.html

For example the SBCL implementation is largely written in itself, whereas Clojure runs on top of a virtual machine written in a few zillion lines of C/C++ and Java. Even the core compiler is written in 10KLOC of Java code. https://github.com/clojure/clojure/blob/master/src/jvm/cloju...

The SBCL compiler, by contrast, is largely written in Common Lisp, incl. the machine code backends for various platforms. https://github.com/sbcl/sbcl/tree/master/src/compiler

The original Clojure developer made the conscious decision to inherit the JIT compiler from the JVM, write the Clojure compiler in Java and reuse the JVM in general -> this reuses a lot of technology maintained by others and makes integration into the Java ecosystem easier.

The language implementations differ: Lots of CL + C and Assembler compared to a much smaller amount of Clojure with lots of Java and C/C++.

CL has a lot of low-level, mutable and imperative features for a reason. It was designed that way, so that people could write efficient software largely in Lisp itself.
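
A small sketch of the kind of declared, imperative code that enables (SCALE! is a made-up example):

  ;; Destructively scale a specialized vector in place; with these
  ;; declarations SBCL can open-code the float arithmetic and array access.
  (defun scale! (v factor)
    (declare (type (simple-array double-float (*)) v)
             (double-float factor)
             (optimize (speed 3)))
    (dotimes (i (length v) v)
      (setf (aref v i) (* (aref v i) factor))))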


... I remember meeting Rich Hickey at a conference when he'd seen a tweet where I'd favorably compared Clojure to Scala. I had a hard time explaining to him, however, how a professor who wrote FORTRAN programs to simulate the behavior of materials (other academics would be quite shocked when he'd explain that we actually could figure out that iron is magnetic from first principles... I regret missing out on his class on density functional theory as much as I regret not taking a compilers class) told me that a 2x difference in performance mattered so much when you were running big jobs on computers. Thus you weren't going to get everybody impressed with the power of immutability. I am writing a chess engine in Java right now and am completely in tune with, in the long term, not making any objects at all in the inner loop or the next loop out.

But yeah, CL was one of the first languages specified by adults, and it set the standard for later specs like Java's that read cleanly and don't have the strange circularity that you notice in K&R. So many people have this abstract view that a specification should be written without implementation in mind, but really the people behind CL weren't going to spec something that wasn't implementable; clearly they'd given up on the "Lisp machine" idea and made something that the mainstream 32-bit machine could handle efficiently. It's quite beautiful and had a larger impact on the industry than most people admit.

(I think of C as transitional between failures like PL/I and modern languages like CL and Java that can be specced out in a way that works consistently.)


With paredit in GNU Emacs:

1) place the cursor on the left parenthesis of the form

2) type paredit-wrap-round: M-(

3) type: if flag

Doesn't appear to be overly complex.
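
With a made-up form, the effect is:

  ;; before, cursor on the opening paren:
  (do-something x)
  ;; after M-( and typing "if flag":
  (if flag (do-something x))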


With built-in emacs functionality you can do:

1. Place the cursor on the left parenthesis of the form (same as you wrote)

2. C-M-Space to select the form.

3. M-( to surround with parentheses.

4. type "if flag" (same as you wrote for #3)

One extra step, but no need for a plugin.

Also, I added a simple "insert-quotes" that I think I mostly copy-pasted from the built-in "insert-parentheses":

  (defun insert-quotes (&optional arg)
    "Like `insert-parentheses', but insert a pair of double quotes."
    (interactive "P")
    (insert-pair arg ?\" ?\"))
So I can replace my #3 above to wrap something with quotes instead of parentheses.


> So I can replace my #3 above to wrap something with quotes instead of parentheses.

That's M-" .


Interesting, I did not know that! But, the one I wrote mimics how insert-parentheses works, so I will probably stick to my version.


Typing "(if flag", and then C-Right to slurp the call in is also intuitive and short.

What truly confuses me is how the PP claims they "went in completely" and "did this for a few months", yet they failed to learn the basics.

