In Praise of APL: A Language for Lyrical Programming (1977) (jsoftware.com)
117 points by bladecatcher on Dec 15, 2018 | 57 comments



> I am firmly convinced that APL and LISP are related to each other along an important axis of language design and that acquiring simultaneous expertise in both languages is possible and desirable for the beginning student. Were they unified, the set of tasks that succumb to terse, flexible and expressive descriptions will enlarge enormously without overly increasing the intellectual burden on the student over his initial 16 week contact period.

In the context of setting the objectives for education in computer science for general students, I like the idea that the objectives are to (a) understand the limits and potential of computation and (b) achieve fluency in programming such that one can conceive of and describe computational processes for a wide range of tasks.

He explicitly suggests that fluency in using other people's programs not be an objective. Perhaps this is a bit idealistic, since most programming in the wild today consists of gluing together other people's programs. But it seems like the right objective when it's attainable; in some ways it is a view of computer science as a liberal art.


There's another link between Lisp and APL in the article:

"Some years back, we had a visit at Carnegie from a person at MIT whose name I've forgotten. He started to give us a lecture in a little office about some programming issues in LISP. He went up to the blackboard and he spoke LISP. Everything he wanted to describe, he described in terms of parentheses and CONS and CARS and CDRS. He found himself quite capable of expressing his ideas in the language in which he programmed. Not once during the half hour or so that he lectured to us did I see the inevitable block diagram, the flow charts that show up on the blackboard with things written in semi-English. He didn't need them. And at the time I said to myself, "LISP has a very precious character, if indeed there are some people who can express programming ideas to other people in the language in which they program.

"I can't do that with ALGOL; never have I been able to do it with ALGOL. Whenever I've programmed in ALGOL and I've wished to make some statements about the program I was writing, I was forced to go outside the language and use English, or mathematics, or some block diagrams or what-not.

"In APL, I find that to a far greater degree than any other language that I've used, I can make statements about the programs that I'm writing, in APL -- actually not exactly APL, but APL with some nice little extensions that I dream up at the moment but would never think of implementing. But by and large, I find that the language allows me to express myself, in the language, about the things I'm dealing with. I find that a very precious property of a programming language."


> but APL with some nice little extensions that I dream up at the moment but would never think of implementing.

Does this mean APL without extensions is quite limiting?


> Joel Moses has been credited with coining the phrase in the 1970s

> APL is like a beautiful diamond – flawless, beautifully symmetrical. But you can't add anything to it. If you try to glue on another diamond, you don't get a bigger diamond. Lisp is like a ball of mud. Add more and it's still a ball of mud – it still looks like Lisp.

> Moses strongly denies this, claiming he instead called Lisp a bean bag because it always returns to its original shape.


Can't stop laughing, even reading them the second time. Great quote.


APL is a read syntax for point-free function chaining in which the operators have one-character names, and may be juxtaposed without intervening whitespace.
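
For instance, the arithmetic mean as a point-free "train" (a minimal Dyalog-style sketch; avg is my name, not a built-in):

    avg ← +⌿÷≢      ⍝ a fork: (sum of the argument) ÷ (its tally)
    avg 1 2 3 4     ⍝ → 2.5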

Relative to APL, the interesting matters in Lisp tend to lie on the other side of the AST hill.


What language do you believe can live on both sides of the hill? Haskell?


The APL family is a special beast. A very interesting article that got me hooked: "Ruins of forgotten empires: APL languages"

https://scottlocklin.wordpress.com/2013/07/28/ruins-of-forgo...


Nice one, thanks so much!


Related article posted on HN recently:

APL is more French than English [https://news.ycombinator.com/item?id=18640451]


> But at that time APL was not running on any computer; and he stoutly insisted that it was unnecessary that it ever run on a computer. It was for him a notation with which he could express algorithmic concepts; and for him at that time, that seemed sufficient.

Indeed. Because both APL and Lisp began as languages for describing computational ideas, it stands to reason that they would be particularly fit relative to languages that evolved under other fitness criteria.

I was intrigued to note the opposition to APL mentioned in the article by Dijkstra and others, on the basis of a different mental model of computation.


Not only is that article related, it's the same article.


In college, we had a class about high-level languages and the teacher decided to go with APL. Many of my colleagues complained they'd never use APL professionally and that some other language, with more ready practical use (we already had FORTRAN and everyone was fluent in BASIC) would be a better choice. They were right, of course, in that they'd never use APL professionally. They were stupendously wrong in that it was a profoundly enlightening experience.

Once you learn APL, you gain a super-compact mathematical notation in which to express computation.

You also learn the value of extensive comments and of avoiding being clever - if you try to be clever, you won't understand your program 5 seconds after having written it.
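
Both points in one line: the well-known APL primes one-liner (a Dyalog-style dfn, assuming the default index origin). Every symbol pulls its weight, and without the comment it's opaque five seconds later:

    primes ← {(~R∊R∘.×R)/R←1↓⍳⍵}   ⍝ drop 1, mark all products, keep what's left
    primes 15                      ⍝ → 2 3 5 7 11 13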

And, if you are really lucky, you'd have learned what a beam-spring keyboard feels like. :-)


Now that I feel the bite of my sedentary lifestyle on my bum, I'm more and more attracted to the idea of a language that takes 1/10th the typing of what's typical today. However, afaict languages following APL's lead specialize in math, so I wonder if the approach could be adapted to a more general kind of coding.

Even though the set of short symbols has to be limited, don't we currently have a few dozen often-repeated operations in language keywords and standard libraries? (Especially in the approach of e.g. Clojure, which relies heavily on combining standard transformations on structures.)


> However, afaict languages following APL's lead specialize in math, so I wonder if the approach could be adapted to a more general kind of coding.

I use k/q regularly, and I'm not using it for "math".

The compact notation creates value by helping you write correct programs. See [1] and [2] specifically.

[1]: https://news.ycombinator.com/item?id=8476294

[2]: https://news.ycombinator.com/item?id=8476702

You can write C in a dense style as well. And I do. When I do this, I can see opportunities for reuse that I cannot see if I spread my C program across multiple pages and multiple files. Here is the bulk of my webserver[3], which will beat the pants off any other webserver -- substantially faster than nodejs or kdb's own webserver[4], and probably nginx or anything else you've got. (PS: If you think you know of a faster one, I'd like to know about it.)

I am telling you I can only do this because the code is small.

[3]: https://github.com/geocar/dash/blob/master/d.c#L63

[4]: https://github.com/geocar/dash#performance


Node.js is not the thing to compare C web server performance to. Like, it's so much not the thing, it goes beyond funny and wraps around to sad.

Let me refer you to the TechEmpower framework benchmarks: https://www.techempower.com/benchmarks/#section=data-r17&hw=...

Look at the language column there, you'll be surprised.


Here's fasthttp running on my machine (best of three):

    $ wrk -t2 -c90 -d9s http://localhost:8080/plaintext
    Running 9s test @ http://localhost:8080/plaintext
      2 threads and 90 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency   831.78us  364.48us   7.56ms   70.19%
        Req/Sec    40.55k     3.31k   48.04k    74.44%
      726417 requests in 9.01s, 87.98MB read
    Requests/sec:  80603.64
    Transfer/sec:      9.76MB
which I got by checking out https://github.com/TechEmpower/FrameworkBenchmarks.git, disabling the mysql connection, and running frameworks/Go/fasthttp's ./server-mysql (which is what the benchmark script seems to do). I thought this would be easier than getting dash running under the TechEmpower harness.

and here's dash running with the kdb networking disabled (best of three):

    $ wrk -t2 -c90 -d3s 'http://127.0.0.1:8080/?f=204&k=hi&v=1'
    Running 3s test @ http://127.0.0.1:8080/?f=204&k=hi&v=1
      2 threads and 90 connections
      Thread Stats   Avg      Stdev     Max   +/- Stdev
        Latency   787.72us  213.62us   3.49ms   71.85%
        Req/Sec    44.82k     3.04k   60.44k    83.61%
      271946 requests in 3.10s, 16.08MB read
    Requests/sec:  87671.23
    Transfer/sec:      5.18MB
My laptop isn't a beefy "Dell R440 Xeon Gold + 10 GbE" -- this is just a loopback test, but it has already disinclined me to spend any more time on it; fasthttp is definitely impressive in how close it gets, but dash is still faster.

And comparing a 100 line C program to hundreds or thousands of lines of go or C or Java is a bit pointless. If the 100 lines of C doesn't do what you want, I'll throw it away and write a different 100 lines. That's what brief programs get you.

NB: I would have tried ulib but it wouldn't even build on my laptop.


If you rewrote [3] to use cleaner variable naming, perhaps the readability would improve?


I personally question the utility of such terseness, and much prefer the verbosity of the Lisp family of languages, where cultural norms make for function names like "number-to-string" and "expand-file-name", rather than the norms I've seen in languages like APL, K, J, Q, OCaml, and Haskell, which seem to love more mathematically-inspired single-letter names like "n" or "k", and various operators in a similar vein.

The Lisp-like way of programming is more appealing to me because it makes the programs very easy to read. You can mostly get a sense of what they're doing just by reading them like ordinary English. I've found this especially useful when I'm trying to understand code I'm not already familiar with, or when looking at my own code months or years from when I wrote it, and it's especially important these days when polyglot programming is commonplace -- I don't need to remember nearly as much of how to do something in Lisp because it's so easy and straightforward and doesn't have the overhead of remembering a whole bunch of specialized syntax.

Contrast this with the terser, more mathematically-inclined languages, where specialized syntax and single-character names are widely used. These tend to be more write-only languages for me, where I have to be constantly steeped in the language in order to make understanding it relatively natural, and if I go away from them for a while, it takes quite a bit of effort to get back in to them enough to make sense of what was written, and reading other people's code is much more of a chore than it is for me in Lisp.

In the old days, or perhaps on embedded systems these days, when one had to closely watch the byte count of one's program lest it not fit into memory, perhaps such terseness was useful. But these days, I'm not yet convinced of its utility outside of a mathematical context, where the mapping of such terse names and operations is more natural.

For me clarity trumps terseness.


For me, terseness is necessary for correctness.

I don't just do this with APLish languages; I do this with C (and lisp, and PHP, and others...)

I've only ever written short programs correctly: If they can fit on a page, I can just look and see whatever bug I might be experiencing.

If someone wants to change my software, it's because they want it to do something that I don't want it to do. They will find value in the fact that it is short: there isn't very much to read. Admittedly, a programmer inexperienced in this method may have some anxieties about it, but given how valuable correctness is, I'd prefer to cause a bit of anxiety in beginners than make programs that need beginners to fix them.


The notation can yield very dense programs. Significant systems have been written that can be expressed in a screen or two of code (search for "Arthur Whitney APL" to find the legendary examples).

When we learn to read, we learn to "sight read" words. APL/J/K constructs can be "sight read" as well.
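
A few such constructs, shown here Dyalog-style for an assumed numeric vector v, each of which a practiced reader takes in as a single word:

    +/ v       ⍝ sum: "plus over v"
    ⌈/ v       ⍝ maximum
    +\ v       ⍝ running total
    v[⍋v]      ⍝ sort ascending: "v at the grade of v"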


Sight-reading is a great analogy because as with music, in order to stay fluent and make one's performance (or sight-reading) easy, one has to stay in constant practice.

I'd argue that much less of this is required for languages like Lisp, unless, of course, you step away from practicing reading English, in which case maybe the very English-like programs of Lisp will start to seem foreign to you.


I think you're just bad at reading pointfree notation, tbh, which is not a particularly interesting commentary on OCaml or Haskell.

Lots of things lend themselves well to pointfree style, and if you do not see them it is because of your own horizons.

You can write OCaml/Haskell with names like doesFileExist when you need to.


"You can write OCaml/Haskell with names like doesFileExist when you need to."

You can, but it's rarely done in practice. This is a cultural issue more than a language issue. OCaml/Haskell programmers just seem to prefer much shorter, more math-like names than Lisp programmers do.

For me this makes a huge difference when reading through code written by the community. Lisp is just much more immediately understandable than OCaml/Haskell and related languages.

Yes, this is probably my own limitation, and I'd probably be a lot better at it given enough practice. But I just don't need that much practice to understand Lisp, and I can go away from it and come back to it much later without needing a significant refresher in the language either.


Interestingly, I've noticed that I tend to use one-letter variables when writing Ruby, and the longest, most descriptive variable names possible when writing C#. And Ruby, I think I've read somewhere, is somewhat close to Lisp. There's the readability of the language itself to weigh in, too: when it's easy to see the important stuff, you can give the important stuff short names. And most important of all is what you're writing about. Strangely, I always remind myself that the only important goal is to write in the slowest, most natural language, and above all the most natural, daily-life, ELI5 logic. But I always end up with the logic looking like pure math notation, such that if I looked at it not knowing I wrote it, I'd think it was unreadable dark-magic math. And I end up there with zero formal math or CS training (only high school), so maybe it's inescapable.


APL is actually kind of bad at math, or at least the kind of math people tend to think of when they talk about Matlab or Fortran being good at math.

What APL does well is filter/select/transform, more spreadsheet-style work than linear-algebra stuff. It's a language for describing computation as meant by the sort of computer scientists who were born when "computer" was a job description.
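
A small Dyalog-style illustration of that spreadsheet flavor (the data and names are invented for the example):

    sales ← 120 45 300 80 210
    mask  ← sales > 100        ⍝ 1 0 1 0 1: which entries qualify
    mask / sales               ⍝ select: 120 300 210
    +/ mask / sales            ⍝ total the selection: 630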


Still, from what I gathered, APL is primarily suited to chewing through bunches of numbers lumped into arrays, in the manner of shaders or DSP, while I haven't seen examples with traditional control flow (and Perlis notes that loops and branches aren't APL's thing), though I didn't look for long.

Come to think of it, that's similar to what's often done to lists in Lisp, outside of the more traditional control flow of 'business logic.' Precisely the filter-select-transform.


APL has map embedded into the language, and reduce is an operator in J. APL is conveniently used by those who don't describe themselves as programmers - let alone computer scientists - but need to get the job done. In this respect, it's similar to Excel, I think.
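
For example, in APL itself (Dyalog-style; J and K spell these differently):

    1 2 3 × 10      ⍝ map is implicit: scalar functions pervade arrays → 10 20 30
    ×/ 1 2 3 4      ⍝ reduce with multiplication → 24
    +/ 1 2 3 4      ⍝ reduce with addition → 10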

I wonder if APL is that much different from Matlab in concept.


Syntactically, Matlab is closer to traditional languages. Where it excels at math is that you can do all sorts of things like solving matrix equations automatically. APL can invert small matrices, but can't automatically pick the best way to solve a system the way Matlab will. APL has great mathematical notation, but is missing the math libraries you typically need for real math work (optimization solvers), which Matlab has via toolboxes and Mathematica has built in.
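
To be fair to APL on the solving front, the "domino" primitive does handle the direct case, just without Matlab's automatic choice of algorithm (a minimal sketch; A, b and x are names I made up):

    A ← 2 2⍴3 1 1 2    ⍝ 2×2 coefficient matrix
    b ← 9 8
    x ← b ⌹ A          ⍝ matrix divide: solves Ax = b → 2 3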


FORTRAN is a great language, but it's not really good at math either. It's used with floats, which are a peculiar and volatile beast, though certainly useful. When I think of "good at math" I think of something like Egison.


Most high performance computing was done with Fortran back in the day (and still used today) because working with arrays and matrices is done at a higher level than C/C++ or Assembly, but it is still a fast language and has easy access to BLAS/LAPACK. When I think of scientific computing, I think of Fortran, Matlab, Mathematica, C, C++, Python + Numpy, and recently Julia. APL could've been great here if the vendors had included low level code to do all the numeric work and used APL as the glue language, but it didn't happen that way and became popular in the finance world instead. It's a shame.


You may want to check out the Co-dfns compiler, which is proof positive that APL is a terrific language for doing things like tree manipulations, which is the traditional domain of Lisp and functional programming languages like Haskell.

https://github.com/Co-dfns/Co-dfns

https://news.ycombinator.com/item?id=13565743

https://news.ycombinator.com/item?id=13638086

https://news.ycombinator.com/item?id=13797797


I hear of APL being used in high-frequency trading and other financial spaces. It's not only because of the language's expressive power, but because the whole runtime fits well within the CPU's instruction cache and, therefore, your program will be ridiculously fast.


"Above all, remember what we must provide is a pou sto to last the student for 40 years, not a handbook for tomorrow’s employment."

At least around here, this strikes a nerve, especially 40 years later. All of the colleges in town prep students for the local tech market rather than the fundamentals that would be advantageous overall, and potentially more so financially out of market. I know stellar remote and small/mid-size-shop engineers in this part of the South, but by and large the schools are completely myopic about this and perpetuate a microcosm of low-paid, enterprise-focused sweatshop cubework. I realize painting it with such language is hyperbole, but it's certainly not the exception to the rule.


I can imagine that a fork of GNU APL, re-written in FORTRAN with multi-threaded and maybe distributed computing support, would be an amazing mathematical tool. Something that's super fast and scales really well, and makes mathematical operations really easy.


I think KDB might be what you're looking for. It's very efficient and widely used in finance. It's pretty expensive, but I believe you can use it for free for personal use (there might be a single-core restriction).

As an aside, KDB devs make bank, like salaries >500k


I've always been curious how people get into this. I've been interested in KDB ever since I read http://archive.vector.org.uk/art10501320, but it's not exactly a language they teach in college or you'd pick up in independent study. Do people just apply to fintech places with relevant experience and then get training for kdb?


Re how to get started: what was effective for me was to pair kdb+/q with another language and constrain how much I tried to do with q itself. q-sql for joining, filtering, pivoting and grouping (particularly date grouping and grouping with custom aggregate functions) is incredibly fast and powerful and also easy to learn. (though certainly, q experts will be able to point out all sorts of sub-optimal things in your queries)

Initially, we used F# to shovel data into kdb and to orchestrate queries and process results -- while keeping our q-sql simplistic -- and still saw astonishing speedups in our data processing. Over time, wherever we needed more speed, we'd do more and more in q itself (the 'Q Tips' book is very helpful).

One can get a lot of power with just a little kdb+/q.


Typically, one would start off on a quant trading desk which uses kdb+ for its analytics, or join First Derivatives [0], which trains graduates.

[0] https://www.firstderivatives.com/


The latter. Like everything else in finance, you have to learn on the job. The good news is that since no one who hasn't worked on Wall St (and even most who have) knows anything about the industry (and that the talent pool is pretty small compared to Silicon Valley, at least for non shit-analyst jobs), they are willing to accept people who don't know much, providing you have the basic chops (basic calculus, statistics, and good programming skills).

I really wish more programmers would get into the industry, since the status quo is pretty terrible. The financial industry needs more quantitative folks so we can kill off the dinosaurs of the "greed is good"/pre-decimalization world :)


I have no idea about it, but Kx Systems have a YouTube playlist updated this year, "Introduction to kdb+ and q" -

https://www.youtube.com/playlist?list=PLypX5sYuDqvrwBD2EMWad...


There seem to be a couple of free versions, a 64-bit and a 32-bit version, with the latter being less restricted than the former.[1]

I was curious about it, but they really don't make it easy to get started. There are no links to any documentation anywhere on their site. The download is hidden behind a licensing agreement and a form asking for the user's personal information. They mention GitHub (in the context of "community support") on their download page, but there are no links to any GitHub repos.

[1] - https://kx.com/connect-with-us/download/


https://code.kx.com/q/ has learning material and documentation. Hope it helps.


I highly recommend the “Q for Mortals” book as an excellent introduction to using q, and it's available online:

https://code.kx.com/q4m3/


GNU APL has multi-core support for certain computations, and you can fork APL child processes and interact with them. So not quite multi-threaded, but still very useful for heavy-duty work where you want a great deal of concurrency.


Yes! I'm imagining a value proposition where APL is as fast as C or Fortran on distributed systems, because at that point it would probably evaporate most of Matlab's, Mathematica's, and probably even TensorFlow's user bases.

Can you imagine? APL is an amazingly powerful language. It's easy to write nearly any mathematical function in it. What if it were faster than every other mathematical language? It would probably dominate the market overnight.


You mean Dyalog APL? It already regularly outperforms normal, handwritten C code. It also has support for distributed computing, multi-threading, and the Co-dfns compiler can be used to compile your APL code to the GPU. For example, consider the following talk, which discusses sub-nanosecond lookups/search using the Dyalog APL interpreter.

https://dyalog.tv/Dyalog18/?v=paxIkKBzqBU


Supposedly numpy was inspired in part by J.

I don't think it will ever dominate the market - some things such as FFI remain weak points compared to C.

Plus a lot of what makes Mathematica good is that they have specific algorithms that they've optimized quite a bit, for things like computer algebra &c.


As mruts pointed out above, kdb+/q provides what you're imagining - it provides fantastic distributed computing support via its IPC protocol and also has multi-core support.


Multi-core is priced for big financial institutions, though. The only thing you or I could afford is the 32-bit single-core version.


There is a free 64-bit version available with limits that don't seem harsh at all:

> The 64-bit kdb+ On-Demand Personal Edition is free for personal, non-commercial use. Currently it may be used on up to 2 computers, and up to a maximum of 16 cores per computer, but is not licensed for use on any cloud – only personal computers. It requires an always-on internet connection to operate, and a license key file, obtainable from ondemand.kx.com


You can start the 32-bit instance with the -s switch (so that q is started with multiple slaves) for parallel execution of a function over data using the "peach" command [0].

But yes, I do agree that the 32-bit instances have limited use, as they can't be used in for-profit projects.

[0] https://code.kx.com/wiki/Reference/peach


Is the pricing published? I haven't been able to find it on the website.


It’s definitely one of those “if you have to ask, you don’t need it” kind of things



> First appeared in SIAM News, 1977-06.


Thanks for pointing that out! Year fixed above.



