I learned APL in college and used it professionally for about eight years. I really enjoyed the language. Once you know it well enough, it creates a very interesting flow state. You are able to focus on the problem rather than the mechanics of solving it. It's hard to describe, but you definitely feel the difference when going from APL to something like C, Forth or Lisp.
One of the huge problems with APL back then was that workspaces (the memory and resource sandbox your work is constrained to) were limited in size. The other was the trouble you had to go through in the early days to be able to see and print the character set. On early IBM PC versions you had to change the character generator ROM. You also had to buy very specific printers for which you could buy APL print heads (for example, IBM Selectric).
Today these issues are gone and workspaces are large enough to tackle the vast majority of problems.
There is, however, in my opinion, a serious problem with APL today: It is too expensive.
Yes, there are low-cost or even free APLs, but the ones you want will run you $1,000 to $2,000 per seat. That's ridiculous. If there's a language that would have benefited from FOSS, this is it. I strongly believe that APL adoption would be far wider today if it had a strong FOSS offering.
The other issue is that, probably because it didn't go FOSS, the language didn't evolve into OO. I think it needs to. It also needs a few additional mutations to make it easier to work with and deploy. APL for the web, as a wild thought, could be a very interesting idea. The compression of code would be magnificent to see.
I haven't used APL in quite some time but would jump on it if it made sense. Back in the day I used APL for projects ranging from robotics, to business databases, DNA sequencing and astronomical calculations. It was fun.
What about J? J was developed by Ken Iverson (the creator of APL) as a successor to APL that replaced the special character set with one- and two-character ASCII equivalents. It's not greatly publicized, but the latest J was released under the GPLv3 (http://www.jsoftware.com/source.htm).
With all due respect to Mr. Iverson, whom I had the pleasure of meeting at an international APL conference many moons ago, I think J went in the wrong direction.
Part of the power of APL is in the symbolic representation of concepts. The symbols are important. They form a language in more than one sense. Replacing them with ASCII/unicode equivalents rips the guts out of the language.
Here's a thought experiment: Take mathematical expressions and replace every symbol with two- or three-character ASCII (or whatever) equivalents. The integral symbol might become "integ"; the root symbol, "root"; the first, second and third derivative marks, "der1", "der2" and "der3"; and so on. What have you accomplished? Well, at the very least you've ruined the superbly expressive nature of mathematics through agreed-upon special symbols.
One could do the same with music and the results would be exactly the same.
In general terms, I believe that we don't need to go in that direction. We need to search for and find the programming equivalent of music and math notation.
If you've ever looked at some VHDL code you have seen an example of just how ugly this can get. Whenever I do FPGA work I use Verilog, which feels very much like C.
Defining a register in Verilog looks something like this:
reg [7:0] my_register;
In VHDL:
signal my_register : std_logic_vector(7 downto 0);
It's been a while since I touched VHDL so this example could be off. I don't remember if you have to add IN, OUT or INOUT to the declaration.
The point, though, is that one (Verilog) is compact and to the point, whereas the other (VHDL) beats the crap out of you with unnecessary verbosity. My definition of "unnecessary" is that if the exact same result can be had without the extra stuff, there is absolutely no reason for it to exist. The entire VHDL language is like that; this is just one example. If I remember correctly, the one nice thing about VHDL way back when had to do with parametric generation of code. Later Verilog implementations added these capabilities without making you feel like you were typing a novel.
The long post is to say that notation is important. It is a tool for thought as much as it provides expressive power. J, in my humble opinion, went the wrong way.
To me your comment makes a lot of sense. But you have a visceral feel (that comes from practice) for how useful such notation can be. For people who don't, the cultural stretch is probably just too far to consider.
No doubt even a visionary like Iverson found it difficult to withstand decades of almost-universal criticism about the APL symbol set.
The APL guy in the video remarks wryly at one point that Unicode has made their lives much easier. I'll bet it has!
The analogy to music notation is suggestive. Imagine if people insisted on writing music textually with names like "High C Sharp" or whatever - on the grounds that this was more "readable"!
Replacing them with ASCII/unicode equivalents rips the guts out of the language. … What have you accomplished? Well, at the very least you've ruined the superbly expressive nature of mathematics through agreed-upon special symbols.
I really don't think the small number of keystrokes is what makes this sort of language expressive. Cryptic notation is a cost, not a bonus. Moving from, say, J to C, having to write
log(…)
instead of
^. …
(or ⍟) does not make me feel like I've lost anything. The things that do make me feel constrained are having to explicitly write out loops if I want to apply that operation to an array of numbers instead of just one and not having a convenient way to use that operation in function composition.
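For concreteness, here's a minimal C sketch of the explicit loop in question (the array and names are invented for illustration); in J, the same operation is just ^. applied to the whole array:

#include <math.h>
#include <stdio.h>

int main(void) {
    double xs[4] = {1.0, 2.0, 4.0, 8.0};
    double ys[4];

    /* In C, the iteration over the array is spelled out by hand;
       an array language maps the operation over it implicitly. */
    for (int i = 0; i < 4; i++)
        ys[i] = log(xs[i]);

    for (int i = 0; i < 4; i++)
        printf("%g\n", ys[i]);
    return 0;
}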
Indeed, the brevity that comes from a glyph-oriented syntax is orthogonal to that which comes from having array-oriented semantics (e.g., making loops implicit). You could have a word-based syntax, and still have very concise code.
q uses words for its single-argument operators, using "til 10" rather than "!10". It certainly makes a better first impression, but I'm not convinced it matters after a few days. My experience as an enthusiastic k novice is that I recognize idioms from symbols (e.g. "{x@<x}") more directly than I do from words ("fun x at grade-up x end"). Perhaps it's like reading poetry in Kanji rather than letters?
Morgan Stanley used a variant of APL called A+. It was used quite extensively in parts of the firm. They open-sourced it a while back. http://www.aplusdev.org/
I don't remember details, but my colleagues at Merrill Lynch used to say you could look at somebody's APL code and tell if they came from Sharp (Toronto), IBM (Yorktown Heights or San Jose CA), STSC or Morgan Stanley.
In APL2, the equivalent to a Lisp map was "each": a colon rotated to be horizontal and written as a superscript (¨). We used to have "each" contests; I always got destroyed.
And one of my (distant) friends got a summer job in high school, working for Iverson in Yorktown Heights, writing APL.
Oh, right. execve replaces the existing process with the invoked one, so I'm not sure that use case makes sense. Usually execve is used with fork, too.
If you're concerned with Windows portability, mmap(2) is a bigger problem. Otherwise, not sure why you're asking...?
Sometimes I don't want to fork, e.g., when there's nothing waiting for input, as with the last program in a pipe.
prog |prog |exec prog
The usual way to do this is to put it in a shell script, then run the shell script. Though it would be nice in some instances, you can't run exec from a shell prompt or you'll exit the shell. I was thinking that if I'm in the kona console, executed from a parent shell, then I could type exec from the console and run programs without forking, because it's not going to cause kona to exit. Wishful thinking.
To stop forking, you would have to add an option to call execve() instead of system() in the kona source.
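Roughly, the distinction looks like this in C (a sketch, not the actual kona source; the echoed command is arbitrary):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
    /* system() forks a shell, waits for the command to finish,
       and then control returns to this process. */
    system("echo run via system");
    printf("still alive after system()\n");

    /* execve() replaces this process image entirely; on success,
       nothing below this call ever runs. */
    char *argv[] = { "/bin/echo", "run via execve", NULL };
    char *envp[] = { NULL };
    execve("/bin/echo", argv, envp);

    perror("execve");  /* reached only if execve() failed */
    return 1;
}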
It’s certainly a cool language, but this presentation doesn’t really convey it very well. When you want to show me how a language is useful, I absolutely want to hear the boring details, examples of how to manipulate census data, how to serve a web page, and all that. But when you want to show me how a language is cool, I really just want to see gimmicks: neat tricks you can do with the array operations, ways of composing things that are easy in APL that wouldn’t even occur to you in another language, that sort of thing.
I actually got the impression, from the way that Morten put it, that APL is not a terribly good environment for collaboration. It seems best suited to a single person having a “conversation”, if you will, with the APL environment—not necessarily with other developers. I’ve never written anything in APL, let alone in a team, so I could be way off. But then again, if it’s so productive, I’m not sure how much it actually matters if only one guy is working on the software at a time.
Maybe software is only built collaboratively because the tools aren’t powerful enough to let a single developer manage all the complexity. If you have better tools, perhaps you simply don’t need to work in a team.
I've always found APL really intriguing. A professor of mine told me how one of his students used to solve ACM competition problems incredibly fast simply by not having to type very much. There are some insane videos on YouTube of people solving problems in APL [1].
But, you know - it's write once, read... never again.
I can read k, which is similarly dense, but limited to ASCII. You may need to read it a bit more slowly, like math, but it is readable. (I can't read Thai, either, but that doesn't make it unreadable.)
APL was originally designed as a uniform mathematical notation. It was only implemented as a programming language some time later. I don't think classic APL has a future, but there are ideas there worth learning.
To me the biggest lesson from APL is to always try to manipulate the largest data structure you can get your hands on. For example, don't think about for-loops, think about 'map'. When you can do that, life is good.
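In C terms, that shift might look like the sketch below (the map helper and names are invented for illustration): write the loop once, then hand whole arrays to it instead of writing a for-loop at every call site.

#include <math.h>
#include <stdio.h>

/* Apply f to every element of xs, writing the results into ys.
   The loop lives here, once, rather than at every call site. */
static void map(double (*f)(double), const double *xs, double *ys, int n) {
    for (int i = 0; i < n; i++)
        ys[i] = f(xs[i]);
}

int main(void) {
    double xs[4] = {1.0, 4.0, 9.0, 16.0};
    double ys[4];

    map(sqrt, xs, ys, 4);  /* think "map", not "for" */

    for (int i = 0; i < 4; i++)
        printf("%g\n", ys[i]);
    return 0;
}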
The limitations of the language that I saw (in the mid-90s) were around interoperability. Devs saw it as esoteric and steered toward something more 'marketable' like C++ and, later, Java. The extensive use of memory-mapped files for persistence made it 'viral': any app that needed to access the data pretty much had to be built in it. The language (or more likely the library support) was great for pricing algorithms but not so great for business applications filled with corner cases. As I recall, these were the reasons it was phased out at Morgan Stanley.
It was used heavily in the fixed income division. I was there during the phase out and among the A-Plus developers there was an "...out of my cold, dead hands" mentality. They (and there were a lot of them) really loved it.
Very true. And a few remaining A+ devs ended up with high job security, since some systems were very slow to be redone and people were afraid something might break.
I was stuck debugging a C++ application that was just using A+ objects as its wire protocol. I never even managed to get the fonts to display properly. It was horrible. I ended up having to write my own print-these-boxed-values routines, just to see what was going on.
A few years later, and we were using kdb+ and q to develop major analytics. That was actually a lot of fun.
I recently bought "APL: An Interactive Approach" and read it. It seems that most of the good stuff from APL is already present in modern languages, often in better form when lazy evaluation is available. The OO stuff in APL looks kind of crufty. Still, it's a wonderful, fascinating language. In the age of mature open source, the expense of a decent implementation feels very old-fashioned. Btw, this is a fantastic TV segment from days of yore, "The Origins of APL - 1974": http://www.youtube.com/watch?v=8kUQWuK1L4w
I may be being a bit stupid here, but how do you advance the slides on that link? I've never been able to figure out how to use InfoQ presentations. I must be doing something wrong...
I don't know if there's a way to advance them manually, but they are in sync with the video. You should be able to read them by seeking through the timeline.