The episode starts off like a normal interview about software engineering interview struggles, but once it detours into APL territory it becomes pretty damn interesting.
The APL book is a classic programming language book. Expensive to acquire, though. $150+ for a used copy.
I would be genuinely interested in the advantages of APL over languages like Julia or Python with numpy, which also have quite flexible array types. For instance, in Julia I could write:
data = [1,2,3]
# . means element-wise operation
sum(exp.(filter(isodd,data)))
# or, where x |> f is f(x)
data |> x -> filter(isodd,x) .|> exp |> sum
Which looks quite clean to me. But Julia does not have partial function application (just using filter(isodd) above would not work). Is this one of the main differences?
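For what it's worth, you can approximate partial application in Julia with Base.Fix1, which fixes the first argument of a two-argument function; a minimal sketch, using the same toy data as above:

data = [1, 2, 3]
# Base.Fix1(f, x) behaves like y -> f(x, y)
keep_odd = Base.Fix1(filter, isodd)
keep_odd(data)                   # [1, 3]
data |> keep_odd .|> exp |> sum  # same result as the pipeline above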
One reason for learning APL mentioned in the interview is learning to solve problems without branching.
Conor's knowledge of solving problems in a data-parallel fashion using masks and such in APL helped him in his day job building GPU-accelerated data-science tools at NVIDIA.
But it's not just an NVIDIA thing - CPUs can do data-parallel operations and also get slowed down by branching. I don't think this means you should write everything in an array language, but learning how to solve problems under those constraints will teach you important lessons.
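To make the mask idea concrete, here's a tiny Julia sketch (made-up data; the point is that the branchless version does uniform work on every element instead of deciding per element):

# branchy: decide per element inside the loop
function branchy(data)
    total = 0.0
    for x in data
        if isodd(x)
            total += exp(x)
        end
    end
    total
end

# branchless, mask-style: isodd.(data) is a Bool mask, and Bool * Float64
# is either 0.0 or the value, so no per-element branch is needed
branchless(data) = sum(isodd.(data) .* exp.(data))

branchy([1, 2, 3, 4]) ≈ branchless([1, 2, 3, 4])  # true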
You could do worse than look at user "geocar"'s comment history on HN; he mostly talks about the array language K rather than APL, but he argues a lot about code size, conciseness, readability, expressivity, etc. more than about individual operators or functions, e.g. these and the comment trees around them:
I am always fascinated by people complaining about decades-old programming languages which are still alive and well. Some of those commenters seem to be trying really hard to prove APL and its offspring are bad/wrong/etc., but history tells us otherwise. Even NumPy counts the APL family among the language groups that guided its design.
That image manipulation example is really cool! Not knowing much of either J or APL, I do like how the non-ASCII symbols can visually represent the operation in APL, but using ASCII does seem more practical.
There are other things aside from array broadcasting that languages stand to learn from APL. I think the biggest one is looking carefully at APL's clever choice of primitives.
To my knowledge, other languages like Julia and numpy don't offer the same range of index-based primitives, such as grades or classifications, which are used pervasively in APLs.
APLs support, better than other languages do, the pattern of computing some permutation of a set of arrays and then viewing other arrays through that permutation. You can do it in numpy, of course, but it's not nearly as pleasant.
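In Julia terms, the pattern looks something like this (made-up data; sortperm plays the role of APL's grade-up ⍋):

names = ["carol", "alice", "bob"]
ages  = [35, 42, 28]
p = sortperm(ages)  # the permutation that sorts ages, i.e. a grade
ages[p]             # [28, 35, 42]
names[p]            # ["bob", "carol", "alice"], viewed through the same permutation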
For starters, it can be argued that APL stays closer to mathematical notation. For example, to compute the sum of j*2^-j for j = 1..n, the Julia version that mirrors APL's +/ is
reduce(+, (j -> j*2^-j).(Float64.(1:n)))
which can also be written as
sum((j -> j*2^-j).(Float64.(1:n)))
though that kinda drifts from the APL approach of combining primitives. Compare either with
+/j×2*-j←⍳n
The APL statement is also more succinct, and that is even more apparent when you compare an APL one-liner with the multi-line code it replaces in the other two languages. I think this is its true strength.
I find it weird that this is called "modern" Fortran when it came with Fortran 90. Then I remember that even in 2010, academics (perhaps one of the biggest producers of Fortran today) still seemed to be hung up on F77.
You are right that the Fortran community took a long time to adopt standards newer than F77. Gfortran handles Fortran 90 and 95 well now, and I think F2003 and F2008 as well (although I don't use many features from those standards and cannot say for sure from personal experience). I don't remember how mature gfortran was in 2010; maybe g95 was more mature then.
Is there some sort of special mechanism for partial application in APL that isn’t just defining a partial operator, like a macro or function? It’s not too hard to do yourself, and I’m sure someone’s already written it.
I wouldn’t be surprised if there was a performance hit relative to a language with partial application as a design feature.
Conor: ...it’s not C, it’s like a macro variant of C, where it’s a CDSL, where 80% of your “library” is macros. I did a search, there’s 10,000 macros in the source code, and those macros are used like functions ... some of the exact same macros that exist in the J code base are exactly what competitive programmers use.
He is, of course, talking about things like this (from the "J incunabulum"):
#define DO(n,x) {I i=0,_n=(n);for(;i<_n;++i){x;}}
Which is actually pretty concise and readable, once you know the incunabulum typedefs I as long. Sometimes I think it would be nice if enough people could agree on a header library of "concise C" macros that they could become part of the idiom. Or at least a minority dialect instead of an idiolect.
This still makes you pass the loop body as a macro argument, but the benefit comes from the fact that most of the time a loop just goes from 0 to a given number.
If just the for loop is in the macro you can put the braces in yourself.
If the braces are up to you, nested loops are cleaner.
I got my first job as a software engineer because a recruiter found me through my participation in ICPC contests. I don't know how common that is today, but it happened ~10 years ago.
I have heard a similar story or two, and they all involved Google.
I'm not sure when these leetcode-style interviews became pervasive, but it makes sense that recruiters would quickly figure out where to find people who would excel at that interview process.
Host here. I came into this knowing more about competitive coding than APL, but I found both interesting. The way these ICPC teams compete and train as a group is fascinating to me, and this connection between APL and GPU programming was really something cool that I was not aware of.
Hi Adam. Thanks for doing the podcast. Listening to it is easily one of the most enjoyable parts of my life. I have also learnt so much about coding in general that I am pretty sure I would never have learnt otherwise.
I'm working on a language called BQN which I think does just this. The design is done, and we are mainly working on performance and testing for the C implementation (it's also missing some syntax features, but they are all things that don't exist in APL and should be added soon).