With all due respect to Mr. Iverson, whom I had the pleasure of meeting at an international APL conference many moons ago, I think J went in the wrong direction.
Part of the power of APL is in the symbolic representation of concepts. The symbols are important. They form a language in more than one sense. Replacing them with ASCII/Unicode equivalents rips the guts out of the language.
Here's a thought experiment: take mathematical expressions and replace every symbol with a two- or three-character ASCII (or whatever) equivalent. The integral sign might become "integ"; the root sign, "root"; the first, second and third derivative dots, "der1", "der2" and "der3"; and so on. What have you accomplished? Well, at the very least you've ruined the superb expressiveness that mathematics gets from its agreed-upon special symbols.
One could do the same with music notation, and the result would be exactly the same.
In general terms, I believe that we don't need to go in that direction. We need to search for and find the programming equivalent of music and math notation.
If you've ever looked at some VHDL code, you've seen an example of just how ugly this can get. Whenever I do FPGA work I use Verilog, which feels very much like C.
Defining a register in Verilog looks something like this:
reg [7:0] my_register;
In VHDL:
signal my_register : std_logic_vector(7 downto 0);
It's been a while since I touched VHDL, so this example could be off. I don't remember if you have to add IN, OUT or INOUT to the declaration.
The point, though, is that one (Verilog) is compact and to the point, whereas the other (VHDL) beats the crap out of you with unnecessary verbosity. My definition of "unnecessary": if the exact same result can be had without the extra stuff, there is absolutely no reason for the extra stuff to exist. The entire VHDL language is like that; this is just one example. If I remember correctly, the one thing that was nice about VHDL way back when was its support for parametric generation of code. Later Verilog revisions (the generate construct, from Verilog-2001 onward) added the same capability without making you feel you were typing a novel.
All of this long post is to say that notation is important. It is a tool for thought as much as a source of expressive power. J, in my humble opinion, went the wrong way.
To me your comment makes a lot of sense. But you have a visceral feel (that comes from practice) for how useful such notation can be. For people who don't, the cultural stretch is probably just too far to consider.
No doubt even a visionary like Iverson found it difficult to withstand decades of almost-universal criticism about the APL symbol set.
The APL guy in the video remarks wryly at one point that Unicode has made their lives much easier. I'll bet it has!
The analogy to music notation is suggestive. Imagine if people insisted on writing music textually with names like "High C Sharp" or whatever - on the grounds that this was more "readable"!
Replacing them with ASCII/Unicode equivalents rips the guts out of the language. … What have you accomplished? Well, at the very least you've ruined the superb expressiveness that mathematics gets from its agreed-upon special symbols.
I really don't think the small number of keystrokes is what makes this sort of language expressive. Cryptic notation is a cost, not a bonus. Moving from, say, J to C, having to write
log(…)
instead of
^. …
(or ⍟) does not make me feel like I've lost anything. The things that do make me feel constrained are having to write explicit loops to apply that operation to an array of numbers instead of just one, and not having a convenient way to use that operation in function composition; the sketch below shows the loop half of it.
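To make that concrete, here is a minimal C sketch (the array name and values are invented for illustration): the element-wise natural log that J writes as one application of ^. has to be unrolled into an explicit loop.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* In J this is a single expression:  ^. arr
           In C, the element-wise loop must be written by hand,
           and composing log with further array operations means
           still more loops or helper functions. */
        double arr[] = {1.0, 2.0, 4.0, 8.0};
        size_t n = sizeof arr / sizeof arr[0];
        for (size_t i = 0; i < n; i++)
            printf("%g\n", log(arr[i]));  /* natural log, one element at a time */
        return 0;
    }

(Compile with the math library linked, e.g. cc example.c -lm.)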
Indeed, the brevity that comes from a glyph-oriented syntax is orthogonal to that which comes from having array-oriented semantics (e.g., making loops implicit). You could have a word-based syntax, and still have very concise code.
q uses words for its single-argument operators: "til 10" rather than "!10". It certainly makes a better first impression, but I'm not convinced it matters after a few days. My experience as an enthusiastic k novice is that I recognize idioms in symbols (e.g. "{x@<x}", which sorts a list by indexing it with its own grade-up) more directly than in words ("fun x at grade-up x end"). Perhaps it's like reading poetry in Kanji, rather than letters?