
I've just looked at this snippet, which the blog used back in 2017 to illustrate that Algol 68 was "utter madness": https://craftofcoding.wordpress.com/2017/03/06/a-brief-look-...

   COMMENT
   Algol 68 program to calculate the Sieve of Eratosthenes
   for some upper limit N
   COMMENT

   PROC eratosthenes = (INT n) []INT:
   (
       [n]INT sieve;

       FOR i TO UPB sieve DO
           sieve[i] := i
       OD;
       INT k = ENTIER sqrt(n);
       sieve[1] := 0;
       FOR i FROM 2 TO k DO
           IF sieve[i] NE 0 THEN
               FOR j FROM i*i BY i TO n DO
                   sieve[j] := 0
               OD
           FI
       OD;
       sieve
   );

   INT n;
   print("Upper limit to calculate sieve? ");
   read(n);
   print((eratosthenes(n), newline))

A few things: This code is very readable. The blogger didn't understand it because he didn't understand what the "Sieve of Eratosthenes" did: It's a means of calculating lists of prime numbers. He incorrectly stated that it was used for calculating Fibonacci numbers.

Syntax-wise, it looks like some mix of Go, C and Python. Nothing at all unusual. It would be pretty easy to learn to program in this, and comfortable too.

[Edit: The blogger later demonstrated a good understanding of the Sieve of Eratosthenes, but still expressed criticism of Algol68: https://craftofcoding.wordpress.com/2021/04/06/algorithm-35-...]




> This code is very readable

It really is.

> Syntax-wise, it looks like some mix of Go, C and Python. Nothing at all unusual. It would be pretty easy to learn to program in this, and comfortable too.

Or maybe a mix of Pascal, PL/I, and /bin/sh.


/bin/sh, i.e. the Bourne shell, was written by a programmer who was experienced in ALGOL 68, having implemented some significant projects in it.

In designing the language interpreted by the shell, he took several features directly from ALGOL 68.

However, he also made one improvement: he renamed the bracket pair keywords "do" and "od" to "do" and "done", responding to the criticism from many programmers of ALGOL 68 that "od" sounds odd.
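The inherited paired-keyword syntax is still visible in any Bourne-style shell today; a trivial sketch:

```shell
# ALGOL 68's "do ... od" became "do ... done":
for word in alpha beta gamma
do
    echo "$word"
done

# ...while "fi" still closes conditionals exactly as in ALGOL 68:
if [ -d / ]
then
    echo "root exists"
fi
```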


In particular, one of the projects that Steve Bourne had previously done is the Algol68C compiler at Cambridge. See: https://www.softwarepreservation.org/projects/ALGOL/algol68i...


Steve Bourne wanted to use od to end loops in his shell but was prevented because od was already the octal dump program.


And as other HN commenters have noted recently, Bourne used #define to make the shell source code look like Algol:

https://research.swtch.com/shmacro


I agree, very readable.

One interesting thing: the sieve array does not have static bounds; its size is an argument. Presumably that means it's stack-allocated dynamically, a la "alloca" (I doubt it's heap-allocated, given there's no freeing of it). C didn't get that feature until C99, and adding it has almost universally been seen as a mistake.

I imagine that’s an example of one of the many, many things that makes Algol-68 a pain to implement. Does make it easier for the developer though.


I've looked up alloca() and it's considered about as safe to use as recursion, and potentially safer than stack-allocating one big fixed-size array. Its bad reputation might be because of poor implementations in certain compilers.


What does ENTIER do? I can read the code, but I have no idea what that means.


From context, it looks like casting to an INT


Makes sense. No idea what the derivation of “ENTIER” is, though.


"entier" means "integer" in French.

In the past, it was customary in English mathematics texts, especially in the UK, to use the French word for the integer part of a number, because the usage of this function had been borrowed from French mathematicians.

ALGOL 60 took many of its notations from the standard mathematical notation of that time, including operators and function names, so it resembled a mathematical text much more than American programming languages like Fortran did. In Fortran's case, the main criterion for designing the syntax was the restricted character set of the IBM printers, which lacked most mathematical symbols, leading to notations like "*", "/" and "**" for multiplication, division and exponentiation, where ALGOL would use "×", "÷" and "↑".


https://www.softwarepreservation.org/projects/ALGOL/book/pam...

The operator ENTIER (French for “whole”) takes a REAL operand and likewise yields an INT result, but the yield is the largest integer equal to or less than the operand. Thus ENTIER 2.2 yields 2, ENTIER -2.2 yields -3


Isn't Algol the crazy language using special characters all over the place as operators? I could see that being undesirable, but I don't see it in that snippet.

edit: I'm thinking of APL. It looks like it's the same kind of functionality as what the Matlab, R or numpy cohort provides.


Yes, as in your own correction, you were thinking about APL.

ALGOL 68 was certainly more readable than C or Pascal, even if it was somewhat more verbose.

A very good feature of ALGOL 68 was that all the syntactic structures were enclosed in brackets, so it avoided the ambiguities and editing errors that are possible in C or ALGOL 60 (the dangling else, for instance).

Moreover, each syntactic structure had its own pair of brackets, so you did not have the reading difficulties caused by all bracket pairs being the same, like "()" in LISP, "begin" and "end" in Pascal or "{}" in C.

Especially useful is having different bracket pairs for iterations ("do" and "od" in ALGOL 68, renamed as "do" and "done" in the Bourne shell, which sounds better) in comparison with the bracket pairs used for conditional statements ("if" and "fi" in ALGOL 68).

In C and derived languages, even indentation is not enough to understand the code in many cases, especially when lines are restricted to 80 characters and a long loop body cannot fit on a single page.

In ALGOL 68, you have distinct closing brackets for each syntactic structure, so it is easy to recognize their meaning.

Using keywords for brackets is more verbose than in C, but much less verbose than using terminating comments. Moreover, now we can use Unicode for program texts, so instead of keywords one could use different pairs of Unicode brackets that are graphically distinct. For instance one can use angular brackets for conditional statements and S-shaped bag delimiter brackets for iterations.


ALGOL is the main forerunner of C. Anything vaguely C-like is also ALGOL-like.


As I understand it, B[1] was the immediate predecessor of C.

Pascal (and its descendants) and Ada (and VHDL) seem closer syntactically to ALGOL 60 than C is (begin/end as block delimiters, no parentheses needed for control structures, := for assignment, etc.). The example code at [2] (using bold keywords) should be understandable to most HN readers 64 years later.

[1] https://en.wikipedia.org/wiki/B_(programming_language)

[2] https://en.wikipedia.org/wiki/ALGOL_60


B came from BCPL which came from CPL which came from ALGOL 60. The main thing about ALGOL was structured programming as opposed to goto statements. Compared to that begin/end vs braces is a very minor issue.


For a look at how BCPL developed (by being easily ported to new machines), see: https://www.softwarepreservation.org/projects/BCPL


I always thought BCPL would have been a great language for 80s era micros. Fortunately, we had Turbo Pascal.


But C was better because it incorporated byte addressing (BCPL was based on word addresses).


Except the contemporary C compilers were uniformly terrible.


Indeed. And they're all algol descendants as you note.

But there is certainly a difference between c-like syntax (java, javascript...) vs. algol-like syntax (pascal, ada, ...)


And BCPL means Bootstrapping CPL; it was never intended for anything beyond bootstrapping the CPL compiler, which never came to be, as the project folded.


No, BCPL was Basic CPL. BCPL was created in 1967 after the CPL project ground to a halt in 1966, based on the subset of CPL used in the CPL compiler.


Apparently everyone is wrong on the Internet.

> BCPL has been rumored to have originally stood for "Bootstrap Cambridge Programming Language", but CPL was never created since development stopped at BCPL, and the acronym was later reinterpreted for the BCPL book.

https://en.m.wikipedia.org/wiki/BCPL


I have seen no evidence of this in papers by Martin Richards, including the earliest BCPL manual or the contemporary papers on CPL by Strachey et al.

The descriptions of how the CPL compiler worked (e.g. Strachey's paper on GPM, Richards' more recent retrospectives) all talk about writing the compiler in CPL and translating it by hand to macro assembly.

I have not yet managed to look at Richards' PhD thesis, which contains the first draft description of BCPL, which he sketched after working on the CPL compiler and shortly before moving to MIT, where he first implemented BCPL.



