Reminds me of an intro to programming book in German that I once had, covering different styles from imperative to OO. Not even that old — late 90s, I'd guess. But the author apparently had to use C, even though he'd have preferred Pascal. So the very first thing the book does is introduce a boatload of preprocessor macros to make C look like Pascal. A weird way to start out if you had any prior C experience…
Then again, for short academic programs it probably won't cause that much confusion; having running code is more of a side effect there, so it's a bit like executable pseudocode.
It does make me wonder what's the biggest program out there that's been written with heavy preprocessor abuse. Then again, not sure whether I'd really wanna know…
Octal was far more common than hex back then, due to machines with a word size that was a multiple of 3 bits; hence Unix permissions, which originated on the 18-bit PDP-7, inherited that convention.
"In 1969, Ken Thompson wrote the first UNIX system in assembly language on a PDP-7, then named Unics as a pun on Multics"
CDC "super" computers also had 60-bit words.
Moreover, the original ASCII standard had 7-bit characters, an improvement over the even fewer bits used for characters before that (6 bits == 2 octal digits). 8-bit bytes weren't something you could expect except when buying IBM (which had EBCDIC, not ASCII).
Separate issues, I think. Octal was used because that's what most everyone used at the time (a holdover from machines like the IBM 70xx and GE600 series which the early Unix folks were used to). Even in V6 & V7 and later, you see much more octal than hex. Unix perms just happened to be 3 groups of 3 bits. Said another way, we'd likely have used octal to represent them even if perms were 2 bits or 4 bits each.
This is an interesting sort of forerunner to another topic on yesterday's HN [1], Cello [2], which appears to be essentially a more thorough and better-grounded version of Bournegol.