Bournegol – Algol-like dialect of C used to write the original Bourne shell (schmorp.de)
50 points by carljoseph on Dec 27, 2014 | 9 comments



Reminds me of a German intro-to-programming book I once had, covering different styles from imperative to OO. Not even that old, late 90s I'd guess. But the author apparently had to use C, even though he would clearly have preferred Pascal. So the very first thing the book does is introduce a boatload of preprocessor macros to make C look like Pascal. A weird way to start out if you had any prior C experience…
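Something in this spirit (my own sketch of a Pascal-flavoured macro layer, not the actual macros from that book):

    #include <stdio.h>

    /* hypothetical Pascal-style veneer over C, much like Bournegol does */
    #define BEGIN   {
    #define END     }
    #define WHILE   while (
    #define DO      )

    int main(void)
    BEGIN
        int i = 0;
        WHILE i < 3 DO
        BEGIN
            printf("%d\n", i);
            i = i + 1;
        END
        return 0;
    END

It compiles and runs like any other C, it just doesn't look like C any more, which is exactly the point and exactly the problem.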

Then again, for short academic programs it probably doesn't cause that much confusion; having running code is more of a side effect there, so it ends up being a bit like executable pseudocode.

It does make me wonder what's the biggest program out there that's been written with heavy preprocessor abuse. Then again, not sure whether I'd really wanna know…


I thought it was amusing until I read that:

    #define LOBYTE      0377
    #define STRIP       0177
    #define QUOTE       0200
I can deal with preprocessor abuse, but octal crosses the line...
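For anyone who doesn't think in octal, those are just byte masks; here is the same thing with hex annotations (mine, not from the source):

    #include <stdio.h>

    #define LOBYTE  0377    /* low 8 bits of a word           */
    #define STRIP   0177    /* mask down to 7-bit ASCII       */
    #define QUOTE   0200    /* the 8th bit, presumably a flag */

    int main(void)
    {
        printf("LOBYTE=0x%X STRIP=0x%X QUOTE=0x%X\n", LOBYTE, STRIP, QUOTE);
        /* prints: LOBYTE=0xFF STRIP=0x7F QUOTE=0x80 */
        return 0;
    }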

Also, in case anybody gets the bad idea of taking inspiration from that code, don't do this:

    #define MAX(a,b)	((a)>(b)?(a):(b))
It evaluates one of its arguments twice, which goes wrong as soon as an argument has side effects.
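For example (a minimal illustration, not code from the shell source):

    #include <stdio.h>

    #define MAX(a,b)    ((a)>(b)?(a):(b))

    int main(void)
    {
        int i = 5, j = 3;
        /* expands to ((i++)>(j++)?(i++):(j++)), so i is incremented twice */
        int m = MAX(i++, j++);
        printf("m=%d i=%d j=%d\n", m, i, j);   /* m=6 i=7 j=4 */
        return 0;
    }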

Although, on second thought, don't do anything like that code anyway.


Octal was far more common than hex back then, due to machines with word sizes that were multiples of 3 bits; hence Unix permissions, coming from the 18-bit PDP-7, also inherited that convention.


Interesting, do you have a source for that?

I thought permissions were given in octal just because they come in groups of 3 bits (owner/group/other), so each group aligns with exactly one octal digit.
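E.g. (my own illustration) 0755 splits cleanly into one octal digit per group:

    #include <stdio.h>

    int main(void)
    {
        unsigned mode = 0755;   /* rwxr-xr-x */
        printf("owner=%o group=%o other=%o\n",
               (mode >> 6) & 07, (mode >> 3) & 07, mode & 07);
        /* prints: owner=7 group=5 other=5 */
        return 0;
    }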


http://en.wikipedia.org/wiki/Programmed_Data_Processor

   PDP-1  18 bit
   PDP-2  24 bit
   PDP-3  36 bit
   PDP-4  18 bit
   PDP-5  12 bit
   PDP-6  36 bit
   PDP-7  18 bit
http://en.wikipedia.org/wiki/PDP-7

"In 1969, Ken Thompson wrote the first UNIX system in assembly language on a PDP-7, then named Unics as a pun on Multics"

CDC "super" computers also had 60-bit words.

Moreover, the original ASCII standard specified 7-bit characters, an improvement over the even fewer bits used for characters before that (6 bits == 2 octal digits). 8-bit bytes weren't something you could expect unless you were buying IBM (which had EBCDIC, not ASCII).


Separate issues, I think. Octal was used because that's what most everyone used at the time (a holdover from machines like the IBM 70xx and GE600 series which the early Unix folks were used to). Even in V6 & V7 and later, you see much more octal than hex. Unix perms just happened to be 3 groups of 3 bits. Said another way, we'd likely have used octal to represent them even if perms were 2 bits or 4 bits each.


This is an interesting sort of forerunner to another topic in yesterday's HN [1], Cello [2], which appears to be essentially a more thorough and better-grounded version of Bournegol.

[1] https://news.ycombinator.com/item?id=8799070

[2] http://libcello.org/


I wonder what the rationale was for

  #define TRUE (-1)
  #define FALSE 0
instead of

  #define TRUE (1==1)
  #define FALSE !TRUE
edit: re-formatting


Replying to myself: −1 is 1111 1111 as an 8-bit two's-complement integer, i.e. all bits set.
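Which means bitwise NOT maps TRUE and FALSE onto each other, something a TRUE of 1 wouldn't give you; whether that was the actual rationale I don't know. Quick check (my own sketch):

    #include <stdio.h>

    #define TRUE  (-1)
    #define FALSE 0

    int main(void)
    {
        /* on a two's-complement machine, -1 is all bits set */
        printf("~TRUE  == %d\n", ~TRUE);    /* 0, i.e. FALSE */
        printf("~FALSE == %d\n", ~FALSE);   /* -1, i.e. TRUE */
        printf("~1     == %d\n", ~1);       /* -2, still truthy but not FALSE */
        return 0;
    }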



