
My research is in debuggers, so the refs are debuggery.

The Diagnosis of Mistakes in Programmes on the EDSAC is, in my opinion, the classic software engineering paper. It was written around 1951. Pretty much every major debugging technique was invented then, except for the proof checking that started to come out in the '80s. http://sal.cs.bris.ac.uk/~dave/gill.pdf

The history of early PDP debuggers is fascinating. Look up FLIT and DDT for the TX-0/1 and the PDP-1.

Also take a look at Balzer's EXDAMS paper from AFIPS '69.

Evans and Darley had an interesting survey as well in AFIPS '66.

Production of Large Computer Programs by Benington describes a really nice simulator (early 50s).

Backus built an interpreter and described it in "The IBM 701 Speedcoding System". http://www.cs.virginia.edu/~cs415/reading/backus-speedcoding...

Bitsavers has a number of relevant old documents: http://www.bitsavers.org/pdf/mit/tx-0/memos/_MemoIndex.txt

A few gems I've found, which the references above should back up: Lisp had amazing debugging capabilities by the mid-'60s, fully comparable to Visual Studio 2005, in my opinion. In the same timeframe, Fortran had an interpreter (QUIKTRAN, IIRC) that provided analysis and code-coverage statistics.

I have had enormous trouble digging up information about 1950s corporate computing; most of what I've been able to find relates to MIT's work. As a bit of personal commentary, the lack of decent corporate records from this period has pushed me hard towards open source. If we don't open-source our work, how will later generations build on it?

If you want my bibliography, send me an email and I'll give you a copy.
