36-bit was popular for scientific computing because 35 bits give you a sign bit and 10 decimal digits (yes, that's weird, but that's the argument I read everywhere, including at https://en.m.wikipedia.org/wiki/36-bit). 35 was likely skipped because its only divisors are 5 and 7, and limiting instructions to 7 bits was felt to be too restricting even then. For some architectures, the DoD had a say in this, too; see https://en.m.wikipedia.org/wiki/Unisys_2200_Series_system_ar...
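A quick check of that arithmetic (a minimal sketch in Python, nothing architecture-specific):

  # 34 magnitude bits already cover ten decimal digits,
  # so sign + 10 digits fits in 35 bits, and 36 was the next practical size.
  assert 2**33 < 9_999_999_999 < 2**34
  # 36 = 2*2*3*3 divides evenly into 6-, 9-, 12-, and 18-bit fields;
  # 35 = 5*7 offers only awkward 5- or 7-bit subdivisions.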

36-bit words got us the 6-bit character (10 digits, 26 uppercase letters, and punctuation), six characters to a word. Because of that, some OSes had six-character file names.
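For illustration, packing a file name into one word might look like this (a sketch in Python, assuming a DEC-style SIXBIT encoding, i.e. ASCII 32-95 minus 32):

  def pack_sixbit(name):
      # Pack exactly six characters into one 36-bit word, 6 bits apiece.
      assert len(name) == 6
      word = 0
      for ch in name.upper():
          code = ord(ch) - 32        # SIXBIT: ASCII 32..95 -> 0..63
          assert 0 <= code < 64      # digits, uppercase, punctuation only
          word = (word << 6) | code
      return word                    # always fits in 36 bits

  print(oct(pack_sixbit("FILNAM")))  # one six-character file name per word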

If you want both upper- and lowercase, you need more than 6 bits. 9 is the smallest divisor of 36 larger than 6, so nine-bit characters made sense.

On such systems, file names could still use 6-bit characters, while applications used 9-bit ones. Also, some instructions could work on words, half words, quarter words, or sixth words.
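A sketch of that subdivision (Python again; the helper is mine, not any real instruction set's):

  WORD = 36

  def split(word, n):
      # Split a 36-bit word into n equal big-endian fields (n must divide 36).
      assert WORD % n == 0
      width = WORD // n
      mask = (1 << width) - 1
      return [(word >> (WORD - width * (i + 1))) & mask for i in range(n)]

  w = 0o123456701234          # 36 bits = 12 octal digits
  print(split(w, 2))          # half words:    two 18-bit fields
  print(split(w, 4))          # quarter words: four 9-bit characters
  print(split(w, 6))          # sixth words:   six 6-bit characters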