7 bits was enough for ASCII, including lower case, so why not 7?
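A quick sanity check (Python, just as an illustration) that the full printable set, lower case included, fits in 7 bits:

    import string
    # '~' (126) is the highest printable code point; DEL is 127.
    assert max(ord(c) for c in string.printable) < 2**7
    print(ord('a'), ord('z'), ord('~'))   # 97 122 126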



I'd guess because it's not an even number. I don't know why an odd number of bits was considered infeasible, but there hasn't been a single computer architecture with an odd number of bits as its word length: https://en.wikipedia.org/wiki/Word_(computer_architecture)


??? The table in that article contains plenty of architectures with an odd number of bits per word, for example the Apollo Guidance Computer (15). Odd counts often resulted from sign or parity bits. And I'm shocked to see decimal digits were sometimes encoded as 5 to 7 bits per digit (bi-quinary coded decimal) rather than 4 (binary coded decimal), e.g. in the IBM 650 (10 digits and a sign bit), which used 71 bits per word - a prime number! Of course "bit" is not the right term here, as software can't access them individually. But there are 71 physical switches exposed to the user for input.
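Rough sketch of the two encodings (Python, my own illustration; the actual 650 bit ordering may differ): plain BCD spends 4 bits on a digit, while bi-quinary spends 2 "bi" bits (is the digit 0-4 or 5-9?) plus 5 one-hot "quinary" bits (digit mod 5), so exactly two of the 7 bits are ever on, which made error checking easy:

    # Plain BCD: 4 bits per digit.
    def bcd(d):
        return format(d, '04b')

    # Bi-quinary: 2 "bi" bits + 5 one-hot "quinary" bits = 7 bits,
    # always exactly two bits set (bit order here is arbitrary).
    def biquinary(d):
        bi  = '01' if d < 5 else '10'
        qui = ''.join('1' if i == d % 5 else '0' for i in range(4, -1, -1))
        return bi + qui

    for d in range(10):
        print(d, bcd(d), biquinary(d))   # e.g. 9 -> 1001 vs 1010000

Ten digits at 7 bits each plus the sign gives the 71 mentioned above.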


That might give a clue: an 8-bit word can fit two BCD digits.
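For instance (illustrative Python, not any particular machine's packed-BCD format):

    # Packed BCD: two decimal digits in one 8-bit byte,
    # high digit in the upper nibble.
    def pack_bcd(hi, lo):
        return (hi << 4) | lo

    def unpack_bcd(byte):
        return byte >> 4, byte & 0x0F

    b = pack_bcd(4, 2)            # 0x42
    assert unpack_bcd(b) == (4, 2)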



