The table in that article contains plenty of architectures with an odd number of bits per word, for example the Apollo Guidance Computer (15). The odd counts often resulted from sign or parity bits. And I'm shocked to see that decimal digits were sometimes encoded as 5 to 7 bits per digit (bi-quinary coded decimal) rather than 4 (binary-coded decimal), e.g. in the IBM 650 (10 digits and a sign bit), which used 71 bits per word - a prime number! Of course "bit" is not quite the right term here, as software can't address them individually. But there are 71 physical switches exposed to the user for input.
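For the curious, here's a minimal sketch of the 650-style bi-quinary scheme in Python (the bit ordering is my own illustrative choice, not the machine's actual layout): each digit gets a one-hot "bi" pair selecting the half (0 or 5) plus a one-hot "quinary" group adding 0-4, so 7 bits per digit, and 10 digits plus a sign bit gives the 71-bit word:

    def to_biquinary(digit):
        """Encode one decimal digit as 7 bits: bi pair + quinary group."""
        bi = digit // 5                      # 0 for digits 0-4, 1 for 5-9
        qui = digit % 5                      # offset within that half
        bi_bits = "10" if bi else "01"       # one-hot pair: which half
        qui_bits = "".join("1" if i == qui else "0" for i in range(5))
        return bi_bits + qui_bits            # always exactly two bits set

    # 10 digits plus one sign bit = the 650's 71 "bits" per word
    word = "".join(to_biquinary(int(d)) for d in "3141592653") + "1"
    print(len(word))   # 71

A nice side effect of the redundancy: every valid digit has exactly one bit set in each group, so the hardware could flag any single-bit error, which is part of why the 650 used the scheme.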