but they didn't represent lowercase (!). (2⁶ would have allowed you to represent lowercase but you would have to sacrifice a whole lot to do so -- as alphanumerics alone would use 62 positions, leaving you with maybe one position for a space and one punctuation mark, and no newline...)
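The budget math in that parenthetical is easy to check. A minimal tally (not any specific historical code, just the raw counting):

```python
# Character budget for a 6-bit code, assuming you want both cases plus
# digits -- a quick tally, not a description of any real code page.
positions = 2 ** 6                      # 64 code points available
alphanumerics = 26 + 26 + 10            # upper + lower + digits = 62
remaining = positions - alphanumerics   # left over for space, punctuation, control
print(positions, alphanumerics, remaining)  # 64 62 2
```

Two leftover positions is exactly the "one space, one punctuation mark, no newline" squeeze described above.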
Apparently EBCDIC derives from IBM's 6-bit BCD codes
and is interesting because it uses 8 bits, successfully represents lowercase, and leaves a ton of (non-contiguous) positions unspecified.
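The non-contiguity is visible from any EBCDIC codec; here's a sketch using Python's built-in cp037 code page (one common EBCDIC variant). The alphabet sits in three separate runs, a leftover of the zone/digit structure inherited from IBM's punched-card BCD codes:

```python
# Where does the EBCDIC alphabet break? Encode A-Z with Python's
# cp037 codec and look for consecutive letters that are NOT adjacent
# in code space.
ebcdic = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'.encode('cp037')

gaps = [chr(ord('A') + i) for i in range(25)
        if ebcdic[i + 1] != ebcdic[i] + 1]
print(gaps)  # ['I', 'R'] -- the runs break after I and after R

# The three runs: A-I = 0xC1-0xC9, J-R = 0xD1-0xD9, S-Z = 0xE2-0xE9
print(hex(ebcdic[0]), hex(ebcdic[9]), hex(ebcdic[18]))  # 0xc1 0xd1 0xe2
```

So code that assumes `'A' <= c <= 'Z'` means "c is a letter" -- true in ASCII -- silently breaks on EBCDIC, which is one reason the gaps are more than a curiosity.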
Maybe our standards (no pun intended) are just shifting as we deal with more and more capable software, but I'd be inclined to say that seven bits "easily encode" standard English text, and six don't, on account of the lack of case distinction. (Although you could certainly choose to handle that with control characters, and I'm sure some 6-bit systems did so.)
Yeah, once you go to six bits (five is weird because it’s an odd number) you really start looking at seven - and seven is also odd (which is why seven + parity took off for a while). But now you have eight, and that should be good enough for anyone.
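The seven-plus-parity scheme is easy to sketch. This assumes even parity in the high bit, which was one common arrangement; odd parity and other bit positions were also used, so treat this as illustrative rather than the scheme:

```python
# Pack a 7-bit ASCII code into 8 bits with an even-parity high bit:
# set bit 7 so the total number of 1 bits in the byte is even.
def add_even_parity(ch: str) -> int:
    code = ord(ch)
    assert code < 128, "must fit in 7 bits"
    parity = bin(code).count('1') % 2   # 1 if the 7-bit code has odd weight
    return code | (parity << 7)

print(hex(add_even_parity('A')))  # 'A' = 0x41, two 1 bits -> parity 0 -> 0x41
print(hex(add_even_parity('C')))  # 'C' = 0x43, three 1 bits -> parity 1 -> 0xc3
```

A receiver recomputes the parity and flags any byte with odd weight as a transmission error -- a single-bit-error detector that happened to round seven bits up to a tidy eight.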
Interestingly enough, B talks about how the new computer can address a “char” and not just a whole word at a time.
Why was six bits chosen? The modern choice of eight bits, a power of 2, seems more natural to me.