Over at Ars, Richard Jensen has a great article on how the C programming language came to be. I love dives like this into the particulars and contingencies involved in the development of things we now think of as having more or less “always been there”. It’s easier to do with computer technology than with many other things, too, because a lot of the people who were directly involved are still alive, and can talk about why they made one decision or another.
It’s a good reminder, too, that very little of how we generally think “the world is” was inevitable—or is immutable. On the other hand, of course, the longer particular arrangements are accepted as the default without examination of their roots, the more inertia they have, and the more effort it takes to make change.
Just think: we might have ended up with 10-, 12-, or even 18-bit “bytes” as the basis for our computing technology, or even ternary logic circuitry with a three-state “trit” as the smallest unit (after all, at the circuit level, a “0” just means the line is sitting near 0 volts and a “1” means it’s at some higher level, classically 5 volts, but there’s no rule that says additional signal levels couldn’t carry meaning too).
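And C itself still carries a trace of that contingency: the standard doesn’t nail the byte down to 8 bits, only to at least 8, and it reports the actual width on your machine as CHAR_BIT in <limits.h>. Here’s a minimal sketch you can run to check your own implementation:

```c
#include <limits.h> /* CHAR_BIT: number of bits in a byte on this implementation */
#include <stdio.h>

int main(void)
{
    /* The C standard guarantees CHAR_BIT >= 8, but not that it equals 8;
       machines with 9-bit bytes or 36-bit words were perfectly legal targets. */
    printf("bits per byte here: %d\n", CHAR_BIT);
    printf("bits in an int:     %zu\n", sizeof(int) * CHAR_BIT);
    return 0;
}
```

On nearly everything you’ll touch today it prints 8, of course; the point is that the language was written so it didn’t have to.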