Standards and Information Technology's Legacy Problem

The Itanium (IA-64) processor was largely abandoned because consumers wanted a 64-bit processor that could run their existing 32-bit programs. Our modern 64-bit personal computer processors are therefore still backwards-compatible with binaries compiled for 16-bit processors, which, in the Intel family, were superseded by the 32-bit 80386 in 1985. Die area and design effort are still set aside to maintain this backwards compatibility, and the x86-64 instruction set carries extra complexity as a result.

The vast majority of non-Apple personal computers boot using the BIOS. Originally designed for the IBM PC era of the early 1980s, BIOS code runs in 16-bit mode, is typically written in assembly, and can't easily interface with pointing devices. The BIOS boot process relies on assumptions about disk and bootloader size that were reasonable in the early 1980s but are laughable today, and as a result modern bootloaders resort to tricks (such as loading themselves in multiple stages) to bring up an operating system. For example, disks using the BIOS-era MBR partitioning scheme can't be larger than 2 TiB (2^32 sectors of 512 bytes each), and can't have more than four primary partitions. While there have been some extensions, the essence of the BIOS is the same today as it was three decades ago.
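To make the arithmetic behind those limits concrete, here is a minimal C sketch of a classic MBR partition entry. The field names are my own, but the sizes and offsets follow the traditional on-disk format: the 32-bit sector fields are what cap disks at 2 TiB, and the table's four fixed slots are the four-primary-partition limit.

```c
#include <stdint.h>
#include <stdio.h>

/* One entry in a classic MBR partition table (field names are mine;
 * sizes and offsets follow the traditional 16-byte on-disk layout). */
#pragma pack(push, 1)
typedef struct {
    uint8_t  boot_flag;     /* 0x80 = bootable, 0x00 = not */
    uint8_t  chs_first[3];  /* legacy cylinder/head/sector of first sector */
    uint8_t  type;          /* partition type code */
    uint8_t  chs_last[3];   /* legacy CHS of last sector */
    uint32_t lba_first;     /* first sector, as a 32-bit LBA */
    uint32_t sector_count;  /* number of sectors, also only 32 bits */
} mbr_partition_entry;
#pragma pack(pop)

int main(void) {
    /* The MBR reserves room for exactly four of these 16-byte entries,
     * hence the four-primary-partition limit. */
    printf("entry size: %zu bytes, table size: %zu bytes\n",
           sizeof(mbr_partition_entry), 4 * sizeof(mbr_partition_entry));

    /* With 32-bit sector numbers and 512-byte sectors, the largest
     * addressable disk is 2^32 * 512 bytes = 2 TiB. */
    uint64_t max_bytes = ((uint64_t)UINT32_MAX + 1) * 512;
    printf("max addressable: %llu bytes (= %llu TiB)\n",
           (unsigned long long)max_bytes,
           (unsigned long long)(max_bytes >> 40));
    return 0;
}
```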

Standard web URLs begin with http://. Tim Berners-Lee, the inventor of the World Wide Web, has said that the two slashes were completely unnecessary, but we will be stuck with them forever: two extra characters consuming small amounts of bandwidth, storage space, and typing time for no gain.

The standard C library contains several functions, such as gets(), that are known security risks and should never be used. Nevertheless, because older code uses them, the C standard still mandates that they be provided, and so every Unix-like system still has them.
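To make the danger concrete, here is a minimal sketch contrasting gets() with its standard safe replacement, fgets() (the 16-byte buffer is an arbitrary choice for illustration):

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    char buf[16];

    /* DANGEROUS: gets() has no way to know how big buf is, so any
     * input line longer than 15 characters silently overflows the
     * stack. There is no safe way to call it at all:
     *
     *     gets(buf);
     */

    /* Safe replacement: fgets() takes the buffer size and stops
     * reading before it overflows. */
    if (fgets(buf, sizeof buf, stdin) != NULL) {
        buf[strcspn(buf, "\n")] = '\0';  /* strip the trailing newline */
        printf("read: %s\n", buf);
    }
    return 0;
}
```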

I could go on and on with examples of how the decisions and limitations of the past still affect us in the computing world today, but I think I've made my point. Almost every new standard or library is plagued by the question: will we be backwards compatible? New hardware is stuck with old interfaces, new programming paradigms are stuck with old syntax, and so on. Every time I think about this issue, three questions come to mind:

  1. If we were to start fresh today, destroying every old computer and piece of code and removing the need to be compatible with anything, or any attitude, that existed before May 13th, 2011, would we be able to design our standards, interfaces, etc. in such a way as to allow for gradual improvement without permanently saddling future improvements with current shortcomings?
  2. If the answer to 1 is "yes", how can we get to a similar point without a single, clean break from the past?
  3. Do other fields suffer from similar problems? Do architects have to deal with holdovers from how castles were built? Do industrial farmers have to deal with holdovers from the hand plow?

Thoughts?