Charles Babbage is widely credited with coming up with the idea of programmable computers. His idea was sound, but the supporting technologies needed to make programmable computers a practical reality were not yet available. Now fast-forward to the mid-20th century. In the 1940s and early 1950s, the work of von Neumann, Eckert, Mauchly, and others resulted in programmable computers that used vacuum tubes as the basic building block for making binary decisions (current flow allowed = 1, current flow blocked = 0). Vacuum tube computers were huge, expensive devices that filled rooms, or even floors, of a building. In addition, they generated massive amounts of heat.
In the 1950s, the transistor came into widespread use as the basic building block for binary switching. It was orders of magnitude smaller than the vacuum tubes it replaced and consumed far less power. As a result, computer designers could pack much more processing power into the space consumed by the old vacuum tube models. In addition, instead of filling up rooms, computers now shared rooms with other computers.
In the mid-1960s, integrated circuits began to appear in products and eventually led to the “computer on a chip,” containing millions of transistors and again delivering order-of-magnitude gains in both processing power per unit area and power consumption.
So do we really have new ideas? Or do advances in technology act as enablers for ideas that may have been floating around for centuries?
Think about this while you are sitting in front of your von Neumann/Eckert/Mauchly/Babbage machine.