ABSTRACT

The early computer designers were well aware that their designs compromised mathematical correctness, but the cost of hardware made correct numerics prohibitively expensive. When the first electronic, automatic computers were built, starting around 1940, a logic gate cost tens of dollars and a single bit of storage cost over a dollar, in 2014 dollars. As long as the programmer was someone as bright as John von Neumann, the errors were manageable; the early computers were used by a “priesthood,” and no one envisioned that billions of people would someday rely on automatic computers, with over 99.99% of those users completely clueless about the hazards of high-speed numerical errors. Nor was it envisioned that an hour of human talent would eventually cost hundreds of times more than an hour of time on a very powerful computer; at the time, it seemed reasonable to ask humans to bear the entire burden of creating correct methods for performing calculations.