ABSTRACT

Logical inference and mathematical induction played a central role in the foundational philosophy of computing theory developed by Charles Babbage, Alan M. Turing (1936, 1950), and John von Neumann (1946, 1958, 1963, 1966). The fundamental objects of computation are abstracted as binary digits (bits). Any real-world data object is regarded as reducible to bits – the most fundamental and general form of representation of real-world objects and data. On the basis of this profound axiom, computation methods in general are perceived to be based on the basic arithmetical and logical operations on bits known as Boolean algebra. Any more complex operation must be reduced to these basic forms of operations in computing. In addition, computing resources are dramatically simplified to the form of finite or infinite sequential memory of bits and characters.
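
As a minimal illustrative sketch (not part of the original abstract), the following Python fragment shows the reduction claim in miniature: multi-bit integer addition built entirely from Boolean AND, OR, and XOR applied to individual bits. The function names and bit-list representation are hypothetical choices made for this example only.

    # Illustration: reducing a "complex" operation (integer addition)
    # to Boolean operations on bits, in the spirit of the abstract's claim.

    def full_adder(a, b, carry_in):
        """One-bit full adder expressed purely with AND, OR, XOR on bits."""
        s = a ^ b ^ carry_in                         # sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
        return s, carry_out

    def add_bits(x, y):
        """Ripple-carry addition of two equal-length bit lists (LSB first)."""
        result, carry = [], 0
        for a, b in zip(x, y):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result + [carry]

    # Example: 6 + 3 = 9, with bits listed least-significant first.
    print(add_bits([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1] -> 9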