ABSTRACT

Quantum physics joined the mainstream in the 1920s and 1930s with the general acceptance of the theories of Max Planck (1858-1947), Albert Einstein (1879-1955), Niels Bohr (1885-1962), Erwin Schrödinger (1887-1961), Werner Heisenberg (1901-1976), Paul Dirac (1902-1984), and many other established physicists. Quantum computing is built on quantum physics, with all of its special behaviors and unusual limits, and therefore deals with the behavior of atomic and subatomic particles. These behaviors are irreducibly random, and complementary properties of a particle, such as position and momentum, cannot be measured simultaneously to arbitrary precision. That is, the physical phenomena of small particles often do not agree with our classical intuition. This unusual behavior results from features of quantum mechanics called superposition and interference. In the early 1980s, Richard Feynman (1918-1988) noted that there seemed to be fundamental difficulties in simulating quantum mechanical systems on digital computers, and suggested that computers based on the principles of quantum mechanics would overcome those difficulties. Devices that perform quantum information processing are known as quantum computers.
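The measurement limit and the superposition feature mentioned above can be stated compactly. The following is a sketch in standard notation, not drawn from this abstract, where \sigma_x and \sigma_p denote the standard deviations of position and momentum, \hbar is the reduced Planck constant, and \alpha, \beta are complex amplitudes:

\[
\sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2},
\qquad
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\quad |\alpha|^2 + |\beta|^2 = 1.
\]

The first inequality is the Heisenberg uncertainty relation; the second expresses a single qubit as a superposition of the basis states |0⟩ and |1⟩, with measurement yielding 0 or 1 at random with probabilities |\alpha|^2 and |\beta|^2.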