ABSTRACT
Electric and magnetic fields have been known in mathematical
form since the laws of Coulomb and Ampère were discovered in
the late 18th and early 19th centuries. Both laws apply to
macroscopic domains; modern atomic physics was in its infancy at that time.
Both laws express charge separation through a single nonlinear function:
the square root of the sum of the squares of the orthogonal distances
involved in the separation. Within atomic and molecular theory the
Pythagorean concept of distance has been utilized by both classical
electromagnetics and quantum theory. Einstein’s relativity gave
the first hint that in some phenomena, separations in orthogonal
directions do not couple but stay separate. Thus electromagnetic
fields in atoms consist of two fields, each causing the atomic
particles to rotate in orthogonal planes. If charge separation
includes both centres of rotation, electromagnetics can solve
analytically for the motions. In this chapter we examine how classical
electromagnetics failed at the atomic level and why quantum theories
were deemed necessary, coming to dominate 20th-century physics. The
classical fields were responsible for this historical failure. A selective
sweep across scientific knowledge provides a preview of self-field
theory.