ABSTRACT

The definition of the term “video game” remains unsettled as digital entertainment continues to evolve. Broadly speaking, the term encompasses any entertainment experience powered by electronic logic circuits in which a player manipulates an input device to interact with objects presented on a display. By this definition, the development of the first video games coincided with the rise of digital computing in the latter half of the twentieth century. While computers had existed for well over a century by that point, their function had been markedly different and unsuited to playing games. In the eighteenth and early nineteenth centuries, a computer was merely a person performing basic addition and subtraction to complete mathematical tables. 1 In the late nineteenth century, human computers were augmented by analog devices like Lord Kelvin’s tide predictor, which physically simulated specific phenomena through mechanical components such as levers, pulleys, and gears. In the early twentieth century, the analog computer largely became the domain of the electrical engineer, who used room-sized machines full of resistors, capacitors, and inductors to simulate the operation of power grids as the developed world rapidly electrified. Analog computing reached its high-water mark in 1931, when MIT professor and electrical engineer Vannevar Bush completed his differential analyzer, a machine that could solve a wide array of differential equations but remained limited to a relatively narrow class of problems. 2