ABSTRACT

This chapter covers the past and present development of connectionism in three sections: "Roots", "Revolution", and "Radiation". It includes Warren S. McCulloch and Walter Pitts' demonstration that their nets can compute any logical function, Frank Rosenblatt's perceptrons, and Donald Hebb's learning rule. The "Roots" section ends with Marvin Minsky and Seymour Papert's famous complaint that perceptrons and other simple architectures cannot compute certain Boolean functions, such as exclusive-or (XOR). It sets the stage for a discussion of implementational vs. radical interpretations of connectionist modeling. The chapter then describes the innovations that led to the Parallel Distributed Processing (PDP) revolution of the 1980s, including sigmoidal and other activation functions, backpropagation, multi-layer nets, and the introduction of simple recurrence. It explains the new enthusiasm for these models, especially the power of the distributed representations they employ, and ends with the problems that influenced further developments, including the biological implausibility of backpropagation and scaling problems such as catastrophic interference.
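
To make the abstract's two central technical claims concrete, the following sketch (our illustration, not the chapter's) shows the XOR truth table, which Minsky and Papert showed no single threshold unit can compute, being learned by a small multi-layer net of sigmoidal units trained by backpropagation. The architecture and hyperparameters (four hidden units, learning rate, iteration count) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# XOR: a Boolean function beyond any single threshold unit, because its
# positive and negative cases are not linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of four sigmoidal units feeding one sigmoidal output unit.
# (Layer sizes and learning rate are illustrative choices.)
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros(1)

lr = 0.5
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: chain-rule gradients of the squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # typically converges to [[0], [1], [1], [0]]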