ABSTRACT

Though mathematical ideas underpin the study of neural networks, the author presents the fundamentals without the full mathematical apparatus. All aspects of the field are tackled, including artificial neurons as models of their real counterparts; the geometry of network action in pattern space; gradient descent methods, including back-propagation; associative memory and Hopfield nets; and self-organization and feature maps. The traditionally difficult topic of adaptive resonance theory is clarified within a hierarchical description of its operation. The book also includes several real-world examples to provide a concrete focus. This should enhance its appeal to those who design, build and manage networks in commercial environments and who wish to improve their understanding of network simulator packages. As a comprehensive and highly accessible introduction to one of the most important topics in cognitive and computer science, this volume should interest a wide range of readers, both students and professionals, in cognitive science, psychology, computer science and electrical engineering.

chapter 1 (4 pages): Neural networks—an overview

chapter 2 (11 pages): Real and artificial neurons

chapter 3 (9 pages): TLUs, linear separability and vectors

chapter 4 (9 pages): Training TLUs: the perceptron rule

chapter 5 (7 pages): The delta rule

chapter 6 (16 pages): Multilayer nets and backpropagation

chapter 7 (13 pages): Associative memories: the Hopfield net

chapter 8 (19 pages): Self-organization

chapter 9 (12 pages): Adaptive resonance theory: ART

chapter 11 (11 pages): Taxonomies, contexts and hierarchies