ABSTRACT

Neural networks (NNs) are intimately associated with supervised machine learning and, in particular, with deep learning. Yet these concepts are orthogonal and emerged independently, at different times. Historically, the first learning algorithm for NNs came more than a decade after NNs were formulated. This chapter separates the concept of neural networks from the concept of learning from input-output examples. At the center of neural networks lies the concept of a neuron. In its mathematical model, a neuron is the realization of a function that receives N inputs, combines them in a weighted sum, and activates a response when the resulting value meets certain conditions. Neural networks consist of many neurons connected in some fashion. In deep learning, neurons are organized in layers, where neurons in the same layer perform the same function on the same inputs, albeit with different weights.
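
As a minimal sketch of this neuron model (the symbols $x_i$, $w_i$, $\theta$, and $\varphi$ are notational assumptions, not taken from the chapter), a neuron with $N$ inputs $x_1, \dots, x_N$ and weights $w_1, \dots, w_N$ computes
\[
y = \varphi\!\left( \sum_{i=1}^{N} w_i x_i \right),
\]
where the activation function $\varphi$ produces a response only when the weighted sum meets the stated condition, for example the threshold rule $\varphi(s) = 1$ if $s \geq \theta$ and $\varphi(s) = 0$ otherwise.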