Modular Version of Neural Networks for Deep Learning
This chapter takes a somewhat different perspective on the structure of neural networks. It introduces a modular approach, beginning with a review of backpropagation. Until now, each hidden-layer node has consisted of two separate computations: the calculation of Z and the calculation of A. The chapter shows how to divide this programming task into appropriate classes. With the layer as the fundamental object, it becomes easy to create new layers and connect them. The layer class's backpropagation method follows the same design: it accepts information from upstream, processes it, and passes the result downstream. Perhaps human neurons work in a similar way, in the sense that they must have some capacity for learning.
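As a sketch of the modular design described above, the hypothetical class below bundles the two per-node computations (the linear combination Z and the activation A) into a single layer object. The class name, the sigmoid activation, and the learning-rate parameter are illustrative assumptions, not the chapter's exact implementation; the point is the interface: a forward method that computes Z and A, and a backward method that accepts a gradient, processes it, and passes a gradient downstream.

```python
import numpy as np

class DenseLayer:
    """A hypothetical fully connected layer (illustrative sketch)."""

    def __init__(self, n_inputs, n_outputs, lr=0.1):
        rng = np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.1, (n_outputs, n_inputs))
        self.b = np.zeros((n_outputs, 1))
        self.lr = lr

    def forward(self, A_prev):
        # The two separate processes: the linear combination Z,
        # then the activation A.
        self.A_prev = A_prev
        self.Z = self.W @ A_prev + self.b
        self.A = 1.0 / (1.0 + np.exp(-self.Z))  # sigmoid activation
        return self.A

    def backward(self, dA):
        # Accept information (dA from the layer above), process it
        # via the chain rule, and pass dA_prev downstream.
        dZ = dA * self.A * (1.0 - self.A)       # through the sigmoid
        dW = dZ @ self.A_prev.T                 # weight gradient
        db = dZ.sum(axis=1, keepdims=True)      # bias gradient
        dA_prev = self.W.T @ dZ                 # gradient for the layer below
        self.W -= self.lr * dW                  # simple gradient-descent update
        self.b -= self.lr * db
        return dA_prev
```

Because every layer exposes the same accept-process-pass interface, layers can be chained: the output of one layer's `forward` feeds the next, and gradients flow back through the chain of `backward` calls in reverse order.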