Symmetric Weights and Deep Belief Networks
The neuron-based algorithms that we have seen so far have been asymmetric: the values of the inputs and weights determined whether each neuron fired, but the activity of the neurons never fed back to affect the inputs. There is a continuous version of the Hopfield network, which can be adapted to deal with more interesting data, such as grayscale images. The original Boltzmann machine is fully connected, just like the Hopfield network. However, in addition to the neurons whose values can be set from the input, which are termed visible neurons, there are also hidden neurons, which are free to take on any role in the computational model. The restricted Boltzmann machine, in which connections are allowed only between the visible and hidden neurons and not within either layer, performs pattern completion just like the Hopfield network: after training, when a corrupted or partial input is shown to the network, the network relaxes to a nearby low-energy trained state.
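The pattern-completion behaviour described above can be sketched in code. The following is a minimal restricted Boltzmann machine in NumPy, trained with one-step contrastive divergence (CD-1); the class name, layer sizes, learning rate, and number of epochs are illustrative choices, not anything fixed by the text. The visible and hidden layers talk to each other only through the weight matrix `W`, reflecting the "restricted" connectivity, and completion is performed by alternating between the two layers until the reconstruction settles.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """A tiny restricted Boltzmann machine with binary units (illustrative sketch)."""

    def __init__(self, n_visible, n_hidden):
        # Weights connect visible to hidden units only; no within-layer connections.
        self.W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def hidden_probs(self, v):
        # Probability that each hidden neuron fires, given the visible layer.
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        # Probability that each visible neuron fires, given the hidden layer.
        return sigmoid(h @ self.W.T + self.b)

    def train(self, data, lr=0.1, epochs=200):
        # One-step contrastive divergence: compare the data-driven statistics
        # with those of a single reconstruction.
        for _ in range(epochs):
            h_p = self.hidden_probs(data)
            h = (rng.random(h_p.shape) < h_p).astype(float)  # sample hidden states
            v_p = self.visible_probs(h)                       # reconstruct visibles
            h_p2 = self.hidden_probs(v_p)
            self.W += lr * (data.T @ h_p - v_p.T @ h_p2) / len(data)
            self.b += lr * (data - v_p).mean(axis=0)
            self.c += lr * (h_p - h_p2).mean(axis=0)

    def reconstruct(self, v, steps=10):
        # Pattern completion: alternate between layers so the network settles
        # toward a low-energy trained state.
        for _ in range(steps):
            v = self.visible_probs(self.hidden_probs(v))
        return (v > 0.5).astype(float)

# Train on two complementary binary patterns, then present a corrupted input.
patterns = np.array([[1, 1, 1, 0, 0, 0],
                     [0, 0, 0, 1, 1, 1]], dtype=float)
rbm = RBM(n_visible=6, n_hidden=4)
rbm.train(patterns)
corrupted = np.array([1, 1, 0, 0, 0, 0], dtype=float)  # one bit flipped from the first pattern
completed = rbm.reconstruct(corrupted)
```

With luck the completed vector lands on (or near) the stored pattern, but as with any stochastic training run the result depends on the random seed and the number of epochs.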