ABSTRACT

A neural complex, real or artificial, is an embodiment of a massively connected set of neurons; it represents a cellular automaton "trained to learn" and predict through endeavors managed by a set of protocols involving the collection, conversion, transmission, storage, and retrieval of information. The training or learning effort is to recognize and counterbalance the effects of cellular disturbances present in the neural system, which may tend to disorganize the system's convergence toward an objective function mediated through learning protocols. "Complexity" vis-à-vis neural networks generally refers to the strain on the computational power of the network as dictated by architectural considerations and by the spatiotemporal aspects of neural-state proliferation across the network. The architectural aspect of neural complexity, in reference to artificial neural networks, has also been specified in terms of the "size" of the network, namely the number of cellular elements, and the "depth" of the network, namely the longest path from any input gate(s) to the output gate(s).
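The size and depth measures mentioned above can be made concrete by viewing a feed-forward network as a directed acyclic graph. The following sketch (an illustration, not part of the paper; the function name and graph representation are assumptions) counts the gates and computes the longest input-to-output path:

```python
# Illustrative sketch: "size" = number of cellular elements (nodes),
# "depth" = longest path, in edges, from any input gate to the output gate.
from collections import defaultdict

def network_size_and_depth(edges, inputs, output):
    """edges: list of directed (u, v) pairs; inputs: input gates; output: output gate."""
    adj = defaultdict(list)
    nodes = set(inputs) | {output}
    for u, v in edges:
        adj[u].append(v)
        nodes.update((u, v))

    memo = {}
    def longest(u):
        # Longest path from gate u to the output gate; -inf if unreachable.
        if u == output:
            return 0
        if u not in memo:
            memo[u] = max((1 + longest(v) for v in adj[u]), default=float("-inf"))
        return memo[u]

    depth = max(longest(i) for i in inputs)
    return len(nodes), depth

# A toy network: two input gates, two hidden gates, one output gate.
edges = [("x1", "h1"), ("x2", "h1"), ("x1", "h2"), ("h1", "h2"), ("h1", "y"), ("h2", "y")]
print(network_size_and_depth(edges, inputs=["x1", "x2"], output="y"))  # (5, 3)
```

Here the size is 5 gates and the depth is 3, reached along the path x1 → h1 → h2 → y.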