ABSTRACT

Noisy data are characterized by their noise level, a measure of the noise-to-signal ratio. The noise intensity in the data plays an important role in obtaining a deep minimum of the external criterion. This chapter draws an analogy between the basic concepts of information theory and those of self-organization theory as applied to the identification of processes. The main purpose of the analogy is to show that basic ideas can be exchanged between the two theories. The greater the signal power relative to the noise variance, the greater the attainable transmission capacity. Shannon's second theorem thus establishes a bound on the transmission capacity of a communication system that is attainable by an optimal choice of the coding method and channel bandwidth. The properties of a communication system are determined by the value of its redundancy. All external criteria of quadratic form used in inductive algorithms can be grouped into accuracy criteria and matching (consistency) criteria.
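The chapter's abstract does not fix a formula for the noise level; a common convention, sketched below, takes it as the ratio of the noise variance to the signal variance. The function name, the sine test signal, and the noise scale are illustrative assumptions, not taken from the chapter.

    import numpy as np

    def noise_level(signal: np.ndarray, observed: np.ndarray) -> float:
        """Noise-to-signal ratio: variance of the noise relative to variance of the signal."""
        noise = observed - signal
        return float(np.var(noise) / np.var(signal))

    # Illustrative example: a sine signal corrupted by Gaussian noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0 * np.pi, 500)
    clean = np.sin(t)
    observed = clean + rng.normal(scale=0.3, size=t.shape)
    print(f"noise level ~ {noise_level(clean, observed):.2f}")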
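The capacity statement corresponds to the standard Shannon-Hartley bound, and redundancy has its usual information-theoretic definition; in standard notation (not reproduced from the chapter), with channel bandwidth W, signal power P, noise power N, source entropy H, and maximum attainable entropy H_max:

    C = W \log_2\!\left(1 + \frac{P}{N}\right), \qquad R = 1 - \frac{H}{H_{\max}}

The first formula shows why the attainable capacity grows with the signal-to-noise ratio P/N, and the second quantifies the redundancy referred to in the abstract.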
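For the two families of quadratic external criteria, the usual representatives in the GMDH literature are the regularity criterion (an accuracy criterion, scored on a checking subsample) and the minimum-bias criterion (a matching criterion, comparing models estimated on two subsamples). The sketch below is a minimal illustration assuming linear least-squares candidate models and subsamples A and B; the exact normalizations vary across sources.

    import numpy as np

    def regularity_criterion(X_A, y_A, X_B, y_B):
        # Accuracy type: estimate the model on subsample A, score squared error on checking subsample B.
        theta_A, *_ = np.linalg.lstsq(X_A, y_A, rcond=None)
        resid = y_B - X_B @ theta_A
        return float(resid @ resid / (y_B @ y_B))

    def consistency_criterion(X_A, y_A, X_B, y_B):
        # Matching type: compare outputs of the models estimated on A and on B over the whole sample.
        theta_A, *_ = np.linalg.lstsq(X_A, y_A, rcond=None)
        theta_B, *_ = np.linalg.lstsq(X_B, y_B, rcond=None)
        X = np.vstack([X_A, X_B])
        y = np.concatenate([y_A, y_B])
        diff = X @ (theta_A - theta_B)
        return float(diff @ diff / (y @ y))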