Learning in the real neural complex refers to progressive modifications occurring at the synapses of interconnected neurons. Inherent intraneural disturbances, or extraneural noise present in the input data and/or in the teacher values, may however perturb these synaptic modifications, which are specified by the set of weighting vectors of the interconnections. Translated to artificial neurons, such noise induces an offset in the convergence performance of the network as it strives, via the supervised learning procedure implemented, to reach the goal or objective value; the inevitable presence of intraneural disturbances thus invariably affects the network's convergence towards equilibrium. Further, the dynamic response of a learning network when the target itself changes with time can be studied in the information-theoretic plane.
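The convergence offset induced by noise in the teacher values can be illustrated with a minimal simulation sketch. The example below is not from the source: it assumes a single linear neuron trained by the delta (LMS) rule toward an arbitrary true weight, with zero-mean Gaussian noise added to the teacher signal; all names and parameter values are illustrative.

```python
import random

def train_neuron(noise_std, steps=5000, lr=0.05, seed=0):
    """Train a single linear neuron y = w * x toward a true weight
    w_true = 2.0, adding zero-mean Gaussian noise of standard
    deviation noise_std to the teacher values. Returns the final
    learned weight (hypothetical illustration, not the source's model)."""
    rng = random.Random(seed)
    w_true, w = 2.0, 0.0
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)
        teacher = w_true * x + rng.gauss(0.0, noise_std)  # noisy teacher value
        error = teacher - w * x
        w += lr * error * x  # delta-rule (LMS) synaptic update
    return w

w_clean = train_neuron(noise_std=0.0)
w_noisy = train_neuron(noise_std=0.5)
# With a noise-free teacher the weight settles essentially exactly at
# w_true; with a noisy teacher it only hovers near w_true, exhibiting
# the residual offset in convergence described above.
```

The qualitative point is that the noisy-teacher run does not converge to the equilibrium weight but fluctuates about it, the fluctuation scaling with the learning rate and the teacher-noise variance.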