ABSTRACT

Feeding an input U drawn from a rectangular (uniform) probability distribution into a Γ variant does not necessarily yield an output Yobs that is also rectangular; almost always it is not. Hence the output entropy is less than the input entropy, because the input entropy is by definition the maximum. As the behaviour of a system under maximum uncertainty is a limiting case of both theoretical and empirical interest, it is a starting point for simulations of very wide generality. A Gaussian distribution of inputs is also, of course, a special case, one espoused in Signal Detection Theory, but here a rectangular distribution has most often been used for three reasons: (i) it is easy to create with standard programming languages; (ii) being maximum entropy, it creates the widest range of input conditions equiprobably and thus imposes a maximal load on the system as an adaptive process model; and (iii) it comes close to common laboratory practice in psychophysical experiments, so that any comparison between the maximum entropy case and one of lower entropy is indirectly a test of the ecological validity of the laboratory situation, and a prediction of how behaviour will differ in moving from the laboratory to the field. We can only study how a nonlinear system degrades entropy as a function of the system parameters {a, e, η} if the input is not already locally degraded in some imperfectly understood way.
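
As a concrete illustration of the entropy-degradation claim, the sketch below passes a rectangular (uniform) sample through a nonlinear transform and compares histogram entropies of input and output. The transform and the parameter value a = 3.5 are hypothetical stand-ins for illustration only, not the Γ recursion or the parameter set {a, e, η} of the paper.

```python
# Minimal sketch (not the paper's Gamma recursion): a rectangular input U is
# mapped through a stand-in nonlinear transform, and the discretized
# (histogram) entropy of input and output are compared. The output entropy
# is typically lower, since the image of a uniform density under a
# non-measure-preserving nonlinear map is no longer uniform.
import math
import random

def histogram_entropy(xs, bins=32):
    """Shannon entropy (bits) of a sample, discretized into equal-width bins."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

random.seed(1)
u = [random.uniform(0.0, 1.0) for _ in range(100_000)]   # rectangular input U

a = 3.5                                                   # illustrative gain parameter (assumed)
y = [a * x * (1.0 - x) * (1.0 - x) for x in u]            # stand-in nonlinear transform (assumed)

print(f"input entropy  ~ {histogram_entropy(u):.3f} bits (near log2(32) = 5)")
print(f"output entropy ~ {histogram_entropy(y):.3f} bits (lower: output is non-uniform)")
```

The same comparison, applied with the actual Γ variant in place of the stand-in transform, is the kind of simulation the abstract refers to: uniform input fixes the input entropy at its maximum, so any drop measured at the output is attributable to the system parameters rather than to structure already present in the input.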