ABSTRACT

In recognition paradigms, increasing the number of occurrences of some words in a study list, or their presentation time, improves performance on those words (the item strength effect) but does not affect performance on other words (the null list strength effect). In contrast, adding new items results in a deterioration of performance on the other words (the list length effect). Taken together, these results place strong constraints on models of recognition memory. To explain these data, an account based on optimisation to the environment is presented. A summary is given of environmental analyses which suggest that (1) the likelihood of recurrence of a word within a context increases as the number of occurrences increases; (2) the repetition rates of other words in a context have no significant effect on the recurrence probability of a word; and (3) the recurrence probability of a word drops as a function of the number of words since the last occurrence of that word. A training set reflecting these constraints was constructed and presented to an optimising connectionist network designed to extract recurrence statistics (the Hebbian Recurrent Network). The resultant model is able to account for all three of the effects outlined above.