ABSTRACT

How does the brain reliably represent the timing of visual information in a dynamic environment? Because the features defining an object are processed at different rates, feature attributes belonging to the same object may not be processed simultaneously in cortex; in other words, visual information may become desynchronized. This problem can be avoided, however, if a population code is used to represent the timing of visual information. Using a population code for timing requires defining temporal frames that localize the neural activities in time. Perceptual framing is a process that resynchronizes object features by synchronizing the cortical activities corresponding to the same visual object, and thus helps establish a population code. A neural network model is presented in which desynchronized neural activities can rapidly be resynchronized. The model thereby shows how psychophysical data can be explained through neural mechanisms: simulations of the model quantitatively explain perceptual framing data, including psychophysical data on temporal order judgments. The properties of the model arise when fast long-range cooperation and slow short-range competition interact via nonlinear feedback among cells that obey shunting equations.
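To make the final sentence concrete, the following is a minimal sketch of a single generic Grossberg-style shunting equation, not the paper's full cooperative-competitive network; the parameter names and values (A, B, C, the inputs E and I) are illustrative assumptions. It shows the key property of shunting dynamics: multiplicative gating keeps cell activity bounded no matter how strong the inputs are.

```python
# Illustrative sketch only: one cell obeying a generic shunting equation,
#   dx/dt = -A*x + (B - x)*E - (x + C)*I
# All parameters are assumed values for demonstration, not the paper's.

def shunting_step(x, E, I, A=1.0, B=1.0, C=0.5, dt=0.01):
    """One Euler step of the shunting equation.

    -A*x      : passive decay toward the resting level 0
    (B - x)*E : excitatory input E, shunted so x cannot exceed B
    (x + C)*I : inhibitory input I, shunted so x cannot fall below -C
    """
    return x + dt * (-A * x + (B - x) * E - (x + C) * I)

def simulate(E, I, steps=2000):
    """Integrate to (approximate) equilibrium from rest."""
    x = 0.0
    for _ in range(steps):
        x = shunting_step(x, E, I)
    return x

# Even a very strong excitatory input drives x close to, but never past, B = 1:
x_eq = simulate(E=50.0, I=0.0)
print(round(x_eq, 3))
```

At equilibrium this equation gives x = (B*E - C*I) / (A + E + I), so activity automatically normalizes with respect to total input, a property the full network exploits when cooperative and competitive signals feed back onto such cells.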