ABSTRACT

Much of B. F. Skinner’s empirical research demonstrated the effects of different intermittent schedules of reinforcement on the cumulative response patterns of pigeons and rats (cf. Ferster & Skinner, 1957). There is an infinite number of possible intermittent schedules between the extremes of reinforcing no responses and reinforcing every response. How can we organize the possibilities in a meaningful way? Skinner developed a useful schema for their categorization based on two considerations.

The most fundamental distinction was between schedules requiring that a certain number of responses be completed (ratio schedules) and those requiring only a single response when the opportunity was presented (interval schedules). I often ask my classes if they can provide “real-life” examples of ratio and interval schedules. One astute ROTC student suggested that officers try to get students to believe that promotions occur according to ratio schedules (i.e., based on how often you do the right thing), when in reality they occur on the basis of interval schedules (i.e., doing the right thing when an officer happened to be observing or found out about it). Working on commission is a ratio contingency: the more items you sell, the more money you make. Calling a friend is an interval contingency: it does not matter how often you try if the person is not home, and only one call is necessary if the friend is available.

The second distinction Skinner made is based on whether the response requirement (in ratio schedules) or the time requirement (in interval schedules) is constant. Contingencies based on constants were called “fixed,” and those for which the requirements changed were called “variable.” These two distinctions define the four basic reinforcement schedules (see Figure 9.5): fixed ratio (FR), variable ratio (VR), fixed interval (FI), and variable interval (VI).