ABSTRACT

This chapter describes the effects of some of the more basic schedules of reinforcement. In some observers' eyes, studying the effects of different schedules for their own sake has become the hallmark of the operant conditioner. With interval schedules of reinforcement, a maximum frequency of reinforcement is specified. Such schedules are defined in terms of fixed or variable intervals. A fixed interval schedule specifies that a fixed period of time must elapse from the delivery of one reinforcer to the availability of the next. After reinforcement, there is typically a pause, as with fixed ratio schedules; however, this is followed not by the sudden transition to a high response rate seen with fixed ratio schedules, but by a gradual, smoothly accelerating increase in response rate. With a variable interval schedule of reinforcement, the delay between the delivery of one reinforcer and the availability of the next varies from one occasion to the next around a specified average.
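For readers who think procedurally, the interval contingency can be expressed as a short simulation. The sketch below is illustrative rather than taken from the chapter: the function names, the 60-second schedule value, the exponential distribution of variable intervals, and the steady 2-second response rate of the hypothetical subject are all assumptions made for the example. It shows the defining property stated above, namely that an interval schedule caps the maximum frequency of reinforcement at roughly one reinforcer per mean interval, regardless of how fast the subject responds.

```python
import random

def simulate_interval_schedule(interval_fn, response_times):
    """Return the times at which responses are reinforced.

    interval_fn() gives the delay that must elapse after one reinforcer
    before the next becomes available; the first response emitted after
    that point collects it. Because availability is timed rather than
    counted, reinforcement frequency is capped at about 1 / mean interval.
    """
    reinforced = []
    available_at = interval_fn()  # first reinforcer set up after one interval
    for t in sorted(response_times):
        if t >= available_at:     # a reinforcer is waiting: this response collects it
            reinforced.append(t)
            available_at = t + interval_fn()  # timing for the next one starts now
    return reinforced

# Fixed interval (FI 60 s): every interval is exactly 60 s.
fi_60 = lambda: 60.0

# Variable interval (VI 60 s): intervals vary around a 60 s mean
# (an exponential distribution is one common modelling choice).
vi_60 = lambda: random.expovariate(1 / 60.0)

# Hypothetical subject responding once every 2 s for a 10-minute session.
responses = [t * 2.0 for t in range(300)]
print(len(simulate_interval_schedule(fi_60, responses)))  # at most 10 reinforcers
print(len(simulate_interval_schedule(vi_60, responses)))  # about 10, varying by run
```

Note that under both schedules the 300 responses earn only about ten reinforcers in the 600-second session; responding faster would not raise that number, which is the sense in which the schedule specifies a maximum reinforcement frequency.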