Four experiments were conducted with pigeons to assess the experimental conditions necessary for the occurrence of resurgence. The general procedure consisted of the following conditions: Condition 1, reinforcement of key pecking; Condition 2, reinforcement of treadle pressing and concurrent extinction of key pecking; and Condition 3, the resurgence condition, wherein resurgence was defined as the recovery of key pecking. In Experiments 1 and 2, the resurgence condition was conventional extinction. The effect of recency on resurgence magnitude was examined in Experiment 1 by manipulating the number of sessions of Condition 2. Resurgence was not a function of recency with the parameters used. Repeating the three conditions in Experiment 2 revealed resurgence to be a repeatable effect. In Experiment 3, a variable-time schedule was in effect for the resurgence condition. Resurgence was not produced by response-independent food delivery. In Experiment 4, the resurgence condition was a variable-interval schedule for treadle pressing that arranged a lower reinforcement rate than in Condition 2 (a 92% reduction in reinforcers per minute). Resurgence was lower in magnitude relative to conventional extinction, although resurgence was obtained with 2 of 3 pigeons. The results are discussed in terms of the variables controlling resurgence and the relations between behavioral history, resurgence, and other forms of response recovery.
Pigeons were exposed to two different reinforcement schedules under different stimulus conditions in each of two daily sessions separated by 6 hr (Experiments 1 and 2) or in a single session (Experiment 3). Following this, either a fixed-interval (Experiment 1) or a variable-interval schedule (Experiments 2 and 3) was effected in both stimulus conditions. In the first two experiments, exposure to fixed-ratio or differential-reinforcement-of-low-rate schedules led to response-rate, but not pattern, differences in subsequent performance on fixed- or variable-interval schedules that persisted for up to 60 sessions. The effects of reinforcement-schedule history on fixed-interval schedule performance generally were more persistent. In Experiment 3, a history of high and low response rates in different components of a multiple schedule resulted in subsequent response-rate differences under identical variable-interval schedules. Higher response rates initially occurred in the component previously correlated with high response rates. For 3 of 4 subjects, the differences persisted for 20 or more sessions. Previous demonstrations of behavioral history effects have been confined largely to between-subject comparisons. By contrast, the present results demonstrate strong behavioral effects of schedule histories under stimulus control within individual subjects.
Discrete responses of experimentally naive, food-deprived White Carneaux pigeons (key pecks) or Sprague-Dawley rats (bar or omnidirectional lever presses) initiated unsignaled delay periods that terminated with food delivery. Each subject first was trained to eat from the food source, but no attempt was made to shape or to otherwise train the response. In both species, the response developed and was maintained. Control procedures excluded the simple passage of time, response elicitation or induction by food presentation, type of operandum, food delivery device location, and adventitious immediate reinforcement of responding as the basis for the effects. Results revealed that neither training nor immediate reinforcement is necessary to establish new behavior. The conditions that give rise to both the first and second response are discussed, and the results are related to other studies of the delay of reinforcement and to explanations of behavior based on contingency or correlation and contiguity.
Three pigeons responded on several tandem variable-interval fixed-time schedules in which the value of the fixed-time component was varied to assess the effects of different unsignalled delays of reinforcement. Actual (obtained) delays between the last key peck in an interval and reinforcement were consistently shorter than the nominal (programmed) delay. When nominal delays were relatively short, response rates were higher during the delay condition than during the corresponding nondelay condition. At longer nominal delay intervals, response rates decreased monotonically with increasing delays. The results were consistent with those obtained from delay-of-reinforcement procedures that impose either a stimulus change (signal) or a no-response requirement during the delay interval.