The presentation of an incentive generates a small amount of arousal that decays exponentially over time. If the time interval separating successive incentives is short enough, arousal cumulates to an equilibrium level that is predictable from decay constants derived from the presentation of isolated incentives. The accumulation of arousal accounts for the "excessive" nature of schedule-induced, or adjunctive, behaviors.

These are revolutionary times for behavior scientists. At least three major lines of research have developed over the last decade in response to behavioral anomalies: adjunctive behaviors (Falk, 1972), species-specific constraints on learning (Hinde & Stevenson-Hinde, 1973; Seligman & Hager, 1972), and sign tracking (Hearst & Jenkins, 1974; Schwartz & Gamzu, 1977). Proponents of each have noted the inconsistency of their subdiscipline with traditional theories of learning (cf. Bolles, 1976). The present article concerns adjunctive, or schedule-induced, behavior and attempts to provide a rational basis for those "existentially absurd" (Falk, 1972) responses. Falk (1961) was the first to demonstrate that intermittent food schedules induce excessive drinking (polydipsia). He found that rats consumed over half their body weight in water over a 3-hour period, whereas they normally consume less than 10% over a 24-hour period. Since Falk's initial demonstration of polydipsia, a host of other schedule-induced behaviors have been identified (for review, see Falk, 1972; Staddon, 1977; Wallace & Singer, 1976). Many explanatory hypotheses, such as superstitious conditioning, have been advanced and subsequently eliminated. The most attractive rationale has been provided by Staddon ...
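The cumulative-arousal account lends itself to a short numerical illustration. The sketch below (Python) treats each incentive as adding a fixed quantum of arousal that then decays exponentially until the next presentation; with a short enough inter-incentive interval the level converges on an equilibrium A* = a / (1 - e^(-lambda*T)), where a is the increment per incentive, lambda the decay rate, and T the inter-incentive interval. The parameter values are illustrative assumptions, not values taken from the article.

```python
import math

def arousal_trajectory(n_incentives, interval, decay_rate, increment=1.0):
    """Arousal jumps by `increment` at each incentive and decays
    exponentially (per-second rate `decay_rate`) between incentives.
    Returns the level just after each presentation."""
    levels = []
    arousal = 0.0
    for _ in range(n_incentives):
        arousal += increment                          # incentive adds a quantum of arousal
        levels.append(arousal)
        arousal *= math.exp(-decay_rate * interval)   # exponential decay until the next incentive
    return levels

# Illustrative parameters: decay rate 0.02/s, one incentive every 30 s.
decay_rate, interval = 0.02, 30.0
levels = arousal_trajectory(50, interval, decay_rate)

# Equilibrium (just after a presentation) predicted from the decay constant alone.
equilibrium = 1.0 / (1.0 - math.exp(-decay_rate * interval))
print(f"after 50 incentives: {levels[-1]:.2f}   predicted equilibrium: {equilibrium:.2f}")
```

With these example numbers the simulated level settles at roughly 2.2 times the single-incentive increment, which is the sense in which closely spaced incentives produce "excessive" arousal relative to an isolated presentation.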
Animals will perform an operant response to obtain food when abundant free food is available. These data have implications for current learning theories, especially in terms of the motivational variables associated with such behavior. The present paper reviews the literature and provides an analysis suggesting that responding for food in the presence of free food is importantly controlled by stimulus change attendant upon response-dependent food presentation. This apparent stimulus-reinforcer effect on behavior is compared to that observed in other areas of animal learning research, including preference between schedules of response-dependent and response-independent reinforcement, preference between schedules of signaled and unsignaled reinforcement, autoshaping and automaintenance, and self-reinforcement in animals.

When an animal is provided a choice between making an operant response to obtain food and eating it from a source of continuously available free food, a curious thing happens. Animals not only acquire and continue to make the operant response, but they often appear to "prefer" the response-dependent food. That animals respond for food in the presence of identical free food is by now a well-established finding in the experimental literature.
The effects of varying amounts of reinforcement on the rate and temporal patterns of interim and terminal behaviors engendered by periodic feeding schedules were assessed in three experiments. Although the experiments differed in terms of species, response-reinforcement dependencies, and response measures, the results were consistent. Increases in amount of reinforcement appeared to have a general energizing effect on behavior. For pigeons, interim levels of general activity and terminal rates of key pecking increased uniformly with increases in the duration of reinforcement (Experiments 1 and 2). For rats, nondirected interim activity and terminal activity directed at the food source reliably increased with increases in the magnitude of reinforcement (Experiment 3). The temporal patterns of interim activities were well represented by a simple mathematical model, and there was a correspondence in the parameter changes of the model as amount of reinforcement was varied. Arousal, as indexed by the predicted asymptotic levels of activity, was shown to be an increasing linear function of amount of reinforcement for both rats and pigeons. The present data challenge previous research that suggests that amount of reinforcement has little or no effect on behavior and show that average-response-rate measures fail to accommodate the nonlinear changes in responding that result when amount of reinforcement is varied.

Reliable and systematic relationships have been found between rate of responding and frequency and delay of reinforcement, but inconclusive and apparently inconsistent results have been obtained when the magnitude of reinforcement has been varied. Some early investigations showed that the rate at which rats pressed a lever or ran an alleyway was an increasing function of amount of reinforcement (Crespi, 1942; Guttman, 1953; Zeaman, 1949). However, other research has shown either that changes in amount of reinforcement have little effect on rate of responding (Catania, 1963; Keesey & Kling, 1961; Schrier, 1962) or that response rate ...
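As a reading aid, the snippet below shows the kind of summary the abstract describes: regressing the model's predicted asymptotic activity levels on reinforcer amount and checking that the relation is increasing and roughly linear. The numbers are invented for illustration; they are not data from the three experiments.

```python
import numpy as np

# Hypothetical asymptotic activity levels (the arousal index) at four
# reinforcer amounts; purely illustrative, not the experiments' data.
amount = np.array([1.0, 2.0, 4.0, 8.0])         # e.g., seconds of food access
asymptote = np.array([12.0, 22.0, 41.0, 80.0])  # predicted asymptotic activity

# Ordinary least-squares line: arousal as a linear function of amount.
slope, intercept = np.polyfit(amount, asymptote, 1)
pred = slope * amount + intercept
r2 = 1.0 - np.sum((asymptote - pred) ** 2) / np.sum((asymptote - asymptote.mean()) ** 2)
print(f"arousal ~ {slope:.1f} * amount + {intercept:.1f}  (R^2 = {r2:.3f})")
```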
Rats bar pressed for food on a reinforcement schedule in which every response was reinforced, even though a dish of pellets was present. Initially, auditory and visual stimuli accompanied response-produced food presentation. With stimulus feedback as an added consequence of bar pressing, responding was maintained in the presence of free food; without stimulus feedback, responding decreased to a low level. Auditory feedback maintained slightly more responding than did visual feedback, and both together maintained more responding than did either separately. Almost no responding occurred when the only consequence of bar pressing was stimulus feedback. The data indicated conditioned and sensory reinforcement effects of response-produced stimulus feedback.
Pigeons were exposed to a two-component multiple fixed-ratio X fixed-ratio Y schedule of reinforcement in which X was always less than Y. Components were equal in duration and alternated, with component durations that varied between 2 sec and 23.6 h. Relative response rate in the FR X component (1) increased as the duration of components increased between 2 sec and 16 min, (2) was at a maximum between 16 min and 6 h, and (3) decreased as the duration of components increased from 6 h to 23.6 h. The changes in relative response rate were attributable primarily to changes in absolute response rates during the FR Y schedule, as absolute response rates during the FR X schedule were relatively invariant. These results pose complexities for several theoretical formulations.

In a multiple (mult) schedule, two or more independent reinforcement schedules are alternated and each of the component schedules is accompanied by a different stimulus. Although the four most common simple schedules, fixed interval (FI), fixed ratio (FR), variable interval (VI), and variable ratio (VR), have all been studied as components of mult schedules, VI schedules have received the most attention. The comparison of different schedule combinations, although important in its own right, is also relevant to several attempts to integrate data from a wide range of experimental procedures (Catania, 1973; Herrnstein, 1970, 1979; Rachlin, 1973; Shimp, 1969, 1973). Most of the data cited by these investigators, in support of their respective positions, come from experiments employing interval, rather than ratio, schedules. Data from ratio schedules would appear to permit an assessment of the power and generality of these formulations. It is easy to see, however, why data from mult or concurrent (conc) ratio schedules are seldom included in these analyses. In the first place, the few studies that have examined responding on mult FR FR schedules may not be comparable to those involving mult VI VI schedules. For example, the transition from one component to the next in a mult VI VI has been response independent (Lander & Irwin, 1968; Nevin & Shettleworth, 1966; Reynolds, 1963; Shimp & Wheatley, 1971; Todorov, 1972); that is, the components of a two-component mult schedule alternate at some specified interval, independently of the animal's behavior. With mult FR FR, however, the transition has been response dependent, since the components typically alternate after some specified number of reinforcements (Crossman, 1968, 1971; Findley, 1962; Schuster, 1959). More importantly, the relation between responding and reinforcement rate, of central interest on interval schedules, is not free to vary on ratio schedules. However, experiments by Shimp and Wheatley (1971) and Todorov (1972) with mult VI VI schedules suggest a promising procedure for studying behavior maintained by mult FR FR reinforcement schedules and for meaningfully comparing such behavior with that reported in the mult VI VI case. These studies examined the rates of responding on a mult VI VI ...
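Assuming the standard definition of relative response rate (responses per unit time in the FR X component divided by the summed rates in both components; the abstract does not spell the formula out), the short sketch below shows how a drop in the FR Y rate raises the relative FR X measure even when the absolute FR X rate is unchanged, which is the pattern the abstract reports. The numbers are illustrative only.

```python
def relative_rate(rate_x: float, rate_y: float) -> float:
    """Relative response rate in the FR X component: rate_x / (rate_x + rate_y)."""
    return rate_x / (rate_x + rate_y)

# FR X rate held constant while the FR Y rate falls as component duration
# grows, which pushes the relative measure upward.
print(relative_rate(60.0, 40.0))  # shorter components -> 0.60
print(relative_rate(60.0, 15.0))  # longer components  -> 0.80
```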