Chapter 7: Schedules of Reinforcement
FI 4
Fixed interval schedule: 4 minutes must pass before a peck is reinforced. Pecks made before the 4 minutes are up are not reinforced, so animals learn to wait about 4 minutes before responding.
John spent his summer picking cantaloupes for a farmer. The farmer paid John a certain amount for every basket of cantaloupes picked. John worked on a _________.
fixed ratio schedule
A teacher has a student who gives up at the first sign of difficulty. How can the teacher increase the child's persistence?
Answers should make use of stretching the ratio.
In one form of the matching law, BA stands for the behavior under consideration and B0 represents _______.
all behaviors other than BA
In CRF, the ratio of reinforcers to responses is 1 to 1; in FR 1, the ratio is
1 to 1
Discrimination hypothesis
A hypothesis about PRE that says extinction takes longer because it is harder to distinguish between extinction and an intermittent schedule than it is to distinguish between extinction and CRF
stretching the ratio
A procedure in which the number of responses required for reinforcement is gradually increased. Example: from CRF to FR 3, FR 5, FR 8, FR 12, and so on. Can also be applied to other kinds of schedules.
Explain why fatigue is not a good explanation for postreinforcement pauses
Answers should note that more demanding (fatiguing) schedules do not necessarily produce longer pauses than less demanding schedules. Students might also argue that the fatigue explanation is circular.
An FR1 schedule is also called _____.
CRF/continuous reinforcement
Fifteen-year-old David gives up easily in the face of frustration. How could you develop his persistence?
David should be rewarded on an intermittent schedule and learn that continuing a behavior even while frustrated can result in reinforcement.
Ratio strain
Disruption of the pattern of responding due to stretching the ratio of reinforcement too abruptly or too much. The same concept applies to interval schedules.
The schedule that is likely to produce a cumulative record with scallops is the _________. a. FR schedule b. VR schedule c. FI schedule d. VI schedule
FI schedule
True/False: In VI schedules, the reinforcer occurs periodically regardless of what the organism does.
False
True/False: In a multiple schedule, the organism is forced to choose between two or more reinforcement schedules.
False
True/False: Although important, the matching law is restricted to a narrow range of species, responses, reinforcers, and reinforcement schedules.
False
In a tandem schedule, behavior is performed during a series of schedules, but food reinforcement comes only at the end of the last schedule. What reinforces behavior during the earlier schedules when food is not provided?
It could be argued that the food reinforcement reinforces all performances. However, students should mention that each change from one schedule to the next brings the subject closer to food reinforcement and may therefore be reinforcing.
concurrent schedule
Kind of compound reinforcement schedule in which 2+ schedules are available at once. Involves a choice. A pigeon can peck a red disk on a VR50 schedule or a yellow disk on a VR20 schedule.
VI 2 min
Reinforcement becomes available after an average of 2 minutes, but the elapsed time varies from trial to trial. Example: during an oil change you're given an estimate of 30 minutes; sometimes it takes that long, sometimes it doesn't.
A reduction in response rate following reinforcement is called a _________.
Post reinforcement pause
extinction
Previously reinforced behavior is no longer followed by reinforcers. Extinction resembles an FR schedule requiring an infinite number of responses, and it can take a long time; one extinction trial is not equal to one reinforcement trial.
___________ schedules differ from other schedules in that the rules describing the contingencies change systematically.
Progressive
Pause-and-run pattern of behavior
A pause following reinforcement, then a quick, steady burst of responses (typical of FR schedules).
run rate
Rate at which behavior occurs once it has resumed following reinforcement.
Variable ratio (VR) schedule
Schedule in which reinforcement is delivered after a variable number of responses around some average. The number of responses required differs from trial to trial, so it is unpredictable; the learner cannot guess when the reward will be delivered.
VR Performance
Steady performance, with run rates comparable to FR schedules. Post-reinforcement pauses may occur, but they appear less frequent and shorter than in FR schedules. Pauses are influenced by the size of the average ratio and by the lowest ratio; e.g., an average requirement of 40 presses before reinforcement will produce longer pauses than a requirement of 10.
A rat's lever pressing is on a concurrent VI 5" VI 15" schedule. Describe the rat's behavior
Students should indicate that for every lever press on the VI 15" schedule, there will be about three responses on the VI 5" schedule.
Post reinforcement pauses
The pauses that follow reinforcement. Different schedules produce different pauses. Pauses reduce the total amount of reinforcement the animal gets
Fixed interval (FI) schedule
Time is constant from one reinforcer to the next. It is predictable. Instrumental responses are required for reinforcement delivery.
True/False: Extinction often increases the variability of behavior.
True
True/False: The thinner of two schedules, VR 5 and VR 10, is VR 10.
True
True/False: Extinction often increases the frequency of emotional behavior.
True
True/False: Harlan Lane and Paul Shinkman put a college student's behavior on extinction following VI reinforcement. The student performed the behavior 8,000 times without reinforcement.
True
True/False: The more effort a behavior requires, the fewer times the behavior will be performed during extinction.
True
True/False: When a response is placed on extinction, there is often an increase in emotional behavior.
True
True/False: One difference between FT and FI schedules is that in FT schedules, reinforcement is not contingent on a behavior.
True
True/False: The response unit hypothesis suggests that there really is no such thing as the partial reinforcement effect.
True
True/False: When food is the reinforcer, it is possible to stretch the ratio to the point at which an animal expends more energy than it receives.
True
True/False: One effect of the extinction procedure is an increase in the variability of behavior.
True
True/False: One everyday example of a VR schedule is the lottery.
True
extinction burst
An abrupt increase in the rate of a behavior at the start of extinction. Example: a child throws a tantrum, the parent doesn't give in, and the child cries even harder.
Pre-ratio pauses
another name for post-reinforcement pauses
After a reinforcement, the rate of the reinforced behavior may fall to or near zero before increasing again. The period during which the behavior occurs infrequently is called a _____ pause.
between-ratio
__________ refers to the point at which a behavior stops or its rate falls off sharply.
break point
Often the initial effect of an extinction procedure is an increase in the behavior called a(n) extinction ________.
burst
The immediate effect of extinction is often an abrupt increase in the rate of the behavior being extinguished.This is called an extinction ______.
burst
Choice involves ____ schedules.
concurrent
Stanley wants to determine which of two reinforcement schedules is more attractive to rats. He trains a rat to press a lever for food, and then puts the rat into an experimental chamber containing two levers. Pressing one lever produces reinforcement on an FR 10 schedule; pressing the other lever produces reinforcement on an FI 10" schedule. Lever pressing is on a _________.
concurrent schedule
Studies of choice involve _________.
concurrent schedules
CRF stands for ________
continuous reinforcement
If reinforcement is contingent on the behavior of more than one individual, a ______ schedule is in effect.
cooperative
A schedule in which reinforcement is contingent on the behavior of more than one subject is a _________.
cooperative schedule
2 types of interval schedules
Interval schedules deliver reinforcement only for a response made after a certain amount of time has passed. The two types are fixed interval (FI) and variable interval (VI).
According to the ________ hypothesis, the PRE occurs because it is difficult to distinguish between intermittent reinforcement and extinction
discrimination
The frustration and sequential hypotheses are both variations of the ______ hypothesis.
discrimination
In a _____ schedule, reinforcement is contingent on the continuous performance of a behavior for some period of time.
fixed duration
The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis.
frustration
Shirley trains a rat to press a lever and then reinforces lever presses on an FR 10 schedule when a red light is on, and an FI 10" schedule when a green light is on. In this case, lever pressing is on a
multiple schedule
FT and VT are both kinds of ______ reinforcement.
noncontingent. Since the reinforcers are not contingent on a behavior, the term reinforcement may seem inappropriate.
in VR the number of responses required is
not predictable
When behavior is on a FR schedule, animals often discontinue working briefly following reinforcement. These periods are called ________.
post-reinforcement pauses/pre-ratio pauses/between-ratio pauses
Things are going pretty well for George (see item 26) until he jumps from reinforcing every tenth response to reinforcing every 50th response. At this point, the pigeon responds erratically and nearly stops responding entirely. George's pigeon is suffering from _________.
ratio strain
The study of reinforcement schedules suggests that the behavior we call stick-to-itiveness is largely the product of _________.
reinforcement history
One explanation for the PRE implies that the effect is really an illusion. This is the _________.
response unit hypothesis
The reappearance of previously effective behavior during extinction is called ____________.
resurgence
The reappearance, during extinction, of a previously effective beh is called _____.
resurgence
The rule describing the delivery of reinforcement is called a ________of reinforcement.
schedule
The term ________ refers to the pattern and rate of performance produced by a particular reinforcement schedule
schedule effects
If you increase the requirements for reinforcement too quickly you are likely to see evidence of ratio _____.
strain
George trains a pigeon to peck a disk by reinforcing each disk peck. Once the response is learned, George begins to cut back on the reinforcers. At first he reinforces every other response, then every third response, every fifth response, every tenth response, and so on. George is using a procedure called _________.
stretching the ratio
Gradually reducing the frequency of reinforcement is called _________.
stretching the ratio
A chain schedule is most like a _________ schedule.
tandem
Variable interval (VI) schedule
A response is reinforced when a variable amount of time has elapsed since the last reinforcer. Unpredictable, rate of responding is steady.
Schedules of Reinforcement
A rule describing the delivery of reinforcers for a behavior. Depends on number of responses, passage of time, stimuli, and occurrence of other responses
Continuous reinforcement (CRF)
A simple reinforcement schedule in which behavior is reinforced each time it occurs; reinforcement is delivered for every response, and responding is steady. Not practical in the real world: you cannot be rewarded for everything you do right.
Frustration hypothesis
Amsel's hypothesis about the PRE: nonreinforcement of previously reinforced behavior is frustrating. Because individuals on intermittent schedules become frustrated when they are not reinforced, performing while frustrated is itself reinforced.
John wants to teach Cindy, age 5, the alphabet. He plans to reinforce correct performances with praise and small pieces of candy. What sort of schedule should he use?
At first, John should use continuous reinforcement. As Cindy learns more, he should move to an intermittent schedule, such as VI.
The schedule to use if you want to produce the most rapid learning of new behavior is _______.
CRF. Continuous reinforcement
Sequential hypothesis
Capaldi's hypothesis about the PRE: differences in the sequence of cues create the PRE. Each performance is followed by reinforcement or nonreinforcement. In intermittent schedules there is both reinforcement and nonreinforcement, so both become signals for performance.
CRF is synonymous with _________. a. EXT b. FR 1 c. CRT d. FI 1
FR 1
The schedule that is not an intermittent schedule is _________. a. FR 1 b. FR 5 c. VR 1 d. VI 1"
FR 1. Fixed Ratio.
Four main types of reinforcement
FR (fixed ratio), VR (variable ratio), FI (fixed interval), and VI (variable interval). Reinforcement may also be continuous (CRF) or partial (intermittent).
A schedule that does not require the performance of a particular behavior is the _________. a. FT schedule b. FD schedule c. FI schedule d. FR schedule
FT
Of the following, the schedule that most closely resembles noncontingent reinforcement is _________. a. FD b. FT c. FI d. DRL
FT
One difference between FT and FI schedules is that in the _____ schedule, reinforcement is not contingent on a behavior
FT
Matching Law
Herrnstein's law: for two behaviors B1 and B2 with reinforcement schedules r1 and r2, the relative frequency of each behavior equals the relative frequency of reinforcement available: B1/(B1 + B2) = r1/(r1 + r2).
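The matching law can be illustrated numerically. A minimal sketch in Python (the interval values are illustrative examples; on a VI schedule the programmed reinforcement rate is roughly the reciprocal of the interval):

```python
# Minimal sketch of Herrnstein's matching law for two concurrent schedules.

def matching_share(r1, r2):
    """Predicted proportion of responses allocated to option 1:
    B1 / (B1 + B2) = r1 / (r1 + r2)."""
    return r1 / (r1 + r2)

# Concurrent VI 5" VI 15": reinforcement rates ~ 1/5 and 1/15 per second.
share_vi5 = matching_share(1/5, 1/15)
print(share_vi5)  # 0.75 -> about three responses on VI 5" per response on VI 15"
```

With rates 1/5 and 1/15 per second, the predicted share is 0.75, which matches the roughly 3-to-1 allocation described for the concurrent VI 5" VI 15" rat example in this chapter.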
In the days of public telephones, many people regularly checked the coin return after making a call. Explain this behavior.
Looking in the coin return was occasionally reinforced by finding coins, so the behavior was maintained on an intermittent schedule; each visit to the telephone was another occasion on which the behavior might be reinforced again.
Variable ratio (VR) schedule example
On a VR 5 schedule, reinforcement might come after anywhere from 1 to 10 lever presses, but the average is every 5 presses.
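The unpredictability of a VR schedule can be sketched with a short simulation. This is a hypothetical illustration, assuming response requirements drawn uniformly around the mean (real VR programs vary in how requirements are generated):

```python
import random

def simulate_vr(mean_ratio, n_reinforcers, seed=0):
    """Simulate a VR schedule: each reinforcer requires a random number of
    responses drawn uniformly from 1..(2*mean_ratio - 1), which averages
    out to mean_ratio responses per reinforcer."""
    rng = random.Random(seed)
    requirements = [rng.randint(1, 2 * mean_ratio - 1) for _ in range(n_reinforcers)]
    return sum(requirements) / n_reinforcers  # mean responses per reinforcer

# On a VR 5 schedule, any one reinforcer may take 1-9 presses,
# but the long-run average converges on about 5.
print(round(simulate_vr(5, 10_000), 1))
```

No single trial is predictable, which is why responding on VR schedules is steady: the very next response might always be the reinforced one.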
Multiple schedule
Kind of compound reinforcement schedule in which behavior is under the influence of 2+ simple schedules, each associated w/ a particular stimulus. A pigeon is on a FI10" schedule when yellow light is on and VR10 when red light is on - called a MULT FI10" VR10 schedule
Mixed schedule
Kind of compound reinforcement schedule in which behavior is under the influence of 2+ simple schedules. Like a multiple schedule, but with no distinctive stimulus signaling which schedule is in effect.
chain schedule
Kind of compound reinforcement schedule in which reinforcement is delivered only on completion of the last in a series of schedules, and each new schedule is marked by a distinctive event. A pigeon on a CHAIN FR10 FI15" VR20 schedule pecks a red disk and after the 10th peck, the disk turns yellow. After 15sec, the next peck will result in the disk turning green. After an avg of 20 pecks, the pigeon gets reinforcement. What reinforces this? Possibly the stimuli.
tandem schedule
Kind of compound reinforcement schedule in which reinforcement is delivered only on completion of the last in a series of schedules. What reinforces this? Possibly the change in schedule.
cooperative schedule
Kind of compound reinforcement schedule in which reinforcement is dependent on the behavior of 2+ individuals. E.g., two pigeons must peck 20 times in total; or they must peck 20 times in total and each must peck at least 10 times. Typically inefficient in humans - think of the one slacker on group projects.
Progressive ratio (PR) schedule
Kind of progressive reinforcement schedule in which the requirement for a reinforcer increases in a predetermined way, often immediately after a reinforcement. The ratio grows arithmetically or geometrically, or the reinforcer gets smaller or decreases in quality. This continues until the break point.
Intermittent schedule
Kind of simple reinforcement schedule in which behavior is reinforced on some occasions, but not others.
Fixed ratio (FR) schedule
Kind of simple reinforcement schedule in which behavior is reinforced when it occurs a fixed number of times. Predictable: on FR 5, every fifth response is reinforced. FR 1 amounts to continuous reinforcement; larger ratios are partial (intermittent).
Fixed duration (FD) schedule
Kind of simple reinforcement schedule in which reinforcement is contingent on the continuous performance of the behavior for some period of time. For ex, a child practices piano for 30 min and gets a reward.
noncontingent reinforcement (NCR) schedule
Kind of simple reinforcement schedule in which reinforcers are not contingent on behavior. Includes fixed time and variable time.
Fixed time (FT) schedule
Kind of simple reinforcement schedule in which the reinforcer is delivered after a given period of time regardless of behavior. Uncommon outside of laboratory, but some might make a case that welfare programs are similar.
Variable time (VT) schedule
Kind of simple reinforcement schedule in which the reinforcer is delivered periodically at irregular intervals around some avg regardless of what behavior occurs.
Variable duration (VD) schedule
Kind of simple reinforcement schedule in which the required performance to receive reinforcement varies around some avg. For ex, a child has to practice piano for 45, 10, 35, or 30 min - an avg of 30 min - to get the reward.
Progressive schedule
Kind of simple reinforcement schedule in which the requirements for reinforcement increase systematically. Complicated for a simple reinforcement schedule.
Response unit hypothesis
Mowrer and Jones's hypothesis about the PRE: in intermittent schedules, such as an FR 2 schedule for reinforcing lever pressing, the behavior that is reinforced is actually a unit of two lever presses, so the PRE is an illusion. In the rat lever-pressing experiment: CRF: 128 presses before extinction. FR 2: 94 units (188 presses / 2). FR 3: 71.8 units (215.5 / 3). FR 4: 68.1 units (272.3 / 4).
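The response-unit arithmetic can be checked directly: dividing total presses during extinction by the number of presses per reinforced unit yields roughly similar unit counts across schedules, which is the hypothesis's point.

```python
# Recompute the response-unit counts from the Mowrer and Jones figures:
# (total presses during extinction, presses per reinforced unit).
data = {"FR 2": (188, 2), "FR 3": (215.5, 3), "FR 4": (272.3, 4)}

for schedule, (presses, unit) in data.items():
    units = presses / unit
    print(schedule, round(units, 1))
# FR 2 -> 94.0, FR 3 -> 71.8, FR 4 -> 68.1
# Counted in response units rather than raw presses, the apparent PRE shrinks.
```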
Is gambling a form of superstitious behavior?
No, because gambling can result in reinforcement.
resurgence
Reappearance, during extinction, of a previously reinforced behavior. This may explain regression.
The rate at which a response occurs, once the subject begins performing it, is called the _________.
Run Rate
Schedule effects
The distinctive rate and pattern of behavior associated with a particular reinforcement schedule. Important because these changes in behavior are themselves a form of learning.
Mary complains that her dog jumps on her when she gets home from school. You explain that she reinforces this beh by petting and talking to the dog when it jumps up, but Mary replies that you must be wrong because she hardly ever does this. How do you respond?
The dog is being reinforced on an intermittent schedule, which makes the behavior harder to extinguish; even occasional petting and attention is enough to keep the dog jumping.
What is the matching law?
Given a choice of activities, the relative rate of each behavior matches the relative rate of reinforcement for it.
A teacher reinforces longer and longer periods of quiet behavior in her students. How can she avoid creating ratio strain?
The teacher can start out with small amounts of quiet behavior, and gradually increase it - perhaps by five minutes each class.
partial reinforcement effect (PRE)
The tendency of a behavior that has been maintained on an intermittent schedule to be more resistant to extinction than behavior on CRF. The thinner the reinforcement schedule, the greater the number of responses before extinction.
Your text reports the case of a man who apparently made hundreds of harassing phone calls. The man's behavior was most likely on a(n) _________.
VR schedule
The thinner of two schedules, FR3 and VR4, is ____.
VR4
Harry spent his summer in the city panhandling. Every day he would sit on the sidewalk, put a cardboard sign in front of him that said, "Please help," and place his hat on the sidewalk upside down. Then he would wait. Every now and then someone would put money into his hat. Harry's reinforcement schedule is best described as a _________.
VT, variable time schedule
The tendencies to respond eventually correspond to the probabilities of ______.
reinforcement
How might you use what you know about reinforcement schedules to study the effects of the presence of observers on human performance?
When a person's performance is observed, they may behave in a certain way in an attempt to get a reward because previously, observers have provided reinforcement for behaviors.
break point
The point at which a behavior stops or its rate falls off sharply (e.g., on a progressive ratio schedule).
How could you "stretch the ratio" when a beh is on an interval schedule?
You could do something analogous to stretching the ratio by "stretching the interval."
When reinforcement is contingent on continuous performance of an activity, a _________ reinforcement schedule is in force.
duration
A given reinforcement schedule tends to produce a distinctive pattern and rate of performance. These are called schedule _______.
effects
Williams found that the greater the number of reinforcements before extinction, the _______.
greater the number of responses during extinction
ratio run
high and steady rate of responding that completes each ratio requirement
In VR schedules a post-reinforcement pause can appear, but
it is generally smaller than in FR schedules
The _____ law means that, given a choice of activities, the proportion of responses to each activity will reflect the availability of reinforcement for each.
matching
The difference between multiple and mixed schedules is that in _____ schedules there is a signal indicating which schedule is in effect.
multiple
Extinction is the opposite of _____.
reinforcement
PRE stands for ___________
partial reinforcement effect
The more work required for each reinforcement, the longer the
post reinforcement pause.
Post-reinforcement pauses are now often referred to as _________.
pre-ratio pauses
Thinning a reinforcement schedule too rapidly or too much can produce _______.
ratio strain
In a(n) ____ schedule, reinforcement is contingent on the number of times a beh occurs; in a(n) _____ schedule, reinforcement is contingent on the beh occurring after a given period since the last reinforcement.
ratio, interval
Resurgence may help account for _______.
regression
partial (intermittent) reinforcement
reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement. On an FR 10 schedule, for example, every 10th response is rewarded.
In VR there is a steadier state of
responding than in FR schedules
Of the four explanations of the PRE, the one that essentially says there is no such thing is the _______ hypothesis.
response unit
The rate at which a beh occurs once it has begun is called the ____ rate.
run
VR reinforcement is dependent on
the number of responses
A pigeon is confronted with two disks, one green, the other red. The bird receives food on a VI 20" schedule when it pecks the green disk, and on a VI 10" schedule when it pecks the red one. You predict that the bird will peck _________.
the red disk about twice as often as the green disk
In FT and VT schedules, reinforcement is contingent on ____ rather than _____.
time, behavior
In schedules research, VD stands for ________.
variable duration
Bill spends his summer in the city panhandling. Every day he takes a position on a busy corner and accosts passersby saying, "Can you spare some change?" Most people ignore him, but every now and then someone gives him money. Bill's reinforcement schedule is best described as a _________.
variable ratio schedule