Psych chapter three test
ratio strain, aka burnout.
"stretching the ratio"—moving from a low ratio requirement (a dense schedule) to a high ratio requirement (a lean schedule)—should be done gradually. For example, once lever pressing is well established on a CRF schedule, the requirement can be gradually increased to FR 2, FR 5, FR 10, and so on. If the requirement is increased too quickly—for example, CRF to FR 2 and then a sudden jump to FR 20—the rat's behavior may become erratic and even die out altogether. Likewise, if you try to raise the requirement too high—say, to FR 2000—there may be a similar breakdown in the rat's behavior. Ratio strain is what most people would refer to as burnout, and it can be a big problem for students faced with a heavy workload.
A schedule in which 15 responses are required for each reinforcer is abbreviated___ .
FR 15
backward chaining def
An efficient way to establish responding on a chained schedule is to train the final link first and the initial link last. Using the pigeon example, the pigeon would first be trained to respond on the red key to obtain food. This will establish the red key as a secondary reinforcer through its association with food. As a result, the presentation of the red key can then be used to reinforce responding on the green key. Once this is established, the presentation of the green key can be used to reinforce responding on the white key. In these examples, each link in the chain required the same type of behavior, namely key pecking. It is also possible to create behavior chains in which each link consists of a different behavior.
An FR 1 schedule of reinforcement can also be called a
CRF) continuous reinforcement schedule
On a c________ reinforcement schedule (abbreviated ____), each response is reinforced, whereas on an i________ reinforcement schedule, only some responses are reinforced. The latter is also called a p________ reinforcement schedule.
continuous (CRF); intermittent; partial
Each time you flick the light switch, the light comes on. The behavior of flicking the light switch is on a(n)_____ schedule of reinforcement.
CRF/ continuous
list the Schedules of Reinforcement
continuous reinforcement, intermittent (partial) reinforcement, fixed ratio, variable ratio, fixed interval, and variable interval schedules
On a video game, the faster you destroy all the targets, the more bonus points you obtain. This is an example of d________ reinforcement of ________ behavior (abbreviated ____).
DRH; differential reinforcement of high rates of responding.
In practicing the slow-motion form of exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of d________ reinforcement of ________ behavior (abbreviated ____).
DRL; differential reinforcement of low rates.
On a (use the abbreviation)_______ schedule, a minimum amount of time must pass between each response before the reinforcer will be delivered. On a ____________schedule, reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time. On a______________ schedule, reinforcement is contingent on emitting a series of responses at a specific rate.
DRL; DRH; DRP. Differential reinforcement of low rates (DRL); differential reinforcement of high rates (DRH); differential reinforcement of paced responding (DRP).
Frank discovers that his golf shots are much more accurate when he swings the club with a nice, even rhythm that is neither too fast nor too slow. This is an example of d________ reinforcement of ________ behavior (abbreviated ____).
DRP; differential reinforcement of paced responding.
Ahmed's daily routine consists of swimming without rest for 30 minutes, following which he takes a break. This most closely resembles a(n) ______ schedule of reinforcement.
FD (fixed duration)
________ often produce a "_______________" (upwardly curved) pattern of responding, consisting of a P________ P__________ followed by a gradually increasing rate of response as the interval draws to a close.
FI schedules; scalloped; postreinforcement pause.
Eddy finds that he has to thump his old television set exactly twice before the picture will clear up. His behavior of thumping the television set is on a (be specific and use the abbreviation) ______ schedule of reinforcement.
FR 2
A mother finds that she always has to make the same request three times before her child complies. The mother's behavior of making requests is on an______ schedule of reinforcement.
FR 3 (a fixed ratio schedule)
Every morning at 7:00 a.m. a robin perches outside Marilyn's bedroom window and begins singing. Given that Marilyn very much enjoys the robin's song, this is an example of a ______ 24-hour schedule of reinforcement (abbreviated ____). What schedule is this an example of?
a fixed time schedule (FT 24-hour)
example of a Variable ratio schedule
For example, (___ 5) schedule, a rat has to emit an average of 5 lever presses for each food pellet, with the number of lever responses on any particular trial varying between, say, 1 and 10. Thus, the number of required lever presses might be 3 for the first pellet, 6 for the second pellet, 1 for the third pellet, 7 for the fourth pellet, and so on, with the overall average being 5 lever presses for each reinforcer. Similarly, on a ___50 schedule, the number of required lever presses may vary between 1 and 100, with the average being 50.
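The averaging idea behind a VR 5 schedule can be sketched in a few lines of Python (an illustrative sketch, not from the textbook; the function name and the uniform draw from 1 to 9 are assumptions, just one simple way to average 5):

```python
import random

def vr_requirements(mean, trials, rng):
    # Draw a response requirement for each reinforcer, averaging `mean`.
    # A uniform draw over 1..(2*mean - 1) is one simple way to get that average.
    return [rng.randint(1, 2 * mean - 1) for _ in range(trials)]

rng = random.Random(0)  # fixed seed so the sketch is reproducible
reqs = vr_requirements(5, 1000, rng)
print(sum(reqs) / len(reqs))  # comes out close to 5, the VR 5 average
```

The requirement on any one trial is unpredictable (anywhere from 1 to 9 presses here), but over many reinforcers the average works out to the schedule's stated value.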
examples of Differential reinforcement of paced responding
For example, a rat might receive a food pellet if it emits 10 consecutive responses, with each response separated by an interval of no less than 1.5 and no more than 2.5 seconds. Similarly, musical activities, such as playing in a band or dancing to music, require that the relevant actions be performed at a specific pace. People who are very good at this are said to have a good sense of timing or rhythm. Further examples of_____ schedules can be found in noncompetitive swimming or running.
example of variable time schedule
For example, on a variable time 30-second (VT 30-sec) schedule, a pigeon receives access to food after an average interval of 30 seconds, with the actual interval on any particular trial ranging from, say, 1 second to 60 seconds. Similarly, you may coincidentally run into an old high school chum about every 3 months on average (a VT 3-month schedule). VT schedules therefore involve the delivery of a free reinforcer following an unpredictable period of time.
Example of intermittent reinforcement schedule.
For example, perhaps only some of the rat's lever presses result in a food pellet, and perhaps only occasionally did your mother give you a cookie when you asked for one.
"stretching the ratio"—moving from a low ratio requirement (a dense schedule) to a high ratio requirement (a lean schedule)—should be done how, and what is an example?
Gradually/ once lever pressing is well established on a CRF schedule, the requirement can be gradually increased to FR 2, FR 5, FR 10, and so on. If the requirement is increased too quickly—for example, CRF to FR 2 and then a sudden jump to FR 20—the rat's behavior may become erratic and even die out altogether.
Herrnstein (1966) noted that superstitious behaviors can sometimes develop as by-products of contingent reinforcement for some other behavior. For example, a businessman might believe it is important to impress customers with a firm handshake—when in fact it is merely the handshake, and not the firmness of the handshake, that is the critical factor. (Unfortunately, such a superstition could have serious consequences if the businessman then attempts to branch out into the Asian market, where a firm handshake is often regarded as a sign of disrespect.)
Example of a Variable interval schedule of reinforcement.
If you need to contact your professor with a last minute question about an assignment and know that she always arrives in her office sometime between 8:00 and 8:30 a.m., a good strategy would be to phone every few minutes throughout that time period. By doing so, you will almost certainly contact her within a few minutes of her arrival.
Just as people on welfare sometimes become less inclined to look for work, the pigeon that receives free reinforcers will work less vigorously for the contingent reinforcers.
ratio strain example
Likewise, if you try to raise the requirement too high—say, from FR 20 to FR 2000—there may be a similar breakdown in the rat's behavior. Such breakdowns in behavior are technically known as ratio strain. Some students, especially those who have a history of getting by with minimal work, may find it increasingly difficult to study under such circumstances and may even choose to drop out of college.
Who (1987) placed students in a booth that contained three levers and a counter, and who did the pigeon experiment?
Ono and Skinner, respectively. Ono's students pulled levers for points even though the points were actually delivered independently of their behavior, and Skinner's pigeons got free food regardless of what they were doing; both procedures produced superstitious behavior.
Shawna often goes for a walk through the woods, but she rarely does yardwork. According to the_______, walking through the woods could be used as a_______ for yardwork.
Premack principle; reinforcer
A very dense schedule of reinforcement can also be referred to as a very (rich/lean?) schedule.
Rich.
Skinner's (1948b) original demonstration of superstitious behavior involved the use of a fixed time schedule.
The real world is filled with examples of VR schedules.
Some predatory behaviors, such as that shown by cheetahs, have a strong ___ component in that only some attempts at chasing down prey are successful. In humans, only some acts of politeness receive an acknowledgment, only some residents who are called upon by canvassers will make a contribution, and only some CDs that we buy are enjoyable. Many sports activities, such as shooting baskets in basketball and shots on goal in hockey, are also reinforced largely on a ____ schedule.
Premack principle.
The notion that a high-probability behavior can be used to reinforce a low-probability behavior.
examples of DRH
The term differential reinforcement means simply that one type of response is reinforced while another is not. In a ______ schedule, reinforcement is provided for a high rate of response and not for a low rate. For example, a rat might receive a food pellet only if it emits at least 30 lever presses within a period of a minute. Similarly, a worker on an assembly line may be told that she can keep her job only if she assembles a minimum of 20 carburetors per hour. By requiring so many responses in a short period of time, ______ schedules ensure a high rate of responding. Athletic events such as running and swimming are prime examples of _________ schedules in that winning is directly contingent on a rapid series of responses.
As Tessa sits quietly, her mother occasionally gives her a hug as a reward. This is an example of ________
a VD (variable duration) schedule of reinforcement
On a (VD/VI) schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an (FI/FD) schedule, reinforcement is contingent upon the first response after a fixed period of time.
VD; FI
For farmers, rainfall is an example of a noncontingent reinforcer that is typically delivered on a_______schedule (abbreviated )
VT variable time schedule
Gambling is often maintained by a __________ schedule of reinforcement.
Variable ratio
In fact, the behavior of a gambler playing a slot machine is the classic example of human behavior controlled by what kind of schedule? Aberrant social behavior may also be accounted for by ____ schedules, as in the development of an abusive relationship.
Variable ratio schedule.
What happens if a noncontingent schedule of reinforcement is superimposed on a regular, contingent schedule of reinforcement? What if, for example, a pigeon responding on a VI schedule of food reinforcement also receives extra reinforcers for free? Will the pigeon's rate of response on the VI schedule increase or decrease? In fact, the pigeon's rate of response on the response-dependent schedule will decrease. One study, conducted several years ago, found that major league pitchers who had signed long-term contracts showed a significant decline in number of innings pitched relative to pitchers who only signed a 1-year contract. (The decrease in behavior is the main takeaway.)
When does behavior on an FI schedule not follow the scalloped pattern? (breakdown example)
Would the behavior of trying to phone a business that opens in 30 minutes also follow a scalloped pattern? If we have a watch available, it probably would not. We would simply look at our watch to determine when the 30 minutes have elapsed and then make our phone call. The indicated time would be a discriminative stimulus (SD) for when the reinforcer is available (i.e., the business is open), and we would wait until the appropriate time before phoning. But what about the behavior of looking at your watch during the 30 minutes (the reinforcer for which would be noticing that the interval has elapsed)? You are unlikely to spend much time looking at your watch at the start of the interval. As time progresses, however, you will begin looking at it more and more frequently. In other words, your behavior will follow the typical scalloped pattern of responding.
S________ e________ are the different effects on behavior produced by different response requirements. These are the stable patterns of behavior that emerge once the organism has had sufficient exposure to the schedule. Such stable patterns are known as st________-st________ behaviors.
schedule effects; steady-state
ratio strain aka burnout.
a disruption in responding due to an overly demanding response requirement.
example of an FT (fixed time) schedule
a fixed time 30-second (FT 30-sec) schedule, a pigeon receives access to food every 30 seconds regardless of its behavior.
differential reinforcement of low rates (DRL), def
a minimum amount of time must pass between each response before the reinforcer will be delivered—or, more generally, reinforcement is provided for responding at a slow rate.
VI schedules usually produce a (moderate/slow?) (steady/erratic?) rate of response, often with (a lot of/little or no?) postreinforcement pause.
a moderate, steady rate, with little or no postreinforcement pause
Human examples of DRL schedules
a person is required to perform an action slowly. For example, a parent might praise a child for brushing her teeth slowly or completing her homework slowly, given that going too fast generally results in sloppy performance. Once the quality of performance improves, reinforcement can then be made contingent on responding at a normal speed.
Example of a DRL (response rate) schedule, and how it differs from an FI schedule.
a rat might receive a food pellet only if it waits at least 10 seconds between lever presses. So how is this different from an FI 10-sec schedule? Remember that on an FI schedule, responses that occur during the interval have no effect; on a DRL schedule, however, responses that occur during the interval do have an effect—an adverse effect in that they prevent reinforcement from occurring. In other words, responding during the interval must not occur in order for a response following the interval to produce a reinforcer.
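The FI-versus-DRL contrast above can be made concrete with a short Python sketch (illustrative only; the function names and timestamps are made up):

```python
def fi_reinforced(response_times, interval=10):
    """FI: the first response at least `interval` s after the last
    reinforcer is reinforced; earlier responses have no effect."""
    reinforced, last_sr = [], 0.0
    for t in response_times:
        if t - last_sr >= interval:
            reinforced.append(t)
            last_sr = t  # interval restarts only when a reinforcer is earned
    return reinforced

def drl_reinforced(response_times, interval=10):
    """DRL: a response is reinforced only if at least `interval` s have
    passed since the PREVIOUS response; early responses reset the clock."""
    reinforced, last_resp = [], None
    for t in response_times:
        if last_resp is not None and t - last_resp >= interval:
            reinforced.append(t)
        last_resp = t  # every response, early or not, restarts the timer
    return reinforced

times = [2, 5, 12, 25]
print(fi_reinforced(times))   # [12, 25]: early responses are simply ignored
print(drl_reinforced(times))  # [25]: responses at 5 and 12 reset the clock
```

With the same four responses, the FI schedule ignores the premature responses, while the DRL schedule penalizes them by restarting the timer, which is exactly the adverse effect described above.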
example of a Fixed interval
a rat on an ____ 30-sec schedule will likely emit no lever presses at the start of the 30-second interval. This will be followed by a few tentative lever presses perhaps midway through the interval, with a gradually increasing rate of response thereafter. By the time the interval draws to a close and the reinforcer is imminent, the rat will be emitting a high rate of response, with the result that the reinforcer will be attained as soon as it becomes available.
A chained schedule consists of ____________
a sequence of two or more simple schedules, each of which has its own SD and the last of which results in a terminal reinforcer. In other words, the person or animal must work through a series of component schedules to obtain the sought-after reinforcer. Nevertheless, responding tends to be somewhat weaker in the earlier links of a chain than in the later links.
Russ is so impressed with how quickly his betta learned to swim in a circle that he keeps doubling the number of circles it has to perform in order to receive a reinforcer. This is an example of an ___________schedule of reinforcement (one that is particularly likely to suffer from ____ , ______
adjusting; ratio strain
To the extent that a gymnast is trying to improve his performance, he is likely on a(n)_____ schedule of reinforcement; to the extent that his performance is judged according to both the form and quickness of his moves, he is on a(n)______ schedule.
adjusting schedule; conjunctive schedule
example of a VD schedule
behavior must be performed continuously for a varying, unpredictable period of time. For example, the rat must run in the wheel for an average of 60 seconds to earn one pellet of food, with the required time varying between 1 second and 120 seconds on any particular trial
Anna ideally likes to exercise for 1 hour each morning, followed by a 30-minute sauna, in turn followed by a half hour of drinking coffee and reading the newspaper. Unfortunately, due to other commitments, she actually spends 45 minutes exercising, followed by a 15-minute sauna, and a half hour drinking coffee and reading the paper. According to the _________approach, Anna's ideal schedule provides the________ amount of overall reinforcement that can be obtained from those activities. Her actual distribution of behavior represents her attempt to draw as near to the_____ point as possible for these activities.
behavioral bliss point; optimal (maximum); bliss
The typical FR pattern is sometimes called a b______ -and-r _______pattern, with a pause that is followed immediately by a (high/ low) rate of response.
break-and-run; high
Thus, Abraham Maslow (1971), another famous humanistic psychologist, argued what
child rearing should be neither too restrictive nor too lenient, which in behavioral terms can be taken to imply that the social reinforcement children receive should be neither excessively contingent nor excessively noncontingent
Herrnstein (1966) noted that superstitious behaviors can sometimes develop as a by-product of c reinforcement for some other behavior.
contingent reinforcement
A child who is often hugged during the course of the day, regardless of what he is doing, is in humanistic terms receiving unconditional positive regard. In behavioral terms, he is receiving a form of non____________ social reinforcement. As a result, this child may be (more/less) likely to act out in order to receive attention.
noncontingent; less
On a fixed ratio (FR) schedule, reinforcement is contingent upon what? Give examples. What rate of response do FR schedules produce, and what is each pause usually followed by?
contingent upon a fixed, predictable number of responses. On an FR 5 schedule, a rat has to press the lever 5 times to obtain a food pellet. On an FR 50 schedule, it has to press the lever 50 times to obtain a food pellet. Similarly, earning a dollar for every 10 carburetors assembled on an assembly line is an example of an FR 10 schedule. FR schedules generally produce a high rate of response along with a short pause following the attainment of each reinforcer (see Figure 7.1). This short pause is known as a postreinforcement pause. Each pause is usually followed by a relatively quick return to a high rate of response, so the typical FR pattern is described as a "break-and-run" pattern—a short break followed by a steady run of responses. Starting a task is often the most important step in overcoming procrastination; once you start, the work often flows naturally. In general, higher ratio requirements produce longer postreinforcement pauses; you will probably take a longer break after completing a long assignment than after completing a short one.
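The FR contingency can be sketched in a few lines of Python (an illustrative sketch, not from the textbook; the class name is made up):

```python
class FixedRatio:
    """Minimal sketch of a fixed ratio schedule: a reinforcer is
    delivered after every n responses."""
    def __init__(self, n):
        self.n = n          # response requirement (e.g., 5 for FR 5)
        self.count = 0      # responses emitted since the last reinforcer

    def respond(self):
        """Record one response; return True if it earns a reinforcer."""
        self.count += 1
        if self.count >= self.n:
            self.count = 0  # requirement met: reset for the next ratio
            return True
        return False

fr5 = FixedRatio(5)
results = [fr5.respond() for _ in range(10)]
# Only every 5th response is reinforced:
print(results)  # [False, False, False, False, True, False, False, False, False, True]
```

Only every fifth response produces a reinforcer, which is exactly what defines the FR 5 requirement.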
If a dog receives a treat each time it begs for one, its begging is being maintained on a(n)__________ schedule of reinforcement. If it only sometimes receives a treat when it begs for one, its begging is being maintained on a(n)______________ schedule of reinforcement.
continuous (or FR1 or CRF); intermittent
On a c__________ reinforcement schedule (abbreviated ____), each response is reinforced, whereas on an i________ reinforcement schedule, only some responses are reinforced. The latter is also called a p________ reinforcement schedule.
continuous (CRF); intermittent; partial
During the time that a rat is responding for food on a VR 100 schedule, we begin delivering additional food on a VT 60-second schedule. As a result, the rate of response on the VR schedule is likely to (increase/decrease/remain unchanged) .
decrease.
Schedules in which the reinforcer is easily obtained are said to be very__________ while schedules in which the reinforcer is difficult to obtain are said to be________.
dense or rich / very lean. Thus, an FR 5 schedule is considered a very dense schedule of reinforcement compared to an FR 100. During a 1-hour session, a rat can earn many more food pellets on an FR 5 schedule than it can on an FR 100. Likewise, an assembly line worker who earns a dollar for each carburetor assembled (a CRF schedule) is able to earn considerably more during an 8-hour shift than is a worker who earns a dollar for every 10 carburetors assembled (an FR 10 schedule).
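The dense-versus-lean comparison is simple arithmetic; a tiny Python sketch (illustrative numbers, not from the textbook):

```python
# Assume a steady 30 lever presses per minute across a 1-hour session.
presses_per_hour = 30 * 60
pellets = {ratio: presses_per_hour // ratio for ratio in (5, 100)}
print(pellets)  # {5: 360, 100: 18}: far more reinforcers on the dense FR 5
```

At the same response rate, the dense FR 5 schedule pays off twenty times as often as the lean FR 100.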
An FR 12 schedule of reinforcement is (denser/leaner) than an FR 75 schedule.
denser
The abbreviation DRL refers to d________ reinforcement of l____ rate behavior.
differential reinforcement of low rate behavior
steady-state behaviors,
each one along with the characteristic response pattern produced by each. Note that this characteristic response pattern is the stable pattern that emerges once the organism has had considerable exposure to the schedule. Such stable patterns are known as _____________
continuous reinforcement schedule (when is it most useful? what is an example?)
each specified response is reinforced. For example, each time a rat presses the lever, it obtains a food pellet; each time the dog rolls over on command, it gets a treat; and each time Karen turns the ignition in her car, the motor starts. Continuous reinforcement (CRF) is very useful when a behavior is first being shaped or strengthened, i.e., when using a shaping procedure.
Example of continuous reinforcement
each time a rat presses the lever, it obtains a food pellet; each time the dog rolls over on command, it gets a treat; and each time Karen turns the ignition in her car, the motor starts. Continuous reinforcement is abbreviated CRF. Similarly, if we wish to encourage a child to always brush her teeth before bed, we would do well to initially praise her each time she does so.
example of a fixed ratio ( FR) Schedule
earning a dollar for every 10 carburetors assembled on an assembly line is an example of an FR 10 schedule, while earning a dollar for each carburetor assembled is an example of an FR 1 schedule. Note that an FR 1 schedule is the same as a CRF (continuous reinforcement) schedule in which each response is reinforced (thus, such a schedule can be correctly labeled as either an FR 1 or a CRF).
Graduate students often have to complete an enormous amount of work in the initial year of their program. For some students, the workload involved is far beyond anything they have previously encountered. As a result, their study behavior may become increasingly (erratic/stereotyped) throughout the year, a process known as _______________,_________________
erratic; ratio strain
In the bus example, I will probably engage in (few/frequent) glances at the start of the interval, followed by a gradually (increasing/decreasing) rate of glancing as time passes.
few then increasing.
In general,______ schedules produce postreinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite (distant/close) .
fixed; distant
On a(n) ______ schedule, a response cannot be reinforced until 20 seconds have elapsed since the last reinforcer. (A) VI 20-sec, (B) VT 20-sec, (C) FT 20-sec, (D) FI 20-sec, (E) none of the preceding.
(D) FI 20-sec (fixed interval 20-second)
Postreinforcement pauses are most likely to occur on which two types of simple intermittent schedules?
fixed interval and fixed ratio
example of what? If I have just missed the bus when I get to the bus stop, I know that I have to wait 15 minutes for the next one to come along. Given that it is absolutely freezing out, I snuggle into my parka as best I can and grimly wait out the interval. Every once in a while, though, I emerge from my cocoon to take a quick glance down the street to see if the bus is coming. My behavior of looking for the bus is on a(n) _______ (use the abbreviation) schedule of reinforcement.
FI (fixed interval)
On a_______ schedule (abbreviated ), reinforcement is contingent upon the first response after a fixed period of time. This pro- duces a______ pattern of responding.
fixed interval; FI; scalloped
What are the four basic (simple) types of intermittent schedules?
fixed ratio, variable ratio, fixed interval, and variable interval.
On a ______ schedule (abbreviated ____), reinforcement is contingent upon a fixed, predictable number of responses. This produces a ______ rate of response often accompanied by a ______.
fixed ratio; FR; high; postreinforcement pause
A__________ schedule generally produces a high rate of response with a short pause following the attainment of each reinforcer. In general, the higher the requirement, the (longer/shorter) the pause
fixed ratio; longer
On a ______ schedule (abbreviated ____), the reinforcer is delivered following a fixed interval of time, regardless of the organism's behavior.
fixed time; FT
"stretching the ratio"—moving from a low ratio requirement (a dense schedule) to a high ratio requirement (a lean schedule)—should be done how, and what happens if it is done improperly?
Gradually. Once lever pressing is well established on a CRF schedule, the requirement can be gradually increased to FR 2, FR 5, FR 10, and so on. If the requirement is increased too quickly—for example, CRF to FR 2 and then a sudden jump to FR 20—the rat's behavior may become erratic and even die out altogether. Likewise, if you try to raise the requirement too high—say, to FR 2000—there may be a similar breakdown in the rat's behavior.
Do VR schedules generally produce a (low and erratic / high and steady) rate of response, with (little or no / a lot of) postreinforcement pause?
high and steady, with little or no pause
A variable ratio schedule typically produces a (high/low) rate of behavior (with/without) a postreinforcement pause.
high/ without.
Different response requirements have different effects on behavior. For example, ratio schedules (FR and VR) tend to produce (higher/lower) rates of behavior than interval schedules (VI and FI). Likewise, fixed (FI and FR) schedules tend to produce __________________ whereas variable (VI and VR) schedules often do not. Such differences in response patterns are known as _____________.
higher; postreinforcement pauses; schedule effects
chained schedule differs from a conjunctive schedule how?
in that the two component schedules must be completed in a particular order, which is not required in a conjunctive schedule.
adjunctive behaviors.
innate tendencies, almost like fidgeting behaviors, that are often elicited during a period of waiting (Staddon & Simmelhag, 1971). Other behaviorists proposed this as a counter to Skinner's account of superstitious behavior.
On_______ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has (little/considerable) effect on how quickly the reinforcer is obtained.
interval/ little
conjunctive schedule
is a type of complex schedule in which the requirements of two or more simple schedules must be met before a reinforcer is delivered. The wages you earn on a job are contingent upon working a certain number of hours each week and doing a sufficient amount of work so that you will not be fired.
goal gradient effect. def
is an increase in the strength and/or efficiency of responding as one draws near to the goal. example rats running through a maze to obtain food tend to run faster and make fewer wrong turns as they near the goal box (Hull, 1932). Similarly, a student writing an essay is likely to take shorter breaks and work more intensely as she nears the end.
Variable Ratio Schedules
On a variable ratio (VR) schedule, reinforcement is contingent upon a varying, unpredictable number of responses.
differential reinforcement of paced responding (DRP), def
Reinforcement is contingent upon emitting a series of responses at a set rate—or, more generally, reinforcement is provided for responding neither too fast nor too slow.
continuous reinforcement schedule def
is one in which each specified response is reinforced.
intermittent (or partial) reinforcement schedule
is one in which only some responses are reinforced. For example, perhaps only some of the rat's lever presses result in a food pellet, and perhaps only occasionally did your mother give you a cookie when you asked for one. Examples in everyday life: not all concerts we attend are enjoyable, not every person we invite out on a date accepts, and not every date that we go out on leads to an enjoyable evening.
schedule of reinforcement
is the response requirement that must be met to obtain reinforcement. ( indicates what exactly has to be done for the reinforcer to be delivered. )
In many mixed martial arts matches, each fighter typically receives a guaranteed purse, regardless of the outcome. In the Ultimate Fighter series, the winner of the final match is awarded a major contract in the UFC while the loser receives nothing. As a result, Dana is not surprised when he notices fighters in the former event (more/less) often fighting to the point of complete exhaustion, since the monetary reinforcer tied to the match is (contingent/not contingent) upon winning the match. This is also an example of what?
less / not contingent (this illustrates what happens when a noncontingent schedule of reinforcement is superimposed on a regular, contingent schedule of reinforcement)
In general, do higher ratio requirements produce longer or shorter pauses?
Longer: higher ratio requirements produce longer pauses, and lower requirements produce shorter ones. With very low ratios there may be little or no pause at all. In such cases, the next reinforcer is so close (only a few lever presses away) that the rat is tempted to immediately go back to work.
Variable ratio schedules help to account for the persistence with which some people display what type of behaviors?
maladaptive behaviors Gambling is a prime example in this regard: The unpredictable nature of these activities results in a very high rate of behavior.
noncontingent reinforcement may account for what kind of behavior?
It may account for some forms of superstitious behavior. Skinner's pigeons developed odd, idiosyncratic behaviors when given free (response-independent) reinforcers. Skinner believed these behaviors developed because they had been accidentally reinforced by the coincidental presentation of food.
On a schedule, a response must not occur until 20 seconds have elapsed since the last reinforcer. (A) VI 20-sec, (B) VT 20-sec, (C) FT 20-sec, (D) FI 20-sec, (E) none of the preceding.
(E) none of the preceding (a response that must *not* occur until 20 seconds have elapsed describes a DRL 20-sec schedule, which is not listed)
positives and drawbacks of noncontingent reinforcement. At this point, you might be thinking that noncontingent reinforcement is all bad, given that it leads to superstitious behavior in some situations and to poor performance in others. In fact, noncontingent reinforcement is sometimes quite beneficial. More specifically, it can be an effective means of reducing the frequency of maladaptive behaviors. For example, children who act out often do so to obtain attention. If, however, they are given a sufficient amount of attention on a noncontingent basis, they will no longer have to act out to obtain it. Noncontingent reinforcement has even been shown to reduce the frequency of self-injurious behavior. Such behavior, which can consist of head-banging or biting chunks of flesh out of one's arm, is sometimes displayed by people who suffer from retardation or autism; it can be notoriously difficult to treat.
VI schedules produce what?
predictable response rates, as well as predictable rates of reinforcement
proper child rearing requires healthy doses of both noncontingent reinforcement, which gives the child a secure base from which to explore the world and take risks, and contingent reinforcement, which helps to shape the child's behavior in appropriate ways, maximize skill development, and prevent the development of passivity. (T)
As with an FR schedule, an extremely lean VR schedule can result in______________
ratio strain.
in general, (ratio/interval) schedules tend to produce a high rate of response. This is because the reinforcer in such schedules is entirely r_____ contingent, meaning that the rapidity with which responses are emitted (does / does not) greatly affect how soon the reinforcer is obtained.
ratio / response / does
A pigeon pecks a green key on a VR 9 schedule, then a red key on an FI 20-sec, following which it receives food. The reinforcer for pecking the green key is the presentation of the_________ , which is a_________ reinforcer.
red key; secondary
unconditional positive regard: what does it mean, and who coined the term?
refers to the love, respect, and acceptance that one receives from significant others, regardless of one's behavior. Rogers assumed that such regard is a necessary precondition for the development of a healthy personality. The beneficial effects of noncontingent reinforcement can be seen as providing empirical support for the value of unconditional positive regard. Carl Rogers.
fixed ratio (FR) schedule, what is reinforcement contingent upon?
reinforcement is contingent upon a fixed, predictable number of responses.
variable ratio (VR) schedule is reliant on what/ def:
reinforcement is contingent upon a varying, unpredictable number of responses.
differential reinforcement of high rates (DRH), reinforcement is contingent upon what? and DEF.
reinforcement is contingent upon emitting AT LEAST a certain number of responses in a certain period of time—or, more generally, reinforcement is provided for responding at a fast rate. Differential reinforcement means simply that one type of response is reinforced while another is not.
differential reinforcement of paced responding (DRP),
reinforcement is contingent upon emitting a series of responses at a set rate—or, more generally, reinforcement is provided for responding neither too fast nor too slow.
fixed interval (FI) schedule, DEF what is reinforcement contingent upon.
reinforcement is contingent upon the first response after a fixed, predictable period of time. For a rat on a __________ 30-second (____ 30-sec) schedule, the first lever press after a 30-second interval has elapsed results in a food pellet. Following that, another 30 seconds must elapse before a lever press will again produce a food pellet. Any lever pressing that occurs during the interval, before the 30-second period has elapsed, is ineffective. On a pure FI schedule, any responding that happens during the interval is essentially irrelevant.
variable interval (VI) schedule, def? what is reinforcement contingent upon?
reinforcement is contingent upon the first response after a varying, unpredictable period of time. For a rat on a variable interval 30-second (VI 30-sec) schedule, the first lever press after an average interval of 30 seconds will result in a food pellet, with the actual interval varying unpredictably from trial to trial. Thus, the number of seconds that must pass before a lever press will produce a food pellet could be 8 seconds for the first food pellet, 55 seconds for the second pellet, 24 seconds for the third, and so on, the average of which is 30 seconds.
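The VI contingency described above can be sketched as a tiny simulation. This is a minimal Python sketch of my own, not from the textbook; the class and method names are assumptions made for illustration:

```python
import random

class VariableIntervalSchedule:
    """Minimal sketch of a VI schedule: only the first response after a
    varying, unpredictable interval is reinforced; earlier responses
    are ineffective."""

    def __init__(self, mean_interval):
        self.mean_interval = mean_interval  # e.g., 30 for a VI 30-sec schedule
        self._start_new_interval()

    def _start_new_interval(self):
        # The required interval varies unpredictably around the mean
        # (e.g., 8 s, then 55 s, then 24 s, averaging about 30 s).
        self.current_interval = random.expovariate(1.0 / self.mean_interval)
        self.elapsed = 0.0

    def tick(self, seconds):
        # Time passes whether or not the organism responds.
        self.elapsed += seconds

    def respond(self):
        # Reinforce only the first response after the interval has elapsed;
        # responding during the interval does nothing.
        if self.elapsed >= self.current_interval:
            self._start_new_interval()
            return True   # reinforcer (food pellet) delivered
        return False
```

Because the interval is unpredictable, steady responding is the best strategy on such a schedule: a reinforcer may become available at any moment.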
On_______ schedules, reinforcement is contingent upon the rate of response.
response rate (DRL, DRH, and DRP)
A__________ is the response requirement that must be met to obtain reinforcement.
schedule of reinforcement (reinforcement schedule)
A s_______ of reinforcement is the r__________ requirement that must be met in order to obtain reinforcement.
schedule of reinforcement / response
FR (fixed ratio) schedules generally produce what kind of response? What would it look like on a chart, and what happens after each reinforcement?
FR schedules generally produce a high rate of response along with a short pause following the attainment of each reinforcer (see Figure 7.1). This short pause is known as a post-reinforcement pause. For example, a rat on an FR 25 schedule will rapidly emit 25 lever presses, munch down the food pellet it receives, and then snoop around the chamber for a few seconds before rapidly emitting another 25 lever presses. In other words, it will take a short break following each reinforcer. Each pause is usually followed by a relatively quick return to a high rate of response. Thus, this typical pattern is described as a "break-and-run" pattern—a short break followed by a steady run of responses. Students sometimes find that when they finally sit down to start work on the next chapter or assignment, they quickly become involved in it. Perhaps this is why just starting a task is often the most important step in overcoming procrastination; once you start, the work often flows naturally.
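The FR contingency itself is simple to state in code. This is a minimal Python sketch of my own (the class name is an assumption, not from the textbook):

```python
class FixedRatioSchedule:
    """Minimal sketch of an FR schedule: every n-th response is reinforced."""

    def __init__(self, n):
        self.n = n        # the fixed, predictable response requirement
        self.count = 0    # responses emitted since the last reinforcer

    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True   # reinforcer delivered after the n-th response
        return False
```

On `FixedRatioSchedule(25)`, exactly every 25th response is reinforced; the break-and-run pattern (pause, then a rapid run of 25 presses) is the organism's behavior, not part of the contingency itself.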
Over a period of a few months, Aaron changed from complying with each of his mother's requests to complying with every other request, then with every third request, and so on. The mother's behavior of making requests has been subjected to a procedure known as "s_________ the r_________."
stretching the ratio.
fixed duration (FD) schedule, def the behavior must be doing what.
the behavior must be performed continuously for a fixed, predictable period of time.
variable duration (VD) schedule, def/ the behavior must do what?
the behavior must be performed continuously for a varying, unpredictable period of time.
example of a fixed duration (FD) schedule.
the rat must run in the wheel for 60 seconds to earn one pellet of food (an _____ 60-sec schedule). Likewise, Julie may decide that her son can watch television each evening only after he completes 2 hours of studying (an____ 2-hr schedule).
fixed time (FT) schedule
the reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior. _______ schedules involve the delivery of a "free" reinforcer following a predictable period of time.
variable time (VT) schedule,
the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior.
noncontingent schedule of reinforcement def
the reinforcer is delivered independently of any response. Aka response-independent schedules. In other words, a response is not required for the reinforcer to be obtained.
adjusting schedule def (note: the process of shaping also involves an adjusting schedule)
the response requirement changes as a function of the organism's performance while responding for the previous reinforcer. For example, if a rat on an FR 100 schedule completes all 100 responses within a 5-minute interval, we may then increase the requirement to 110 responses (FR 110). In other words, because it has performed so well, we expect even better performance in the future. In a similar fashion, when Seema displayed excellent ability in mastering her violin lessons, she and her parents decided to increase the amount she had to learn each week. The requirement for reinforcement changes as soon as the rat has successfully met the previous requirement.
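The adjusting logic above can be sketched as follows. This is a minimal Python sketch under my own assumptions; the fixed step-up of 10 responses mirrors the FR 100 to FR 110 example, but the class and method names are illustrative only:

```python
class AdjustingRatioSchedule:
    """Minimal sketch of an adjusting FR schedule: the response
    requirement changes based on performance on the previous reinforcer."""

    def __init__(self, requirement, time_limit, step=10):
        self.requirement = requirement  # e.g., 100 for FR 100
        self.time_limit = time_limit    # e.g., 300 seconds (a 5-minute interval)
        self.step = step                # responses added after a fast run

    def complete_run(self, seconds_taken):
        """Call each time the full ratio is completed; raises the
        requirement if the run finished within the time limit."""
        if seconds_taken <= self.time_limit:
            # Good performance, so expect even better performance next time.
            self.requirement += self.step
        return self.requirement
```

For instance, starting at `AdjustingRatioSchedule(100, 300)`, a run completed in 240 seconds raises the requirement to FR 110, while a slower run leaves it unchanged.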
example of a fixed interval schedule.
trying to phone a business that opens in exactly 30 minutes will be effective only after the 30 minutes have elapsed, with any phone calls before that being ineffective.
In general, (variable/fixed) schedules produce little or no postreinforcement pausing because such schedules often provide the possibility of relatively i_________ reinforcement, even if one has just obtained a reinforcer.
variable / immediate
You find that by frequently switching stations on your radio, you are able to hear your favorite song an average of once every 20 minutes. Your behavior of switching stations is thus being reinforced on a___________ schedule.
variable interval ( VI) .
If Jason is extremely persistent in asking Neem out for a date, she will occasionally accept his invitation. Of the four basic schedules, Jason's behavior of asking Neem for a date is most likely on a ______________ schedule of reinforcement.
variable ratio
An average of 1 in 10 people approached by a panhandler actually gives him money. His behavior of panhandling is on a________ schedule of reinforcement.
variable ratio.
As noted in the opening scenario to this chapter, Mandy found that she had to work harder and harder to entice Alvin to pay attention to her. It is quite likely that her behavior was on a________ schedule of reinforcement. As a result, she began experiencing periods of time where she simply gave up and stopped trying. Eventually, she stopped seeing him altogether. When her sister asked why, Mandy, having just read this chapter, replied, "____ _____ ."
variable ratio; ratio strain
Dersu often carried a lucky charm with him when he went out hunting. This is because the appearance of game was often on a (use the abbreviation) schedule of reinforcement.
variable time (VT) schedule. A schedule in which the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior. ( answer for a question)
How do the four basic schedules differ from one another?
They vary in both the rate of response and in the presence or absence of a post-reinforcement pause. Ratio schedules (FR and VR) produce higher rates of response than do interval schedules (FI and VI), since reinforcement on interval schedules is largely time contingent; for example, on an FI 1-minute schedule, no more than 50 reinforcers can be earned in a 50-minute session. Fixed schedules (FR and FI) tend to produce post-reinforcement pauses, whereas variable schedules (VR and VI) do not.
Schedules in which the reinforcer is easily obtained are said to be ________, while schedules in which the reinforcer is difficult to obtain are said to be ________. Think of an example.
very dense or rich; very lean. For example, an assembly line worker who earns a dollar for each carburetor assembled (a CRF schedule) is able to earn considerably more during an 8-hour shift than is a worker who earns a dollar for every 10 carburetors assembled (an FR 10 schedule).
Is responding stronger or weaker in the earlier links of the chain than in the later links, and why?
Responding is weaker in the earlier links and stronger in the later links, which is why you should train the later links first and work your way backward. Why: in the later links, the terminal reinforcer is more immediate and hence more influential, while in the early links the terminal reinforcer is more distant and hence less influential (remember that delayed reinforcement is less effective than immediate reinforcement). (pg 290) Another way of looking at it is that the secondary reinforcers supporting behavior in the early links are less directly associated with food and are therefore relatively weak (e.g., the green key is associated with food only indirectly through its association with the red key). From this perspective, a chained schedule can be seen as the operant equivalent of higher-order classical conditioning—in which, for example, a tone (CS1) associated with food (US) elicits less salivation than the food does, and a light (CS2) associated with the tone elicits less salivation than the tone does.
When is continuous reinforcement most successful?
when the behavior is being strengthened or shaped.