Learning Principles Unit 4

On a fixed interval (FI) schedule, what is the "rule" that describes whether a particular response will produce the reinforcer? (see book pp. 200-202)

The first response made after a fixed amount of time has elapsed (e.g., 15 seconds) produces the reinforcer.

On a ____ (FI) schedule of reinforcement, each reinforcer becomes available after a constant, unchanging amount of time has passed.

Fixed Interval

Review the four simple schedules of reinforcement

Fixed ratio (FR), Variable ratio (VR), Fixed interval (FI), Variable interval (VI)

On a ____ (FR) schedule of reinforcement, the animal must perform the response a constant, unchanging number of times for each reinforcer.

fixed ratio

Whenever Chuck mispronounces a word, his wife, Evelyn, calls him an "idiot." However, despite this, Chuck still mispronounces words just as frequently as he always has (i.e., there is no evidence that Evelyn's attempts to reduce the frequency of Chuck's mispronunciations have been effective). We can therefore conclude that: A) Chuck's behavior has not been successfully punished B) Chuck is deliberately mispronouncing words in order to annoy Evelyn C) Deep down, Evelyn unconsciously likes it when Chuck mispronounces words D) Evelyn's attempts to positively punish Chuck's behavior have actually resulted in negative punishment

A) Chuck's behavior has not been successfully punished

The schedule of reinforcement that you would use if you wanted to produce the most rapid learning of a new behavior would be: A) continuous reinforcement B) intermittent reinforcement C) a fixed interval schedule D) a variable ratio schedule

A) continuous reinforcement

Punishers are most likely to be effective when they are: A) delivered on a continuous schedule B) initially mild, then progressively more intense C) not contingent on the performance of any particular behavior D) (all of the above)

A) delivered on a continuous schedule

According to the _____ hypothesis, behaviors are more resistant to extinction following intermittent reinforcement than they are following continuous reinforcement because the animal finds it more difficult to distinguish between intermittent reinforcement and extinction than it does to distinguish between continuous reinforcement and extinction. A) discrimination B) frustration C) sequential D) response unit

A) discrimination

Jessica often whines for extra helpings when she is given dessert at the dinner table. In order to put an end to the whining, Jessica's father decides that, whenever Jessica whines, she will not receive extra dessert. (In other words, Jessica's father has decided to stop reinforcing Jessica's whining behavior.) The strategy that Jessica's father is attempting to implement would be best described as a(n) _____ procedure. A) extinction B) positive punishment C) response prevention D) negative reinforcement

A) extinction

John spent his summer picking cantaloupes for a farmer. The farmer paid John $1 for every 20 cantaloupes he picked. John worked on a _____ schedule of reinforcement. A) fixed ratio B) variable ratio C) fixed interval D) variable interval

A) fixed ratio

The reappearance, during extinction, of a previously reinforced (and extinguished) behavior is known as: A) resurgence B) ratio strain C) progression D) matching

A) resurgence

The word "positive" in "positive punishment" refers to the fact that: A) something is added as a consequence of the behavior B) something positive is removed as a consequence of the behavior C) the procedure has beneficial results D) the procedure is used with good intentions

A) something is added as a consequence of the behavior

In a "cumulative record," the horizontal (X) axis represents _____, and the vertical (Y) axis represents _____. A) the passage of time; the total number of times the behavior has occurred B) the total number of times the behavior has occurred; the passage of time C) the number of responses required for each reinforcer; the magnitude of the reinforcer D) the magnitude of the reinforcer; the number of responses required for each reinforcer

A) the passage of time; the total number of times the behavior has occurred

"Differential reinforcement of ____ behavior" (DRA) is a technique that involves extinguishing an unwanted behavior while reinforcing another, more appropriate behavior.

Alternative

How is the sequential hypothesis similar to the frustration hypothesis? How is it different?

Similar: both claim that an animal on a partial reinforcement schedule learns to associate some consequence of its nonreinforced responses with the eventual delivery of the reinforcer. Different: the frustration hypothesis emphasizes the animal's emotional reaction (frustration) to nonreinforcement, whereas the sequential hypothesis emphasizes the animal's memories of its nonreinforced responses.

Which of the following schedules does NOT require the performance of a particular behavior? A) Continuous reinforcement B) A fixed time (FT) schedule C) A fixed interval (FI) schedule D) A fixed duration (FD) schedule

B) A fixed time (FT) schedule

Which of the following has NOT been shown to increase the effectiveness of a punisher? A) Providing a verbal explanation for why the punisher was administered. B) Using a mild punisher at first and then gradually increasing its intensity. C) Strengthening the contingency between the response and the punisher. D) Offering an alternative way to obtain the reinforcer that previously maintained the behavior.

B) Using a mild punisher at first and then gradually increasing its intensity.

"Given a choice between two alternative behaviors, each on its own schedule of reinforcement, animals tend to distribute their responses in a way that corresponds to the frequency with which reinforcers are available on each schedule." This statement best describes the: A) "break point" B) matching law C) response unit hypothesis D) partial reinforcement effect (a.k.a. partial reinforcement extinction effect)

B) matching law

Delaying the delivery of a punisher is likely to: A) increase its effectiveness B) reduce its effectiveness C) have no impact on its effectiveness D) have a variable and unpredictable impact on its effectiveness

B) reduce its effectiveness

The term "_____" refers to the rate at which a behavior occurs once it has resumed following reinforcement. A) latency B) run rate C) cumulative record D) post-reinforcement pause

B) run rate

Why can responding on a fixed interval schedule be described as having a "scalloped" pattern?

Responding is slow immediately after each reinforcer and gradually accelerates as the end of the interval approaches; on a cumulative record, this gradual acceleration produces rounded, scallop-shaped curves.

Why is continuous reinforcement an example of a fixed ratio schedule?

Because continuous reinforcement is an FR 1 schedule: every single response (a ratio of one) produces the reinforcer.

When a previously reinforced behavior first stops getting reinforced, the immediate effect is often an abrupt increase in that behavior. This is called an "extinction ____."

Burst

Which of the following best describes the pattern of responding that variable ratio schedules tend to produce? A) A high run rate followed by long pauses after reinforcers B) A slow run rate followed by long pauses after reinforcers C) A high, steady rate of responding with little or no pausing after reinforcers D) A slow, steady rate of responding with little or no pausing after reinforcers

C) A high, steady rate of responding with little or no pausing after reinforcers

Often, the initial effect of extinction is an increase in the frequency of the behavior that is being extinguished. This phenomenon is called: A) ratio strain B) stretching the ratio C) an extinction burst D) spontaneous recovery

C) an extinction burst

David stayed out too late one evening and missed his curfew. After being punished by his parents, David walked into his younger brother's room, picked a fight, and then physically attacked his brother. This example best illustrates that _____ can occur as a "side effect" of punishment. A) conditioned fear B) extinction bursts C) displaced aggression D) learned helplessness

C) displaced aggression

When reinforcers are contingent on the continuous performance of a behavior for a certain period of time, a _____ schedule of reinforcement is in effect. A) fixed time (FT) B) fixed interval (FI) C) fixed duration (FD) D) progressive ratio (PR)

C) fixed duration (FD)

A cumulative record tends to show a "scalloped" pattern of responding when the subject is on a _____ schedule of reinforcement. A) fixed ratio B) variable ratio C) fixed interval D) variable interval

C) fixed interval

True or false? On a fixed interval schedule, reinforcers occur periodically regardless of what the animal does.

False

In a _____ schedule, a behavior is under the influence of different schedules at different times, and each schedule is associated with a particular stimulus. However, in _____ schedules, two or more schedules are available at once (e.g., a pigeon may have the option of pecking one key on a VI 20-sec. schedule or pecking a different key on a VI 40-sec. schedule). A) mixed; tandem B) tandem; mixed C) multiple; concurrent D) concurrent; multiple

C) multiple; concurrent

The more work required for each reinforcer, the longer the post-reinforcement pause. Thus, pauses are longer in an FR 100 schedule than in an FR 20 schedule. Research with "multiple" schedules of reinforcement has shown that these pauses do not occur due to fatigue; rather, the length of the pause depends on the size of the upcoming ratio (not the size of the just-completed ratio). Because of this, "post-reinforcement pauses" are now often referred to as "_____." A) bliss points B) rest periods C) pre-ratio pauses D) variable intervals

C) pre-ratio pauses

For several days, a rat in a Skinner box has received food after every 10 presses of a bar. However, when the researchers suddenly increase the number of required bar-presses from 10 to 200, they observe a severe disruption of the rat's bar-pressing behavior. This phenomenon is known as: A) regression B) resurgence C) ratio strain D) the partial reinforcement effect (a.k.a. partial reinforcement extinction effect)

C) ratio strain

George is training a pigeon to peck a key. He begins by reinforcing each key-peck that the pigeon performs. Once the response is learned, however, George stops reinforcing every key-peck and begins reinforcing every other key-peck. Later, George reinforces every third key-peck, then every fifth key-peck, then every tenth key-peck, and so on. Because he is gradually increasing the requirements for each reinforcer, George is using a procedure that researchers call "_____." A) matching B) resurgence C) stretching the ratio D) continuous reinforcement

C) stretching the ratio

Suppose that Shawna is a bowler who gets a "strike" on approximately one out of every three attempts. However, sometimes, Shawna bowls two or three strikes in a row, and other times, she goes four attempts or more without bowling a strike. This is an example of a _____ schedule of reinforcement. A) fixed ratio B) continuous C) variable ratio D) fixed interval

C) variable ratio

In ____ schedules, two or more schedules are available at once (e.g., a pigeon may have one key that it can peck on a FR 10 schedule and another key that it can peck on a VR 15 schedule).

Concurrent

Immediate punishers are more effective than delayed punishers. This illustrates the importance of the ____ between the response and the punisher.

Contiguity

There is a "____" between a response and a punisher if performing the response makes the punisher more likely to occur.

Contingency

In ____ reinforcement, a behavior is reinforced every time it occurs.

Continuous

continuous reinforcement vs. Partial Reinforcement: Which one results in faster acquisition of new behaviors?

Continuous

Robert was severely scolded by Mrs. Valentine for asking inappropriate questions during class. Which of the following problem(s) could arise as a result of Mrs. Valentine's use of punishment? A) Robert might become fearful of Mrs. Valentine B) Robert might begin to scold his siblings and peers C) Robert might refuse to say anything at all during class D) (all of the above)

D) (all of the above)

In one alternative to punishment, called "_____," the environment is altered in such a way as to block the occurrence of an unwanted behavior. A) extinction B) "Time Out" C) penalty training D) response prevention

D) response prevention

Behaviors tend to be more resistant to extinction following intermittent reinforcement than they are following continuous reinforcement. This is known as: A) spontaneous recovery B) the progressive ratio effect C) the extinction burst phenomenon D) the partial reinforcement effect (a.k.a. partial reinforcement extinction effect)

D) the partial reinforcement effect (a.k.a. partial reinforcement extinction effect)

A hungry pigeon is in a Skinner box and is pecking a key for access to food. The first food delivery occurs for the first peck after 1 minute has elapsed. The second food delivery occurs for the first peck after 3 additional minutes have elapsed. The third food delivery occurs for the first peck after 2 additional minutes have elapsed. This pigeon is being reinforced on a _____ schedule of reinforcement. A) continuous B) variable ratio C) fixed interval D) variable interval

D) variable interval

What is an "extinction burst," and why can it be a problem in behavioral therapy?

Definition: a sudden increase in the rate of behavior during the early stages of extinction. Problem: when attempting to extinguish an undesirable behavior, the behavioral problem may initially get worse

According to the ____ hypothesis, extinction takes longer after partial reinforcement because it is more difficult to distinguish between partial reinforcement and extinction than it is to distinguish between continuous reinforcement and extinction.

Discrimination

Review the "3-term contingency" that operant conditioning involves

Discriminative stimulus → Response → Outcome

An animal experiences "____" when a previously reinforced behavior stops being reinforced.

Extinction

How do "fixed" schedules differ from "variable" schedules?

On fixed schedules, the requirement (the number of responses or the amount of time) stays the same from one reinforcer to the next. On variable schedules, the requirement changes from one reinforcer to the next, varying around an average.

The ____ hypothesis claims that, during partial reinforcement, animals learn to associate their emotional reaction to nonreinforced responses with the eventual delivery of the reinforcer.

Frustration

How does the "matching law" suggest that an animal will behave when it is given a choice between two behaviors, each of which is on its own schedule of reinforcement?

Given a choice situation involving 2 alternatives, the frequency with which the animal chooses one alternative will "match" the frequency with which the animal can obtain reinforcers via this alternative. Know the basic equation that summarizes the matching law (see below)
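A common way to write the basic matching-law equation (assuming just two alternatives, where B1 and B2 are the rates of responding on the two alternatives and R1 and R2 are the rates of reinforcement obtained on each):

B1 / (B1 + B2) = R1 / (R1 + R2)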

What is represented on the horizontal and vertical axes of the cumulative record?

Horizontal: the passage of time. Vertical: the total (cumulative) number of responses.

Is providing a verbal rationale for why the punisher occurred helpful in punishment?

Yes. When the punisher is accompanied by a verbal explanation, organisms are less likely to perform the behavior again.

How can "two-process theory" be applied to punishment?

If an animal performs a response and then experiences the punisher, it associates the stimuli that were present before it performed the response with the aversive aspects of the punisher. For example, a rat stops pressing the lever because it has come to fear the lever.

How does the intensity of the punisher tend to affect the reduction of the punished response?

More intense punishers produce greater suppression of the punished behavior than mild punishers do. However, if an organism is first exposed to a mild punisher, a more intense punisher delivered later suppresses the behavior less than it would have if it had been intense from the start.

How is resistance to extinction related to the predictability of reinforcers?

If the reinforcers have been predictable, the animal quickly gives up once they stop. If the reinforcers have been unpredictable, the animal keeps responding, because it cannot tell when (or whether) the next reinforcer will come.

How did Jenkins (1962) cast doubt on the discrimination hypothesis?

If partial reinforcement is followed by a period of continuous reinforcement before extinction, animals still display greater resistance to extinction. The discrimination hypothesis cannot explain this, because the animal's most recent experience is continuous reinforcement, which should be easy to distinguish from extinction.

How does continuous reinforcement differ from partial (or "intermittent") reinforcement?

In continuous reinforcement, every response is reinforced; in intermittent reinforcement, only some responses are reinforced.

continuous reinforcement vs. Partial Reinforcement: Which one results in behaviors that are more "resistant to extinction"?

Intermittent

How does the variability of the animal's behavior change during extinction?

It increases. During extinction, the animal behaves as if it needs to "try something else."

What effect (if any) does the size of the ratio have on the run rate and the length of post-reinforcement pauses?

Longer ratio means a longer post-reinforcement pause, but no effect on run rate.

"Differential reinforcement of ____" (DRL) is a technique that involves reinforcing a behavior only if a certain amount of time has passed since the behavior was last performed.

Low rate

What effect does extinction have on emotional behavior (e.g., aggression)?

Extinction may increase the frequency of emotional behavior such as aggression (e.g., a vending machine takes our money but fails to deliver the food, and we kick it).

What is the pattern of responding that VI schedules tend to produce?

Moderate response rate with steady responding

Which hypothesis tends to be supported by research?

Multiple schedule reinforcement

On a variable ratio (VR) schedule, what is the "rule" that describes whether a particular response will produce the reinforcer?

On average, the reinforcer occurs after a certain number of responses (e.g., after every 10 responses on a VR 10 schedule), but each individual reinforcer can occur after a different number of responses.

In ____ (or "intermittent") reinforcement, a behavior is reinforced on some occasions and not reinforced on other occasions.

Partial

How do "ratio" schedules differ from "interval" schedules?

On ratio schedules, reinforcement depends on how many responses have been made; on interval schedules, reinforcement depends on a response occurring after a certain amount of time has passed.

The term "____ to extinction" refers to how long a behavior persists after it stops being reinforced.

Resistance

An alternative to punishment that involves altering the environment in such a way as to block an unwanted behavior from occurring.

Response prevention

According to the ____ hypothesis, the partial reinforcement extinction effect is merely a by-product of the fact that reinforcers require more responses on partial schedules than they do on continuous schedules.

Response unit

What effects does extinction have on the animal's behavior?

The behavior becomes weaker and eventually stops, but along the way extinction often produces an extinction burst, increased variability in behavior, and emotional behavior such as aggression.

When one behavior is extinguished, previously extinguished behaviors may reappear. This phenomenon is known as "____."

Resurgence

What is a schedule of reinforcement? What are "schedule effects"? (see book p. 194)

Schedule of reinforcement: a rule that specifies which occurrence of a particular response will be followed by a reinforcer. Schedule effects: the distinctive rate and pattern of responding that each schedule produces.

The ____ hypothesis claims that, during partial reinforcement, animals learn to associate their memories of nonreinforced responses with the eventual delivery of the reinforcer.

Sequential

Last name of the American psychologist who invented the "operant chamber" and conducted extensive research on schedules of reinforcement.

Skinner

What does the slope (steepness) of the line (on a cumulative record) tell us about the animal's behavior?

A steep line means the animal is responding rapidly; a flatter line means the animal is responding more slowly.

Why might immediate punishers be more effective than delayed punishers?

The "delay" could cause the animal to forget that it performed the behavior that it is getting punished for. When the punishment is delayed, the animal is doing other things, so it may think it's getting punished for these things

How does the discrimination hypothesis account for PRE?

The difference between continuous reinforcement and extinction is greater than the difference between partial reinforcement and extinction. So, it's more difficult for the animal to distinguish (discriminate) between partial reinforcement and extinction.

How does the frustration hypothesis explain the PRE?

The frustrated emotional state becomes an SD (discriminative stimulus): it signals that if the animal keeps responding while frustrated, the reward will eventually come.

On a variable interval (VI) schedule, what is the "rule" that describes whether a particular response will produce the reinforcer?

The first response after a certain amount of time has passed produces the reinforcer, and the interval changes from one reinforcer to the next.

What happens (or, rather, does not happen) during "extinction" in operant conditioning?

The reinforcer no longer follows the response (even when SD is present)

On a fixed ratio (FR) schedule, what is the "rule" that describes whether a particular response will produce the reinforcer?

The reinforcer occurs after a certain number of responses have been made

What is the "partial reinforcement effect" (PRE; also known as the "partial reinforcement extinction effect," or PREE)?

Behaviors are more resistant to extinction following partial (intermittent) reinforcement than they are following continuous reinforcement.

Why can the pattern of behavior on FR schedules be described as "stop-and-go"?

After each reinforcer, the animal stops responding for a while (the post-reinforcement pause); once responding resumes, it continues at a high rate until the animal earns the next reinforcer.

What did Thorndike and Skinner conclude regarding the effectiveness of punishment?

They concluded that punishment was relatively ineffective; they believed extinction was just as effective as punishment at weakening behavior.

How does the overall response rate differ on ratio vs. interval schedules?

Ratio schedules produce higher overall response rates than interval schedules.

True or false? A continuous schedule of punishment is typically more effective than a partial schedule in weakening a behavior.

True

On a ____ (VR) schedule of reinforcement, the animal must perform the response a different, changing number of times for each reinforcer.

Variable ratio

Which type of schedule produces behavior that occurs at a steadier, more consistent rate (with less "pausing")—fixed or variable?

Variable

On a ____ (VI) schedule of reinforcement, each reinforcer becomes available after a different, changing amount of time has passed.

Variable Interval

What is a "cumulative recorder"? How does it work?

What it is: a device that was used to create graphs called "cumulative records." How it works: paper feeds past a pen at a constant speed, and every time the animal makes a response (e.g., presses a bar or pecks a key), the pen moves up a small step.

What are "concurrent" schedules of reinforcement, and why are concurrent schedules often used in research on "choice"?

What they are: schedules in which the animal is presented with two or more schedules at once. Why they are used: because the two schedules are in effect at the same time, the animal is free to choose one over the other.

An alternative to a punishment called "_____" involves extinguishing an unwanted behavior while also reinforcing a behavior that cannot be performed at the same time as the unwanted behavior. A) penalty training B) response prevention C) differential reinforcement of low rate (DRL) D) differential reinforcement of incompatible behavior (DRI)

D) differential reinforcement of incompatible behavior (DRI)

"Differential reinforcement of ____ behavior" (DRI) is a technique that involves extinguishing an unwanted behavior while reinforcing another behavior that cannot be performed at the same time as the unwanted behavior.

Incompatible

The ____ is a principle of choice which says that, given the opportunity to respond on two or more schedules, the rates of responding on each schedule will match the rates at which reinforcers can be obtained on each schedule.

matching law
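For a rough worked example (assuming, for simplicity, that a concurrent VI 20-sec / VI 40-sec arrangement yields about 60/20 = 3 reinforcers per minute on the first key and 60/40 = 1.5 per minute on the second):

B1 / (B1 + B2) = 3 / (3 + 1.5) ≈ 0.67

so the matching law predicts that roughly two-thirds of the responses will go to the VI 20-sec alternative.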

In a ____ schedule, a behavior is under the influence of two or more simple schedules at different times, with each schedule being associated with a particular stimulus.

multiple

What is a "post-reinforcement pause"?

no responses for a while after delivery of a reinforcer

What is the "run rate"?

The rate at which the behavior occurs once responding has resumed following the post-reinforcement pause.

What is "resurgence"?

the reappearance, during extinction, of other, previously reinforced (and extinguished) behaviors.

Which type of schedule produces behavior that is more resistant to extinction— fixed or variable?

variable

