Chapter 7: Schedules of Reinforcement


True/False: Although important, the matching law is restricted to a narrow range of species, responses, reinforcers, and reinforcement schedules.

False

True/False: In a multiple schedule, the organism is forced to choose between two or more reinforcement schedules.

False

A classic work on reinforcement schedules is by _________ . a. Darwin b. Herrnstein c. Ferster and Skinner d. Abercrombie and Fitch

Ferster and Skinner

True/False: The more effort a behavior requires, the fewer times the behavior will be performed during extinction.

True

True/False: The response unit hypothesis suggests that there really is no such thing as the partial reinforcement effect.

True

True/False: The thinner of two schedules, VR 5 and VR 10, is VR 10.

True

True/False: When food is the reinforcer, it is possible to stretch the ratio to the point at which an animal expends more energy than it receives.

True

George trains a pigeon to peck a disk by reinforcing each disk peck. Once the response is learned, George begins to cut back on the reinforcers. At first he reinforces every other response, then every third response, every fifth response, every tenth response, and so on. George is using a procedure called _________. a. ratio tuning b. reinforcement decimation c. intermittent reinforcement d. stretching the ratio

stretching the ratio

Gradually reducing the frequency of reinforcement is called _________. a. extinction b. stretching the ratio c. intermittent reinforcement d. progressional reinforcement

stretching the ratio

A schedule in which reinforcement is contingent on the behavior of more than one subject is a _________. a. multiple schedule b. mixed schedule c. tandem schedule d. cooperative schedule

cooperative schedule

John spent his summer picking cantaloupes for a farmer. The farmer paid John a certain amount for every basket of cantaloupes picked. John worked on a _________. a. fixed ratio schedule b. variable ratio schedule c. fixed interval schedule d. variable interval schedule

fixed ratio schedule

A chain schedule is most like a _________ schedule. a. multiple b. mixed c. cooperative d. tandem

tandem

In CRF, the ratio of reinforcers to responses is 1 to 1; in FR 1, the ratio is _______.

1 to 1

_________ refers to the point at which a behavior stops or its rate falls off sharply. a. Block b. Border time c. Break point d. Camel's back

Break point

The schedule to use if you want to produce the most rapid learning of new behavior is _______. a. CRF b. FR 2 c. FI 3" d. VI 3"

CRF

Choice involves ________ schedules.

Concurrent

CRF stands for ________.

Continuous Reinforcement

True/False: In VI schedules, the reinforcer occurs periodically regardless of what the organism does.

False

A given reinforcement schedule tends to produce a distinctive pattern and rate of performance. These are called schedule _______. a. patterns b. profiles c. effects d. matrixes

Effects

The schedule that is likely to produce a cumulative record with scallops is the _________. a. FR schedule b. VR schedule c. FI schedule d. VI schedule

FI schedule

CRF is synonymous with _________. a. EXT b. FR 1 c. CRT d. FI 1

FR 1

The schedule that is not an intermittent schedule is _________. a. FR 1 b. FR 5 c. VR 1 d. VI 1"

FR 1

Of the following, the schedule that most closely resembles noncontingent reinforcement is _________. a. FD b. FT c. FI d. DRL

FT

A schedule that does not require the performance of a particular behavior is the _________. a. FT schedule b. FD schedule c. FI schedule d. FR schedule

FT schedule

The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis.

Frustration

When behavior is on a FR schedule, animals often discontinue working briefly following reinforcement. These periods are called ________.

Post-reinforcement pauses / pre-ratio pauses / between-ratio pauses

___________ schedules differ from other schedules in that the rules describing the contingencies change systematically. a. Adaptive b. Evolutionary c. Progressive d. Idiosyncratic

Progressive

_____________ led the way in the study of choice. a. Richard Herrnstein b. Clark Hull c. B. F. Skinner d. E. L. Thorndike

Richard Herrnstein

The rule describing the delivery of reinforcement is called a ________of reinforcement.

Schedule

The term ________ refers to the pattern and rate of performance produced by a particular reinforcement schedule.

Schedule effects

If you increase the requirements for reinforcement too quickly you are likely to see evidence of ratio _______.

Strain

True/False: Extinction often increases the frequency of emotional behavior.

True

True/False: Extinction often increases the variability of behavior.

True

True/False: One effect of the extinction procedure is an increase in the variability of behavior.

True

True/False: One everyday example of a VR schedule is the lottery.

True

True/False: When a response is placed on extinction, there is often an increase in emotional behavior.

True

True/False: Harlan Lane and Paul Shinkman put a college student's behavior on extinction following VI reinforcement. The student performed the behavior 8,000 times without reinforcement.

True

True/False: One difference between FT and FI schedules is that in FT schedules, reinforcement is not contingent on a behavior.

True

Your text reports the case of a man who apparently made hundreds of harassing phone calls. The man's behavior was most likely on a(n) _________. a. FR schedule b. VR schedule c. FI schedule d. VI schedule

VR schedule

In one form of the matching law, BA stands for the behavior under consideration and B0 represents _______. a. reinforcement for BA b. the baseline rate of BA c. all behaviors other than BA d. all behavior that is over expectation

all behaviors other than BA
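As a reference for the matching-law items above, Herrnstein's proportional form can be written as follows (standard notation, not quoted from the text):

```latex
% Matching law, proportional form:
\frac{B_A}{B_A + B_0} = \frac{r_A}{r_A + r_0}
% B_A  = rate of the behavior under consideration
% B_0  = rate of all other behavior
% r_A, r_0 = reinforcement rates earned by B_A and B_0, respectively
```

In words: the proportion of behavior allocated to an alternative matches the proportion of reinforcement obtained from it.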

Often the initial effect of an extinction procedure is an increase in the behavior called a(n) extinction ________. a. rebound b. resurgence c. burst d. flyer

burst

The immediate effect of extinction is often an abrupt increase in the rate of the behavior being extinguished. This is called an extinction ______.

burst

Stanley wants to determine which of two reinforcement schedules is more attractive to rats. He trains a rat to press a lever for food, and then puts the rat into an experimental chamber containing two levers. Pressing one lever produces reinforcement on an FR 10 schedule; pressing the other lever produces reinforcement on an FI 10" schedule. Lever pressing is on a _________. a. multiple schedule b. chain schedule c. concurrent schedule d. redundant schedule

concurrent schedule

Studies of choice involve _________. a. multiple schedules b. chain schedules c. concurrent schedules d. redundant schedules

concurrent schedules

According to the ________ hypothesis, the PRE occurs because it is difficult to distinguish between intermittent reinforcement and extinction.

discrimination

When reinforcement is contingent on continuous performance of an activity, a __________ reinforcement schedule is in force. a. duration b. interval c. time d. ratio

duration

In a _____ schedule, reinforcement is contingent on the continuous performance of a behavior for some period of time. a. fixed duration b. continuous reinforcement c. fixed time d. DRH

fixed duration

The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis. a. discrimination b. frustration c. sequential d. response unit

frustration

Williams found that the greater the number of reinforcements before extinction, the _______. a. greater the number of responses during extinction b. faster the rate of extinction c. stronger the response during extinction d. greater the frustration during extinction

greater the number of responses during extinction

Shirley trains a rat to press a lever and then reinforces lever presses on an FR 10 schedule when a red light is on, and an FI 10" schedule when a green light is on. In this case, lever pressing is on a _________. a. multiple schedule b. chain schedule c. concurrent schedule d. redundant schedule

multiple schedule

FT and VT are both kinds of ______ reinforcement. a. noncontingent b. intermittent c. duration-based d. continuous

noncontingent

A reduction in response rate following reinforcement is called a _________. a. post-reinforcement pause b. scallop c. latency d. rest stop

post-reinforcement pause

Post-reinforcement pauses are now often referred to as _________. a. rest periods b. pre-ratio pauses c. anticipatory pauses d. break points

pre-ratio pauses

Things are going pretty well for George (see item 26) until he jumps from reinforcing every tenth response to reinforcing every 50th response. At this point, the pigeon responds erratically and nearly stops responding entirely. George's pigeon is suffering from _________. a. ratio strain b. ratiocination c. satiation d. reinforcer deprivation

ratio strain

Resurgence may help account for _______. a. PMS b. rationalization c. regression d. reaction formation

regression

The study of reinforcement schedules suggests that the behavior we call stick-to-itiveness is largely the product of _________. a. genetics b. prenatal influences c. choice d. reinforcement history

reinforcement history

One explanation for the PRE implies that the effect is really an illusion. This is the _________. a. discrimination hypothesis b. frustration hypothesis c. sequential hypothesis d. response unit hypothesis

response unit hypothesis

The reappearance of previously effective behavior during extinction is called ____________. a. spontaneous recovery b. recovery c. resurgence d. fulfillment

resurgence

The rate at which a response occurs, once the subject begins performing it, is called the _________. a. clock rate b. walk rate c. run rate d. performance rate

run rate

A pigeon is confronted with two disks, one green, the other red. The bird receives food on a VI 20" schedule when it pecks the green disk, and on a VI 10" schedule when it pecks the red one. You predict that the bird will peck _________. a. one disk about as often as at the other b. the green disk almost exclusively c. the green disk about twice as often as the red disk d. the red disk about twice as often as the green disk

the red disk about twice as often as the green disk
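The prediction in this item follows from the matching law: response allocation tends to match relative reinforcement rates. A minimal sketch (illustrative only; the schedule-to-rate numbers are assumptions, since a VI 10" schedule yields at most about 6 reinforcers per minute and a VI 20" schedule about 3):

```python
# Illustrative sketch (not from the text): the matching law predicts that
# the proportion of responding on each alternative matches the proportion
# of reinforcement obtained from it.

def matching_allocation(reinforcers_per_min):
    """Predicted proportion of responding on each alternative
    under the proportional form of the matching law."""
    total = sum(reinforcers_per_min)
    return [r / total for r in reinforcers_per_min]

# Assumed rates: VI 10" (red) ~ 6 reinforcers/min, VI 20" (green) ~ 3/min.
red, green = matching_allocation([6, 3])
print(f"red: {red:.2f}, green: {green:.2f}")  # red gets about twice green
```

With these assumed rates, the bird is predicted to allocate about two-thirds of its pecks to the red disk, matching answer d.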

In schedules research, VD stands for ________. a. video displayed b. verbal dependent c. variable dependency d. variable duration

variable duration

Bill spends his summer in the city panhandling. Every day he takes a position on a busy corner and accosts passersby saying, "Can you spare some change?" Most people ignore him, but every now and then someone gives him money. Bill's reinforcement schedule is best described as a _________. a. fixed ratio schedule b. variable ratio schedule c. fixed interval schedule d. variable interval schedule

variable ratio schedule

Harry spent his summer in the city panhandling. Every day he would sit on the sidewalk, put a cardboard sign in front of him that said, "Please help," and place his hat on the sidewalk upside down. Then he would wait. Every now and then someone would put money into his hat. Harry's reinforcement schedule is best described as a _________. a. fixed ratio schedule b. variable ratio schedule c. fixed interval schedule d. variable time schedule

variable time schedule

