Chapter 5
Williams found that the greater the number of reinforcements before extinction, the ____. A. greater the number of responses during extinction B. faster the rate of extinction C. stronger the response during extinction D. greater the frustration during extinction
A. greater the number of responses during extinction
The training procedure Thorndike used in his famous experiments with cats is best described as ____. A. free operant B. discrete trial C. trial-and-error D. field research
B. discrete trial
Often the initial effect of an extinction procedure is an increase in the behavior called a/an extinction ____. A. rebound B. resurgence C. burst D. flyer
C. burst
Resurgence may help account for ____. A. PMS B. rationalization C. regression D. reaction formation
C. regression
Mary's grandmother, Pearl, is from the Old Country. Although she knows some English, she continues to speak her native tongue. Pearl can't go anywhere without a member of the family because she can't communicate with people about prices, directions, bus routes, etc. Pearl's resistance to learning English is most likely the result of ____. A. a lack of intelligence B. age. Studies show that after the age of 60 learning a second language is nearly impossible. C. the length of time she has spent speaking her native language D. the benefits she receives for not speaking English
D. the benefits she receives for not speaking English
T/F: Negative reinforcement and punishment are synonyms.
FALSE
T/F: A general assumption of behavioral research is that any feature of a behavior may be strengthened by reinforcement, so long as reinforcement can be made contingent on that feature.
TRUE
T/F: According to Skinner, people are rewarded, but behavior is reinforced.
TRUE
T/F: Another term for operant is instrumental.
TRUE
T/F: In operant learning, the word contingency usually refers to the degree of correlation between a behavior and a consequence.
TRUE
T/F: Negative reinforcement increases the strength of a behavior.
TRUE
T/F: Operant learning probably always involves Pavlovian conditioning as well.
TRUE
T/F: People can learn to behave randomly provided that reinforcers are made contingent on random acts.
TRUE
T/F: Positive reinforcement increases the strength of a behavior.
TRUE
T/F: Reinforcement is often said to increase the frequency of a behavior, but research suggests that any feature of a behavior (e.g., intensity, duration, form, etc.) can be strengthened if a reinforcer can be made contingent on that feature.
TRUE
T/F: Reprimands, restraint, captivity, and electrical shocks can be reinforcers.
TRUE
T/F: Vomiting is ordinarily an involuntary response, but sometimes it can be modified by operant procedures.
TRUE
_____ is a neurotransmitter that seems to be important in reinforcement. a. Dopamine b. Stupamine c. Intelamine d. Actomine
a. Dopamine
________ demonstrated that electrical stimulation of the brain could be reinforcing. a. Olds and Milner b. Skinner c. Barnes and Noble d. Hull
a. Olds and Milner
Thorndike complained that _______ evidence provided a "supernormal psychology of animals." a. anecdotal b. case study c. informal experimental d. intuitive
a. anecdotal
Studies of delayed reinforcement document the importance of ______. a. contiguity b. contingency c. inter-trial interval d. deprivation level
a. contiguity
Thorndike plotted the results of his puzzle box experiments as graphs. The resulting curves show a _____ with succeeding trials. a. decrease in time b. decrease in errors c. change in topography d. increase in the rate of behavior
a. decrease in time
An action that improves the effectiveness of a reinforcer is called a ______. a. motivating operation b. reward booster c. contrived reinforcer d. activator
a. motivating operation
The Watson and Rayner experiment with Little Albert may have involved operant as well as Pavlovian learning because the loud noise ______. a. occurred as Albert reached for the rat b. occurred while Albert was eating c. did not bother Albert initially d. was aversive
a. occurred as Albert reached for the rat
Mary decides to try to modify Pearl's behavior (see above item). She and the rest of the family refuse to respond to any comment or request by Pearl that they know she is capable of expressing in English. For example, if during dinner she says, "Pass the potatoes" in English, she gets potatoes; if she says it in her native language she gets ignored. The procedure being used to change Pearl's behavior is ______. a. positive reinforcement b. negative reinforcement c. adventitious reinforcement d. punishment
a. positive reinforcement
The one thing that all reinforcers have in common is that they _______. a. strengthen behavior b. are positive c. feel good d. provide feedback
a. strengthen behavior
Skinner describes some of his most important research in _______. a. Verbal Behavior b. The Behavior of Organisms c. Particulars of My Life d. Animal Intelligence
b. The Behavior of Organisms
The best title for the figure below is ______. a. Motivation and Line Drawing b. The Effect of Practice without Reinforcement c. Trial and Error Learning d. Improvement in Line Drawing with Practice
b. The Effect of Practice without Reinforcement
According to the one-process theory of avoidance, the avoidance response is reinforced by _______. a. escape from the CS b. a reduction in the number of aversive events c. positive reinforcers that follow aversive events d. non-contingent aversives
b. a reduction in the number of aversive events
The law of effect says that _______. a. satisfying consequences are more powerful than annoying consequences b. behavior is a function of its consequences c. how an organism perceives events is more important than the events themselves d. effective behavior drives out ineffective behavior
b. behavior is a function of its consequences
Secondary reinforcers are also called _______ reinforcers. a. transient b. conditioned c. second-order d. acquired
b. conditioned
The number of operant procedures indicated in the contingency square is ______. a. two b. four c. six d. nine
b. four
Schlinger and Blakely found that the reinforcing power of a delayed reinforcer could be increased by ________. a. increasing the size of the reinforcer b. preceding the reinforcer with a stimulus c. providing a different kind of reinforcer d. following the reinforcer with a stimulus
b. preceding the reinforcer with a stimulus
Clark Hull's explanation of reinforcement assumes that reinforcers _____. a. stimulate the brain b. reduce a drive c. activate neurotransmitters d. leave a neural trace
b. reduce a drive
Sylvia believes that the reinforcement properties of an event depend on the extent to which it provides access to high probability behavior. Sylvia is most likely an advocate of _______ theory. a. drive-reduction b. relative value c. response deprivation d. random guess
b. relative value
Premack's name is most logically associated with _______. a. drive reduction theory b. relative value theory c. response deprivation theory d. equilibrium theory
b. relative value theory
The level of deprivation is less important when the reinforcer used is a(n) _________ reinforcer. a. primary b. secondary c. unexpected d. intrinsic
b. secondary
The distinctive characteristic of the Sidman avoidance procedure is that _______. a. the aversive event is signaled b. the aversive event is not signaled c. the aversive event is signaled twice d. there is no aversive event
b. the aversive event is not signaled
Charles Catania identified three characteristics that define reinforcement. These include all of the following except _______. a. a behavior must have a consequence b. the consequence of the behavior must be positive c. a behavior must increase in strength d. the increase in strength must be the result of the behavior's consequence
b. the consequence of the behavior must be positive
In one of Thorndike's puzzle boxes, a door would fall open when a cat stepped on a treadle, thus allowing the cat to reach food outside the box. Eventually the cat would step on the treadle as soon as it was put into the box. Thorndike concluded that ________. a. the reasoning ability of cats is quite remarkable b. treadle stepping increased because it had a "satisfying effect" c. the treadle is a CS for stepping d. learning meant connecting the treadle with freedom and food
b. treadle stepping increased because it had a "satisfying effect"
Donald Zimmerman found that a buzzer became a positive reinforcer after it was repeatedly paired with ______. a. food b. water c. escape from shock d. morphine
b. water
The three-term contingency is often represented by the letters ____. A. S-O-R B. S-R-B C. ABC D. BOC
C. ABC
The Premack principle says that reinforcement involves _______. a. a reduction in drive b. an increase in the potency of a behavior c. a relation between behaviors d. a satisfying state of affairs
c. a relation between behaviors
E. L. Thorndike's studies of learning started as an attempt to understand _______. a. operant conditioning b. the psychic reflex c. animal intelligence d. maze learning
c. animal intelligence
Negative reinforcement is also called _______. a. punishment b. aversive training c. escape-avoidance training d. reward training
c. escape-avoidance training
Money is a good example of a _______ reinforcer. a. primary b. tertiary c. generalized d. transient
c. generalized
Operant learning is sometimes called ________ learning. a. free b. higher-order c. instrumental d. reward
c. instrumental
The opposite of a conditioned reinforcer is a ______. a. tertiary reinforcer b. secondary reinforcer c. primary reinforcer d. generalized reinforcer
c. primary reinforcer
According to ___________ theory, schoolchildren are eager to go to recess because they have been deprived of the opportunity to exercise. a. drive-reduction b. relative value c. response deprivation d. stimulus substitution
c. response deprivation
The reappearance of previously effective behavior during extinction is called ____. A. spontaneous recovery B. recovery C. resurgence D. fulfillment
C. resurgence
Thorndike made important contributions to all of the following fields except _____. a. educational psychology b. animal learning c. social psychology d. psychological testing
c. social psychology
Thorndike emphasized that we learn mainly from _______. a. errors b. repeated trials c. success d. social experiences
c. success
________ gave Skinner's experimental chamber the name "Skinner box." a. Fred Keller b. E. L. Thorndike c. John Watson d. Clark Hull
d. Clark Hull
The author of your text calls Skinner the ______. a. Newton of psychology b. Thorndike of free operant work c. discoverer of reinforcement d. Darwin of behavior science
d. Darwin of behavior science
All of the following are recognized kinds of reinforcers except ______. a. primary b. contrived c. secondary d. classical
d. classical
Thorndike's 1898 dissertation describes experiments with cats, chicks, and ____. A. mice B. rats C. monkeys D. dogs
D. dogs
Operant learning may also be referred to as _______. a. trial-and-error learning b. effects learning c. non-Pavlovian conditioning d. instrumental learning
d. instrumental learning
Alan Neuringer demonstrated that with reinforcement, _____ could learn to behave randomly. a. preschoolers b. cats c. rats d. pigeons
d. pigeons
Positive reinforcement is sometimes called _______. a. escape training b. positive training c. satisfier training d. reward learning
d. reward learning
Douglas Anger proposed that there is a signal in the Sidman avoidance procedure. The signal is ________. a. reinforcement b. the aversive event c. fatigue d. time
d. time