Reinforcement

When using reinforcement, it is important to use consequences that the client...

Prefers

Interval schedules of reinforcement are based on: A: A certain number of responses B: Time C: Consequences D: Behavior

B: Time

When thinning a ratio schedule, you should ______________ increase the number of responses required for reinforcement. A: quickly B: efficiently C: gradually D: never

C: gradually

Which is an example of positive reinforcement? A: Close a window to stop cold air from blowing in; closing the window is strengthened. B: Cover your ears to stop the sound of a fire truck; covering your ears is weakened. C: Play a guitar and everyone cheers; playing a guitar is strengthened. D: Play a guitar and everyone cheers.

C: Play a guitar and everyone cheers; playing a guitar is strengthened.

The following is not an example of reinforcement: A: A dog barks, the owner fills the bowl with food and barking is strengthened. B: Wanda makes chili, everyone raves about it, Wanda makes chili again. C: A toddler cries, dad gives her a toy, and crying is strengthened. D: A dog barks when a stranger approaches.

D: A dog barks when a stranger approaches.

True or False: A schedule of reinforcement is a rule specifying which occurrences of reinforcement will be scored.

False

During __________________ reinforcement, behavior is followed by the ____________ of a consequence. A: negative; removal B: positive; removal C: positive; subtraction D: positive; elimination

A: negative; removal

Intermittent reinforcement means that ______________ responses are reinforced. A: some B: no C: every D: all

A: some

With extinction, ____________ responses are reinforced and with continuous reinforcement ____________ responses are reinforced. A: every; no B: no; every C: some; many D: many; some

B: no; every

During __________________ reinforcement, behavior is followed by the ____________ of a consequence. A: negative; addition B: positive; addition C: positive; subtraction D: positive; elimination

B: positive; addition

The result of reinforcement is that behavior is _________________. A: weakened B: strengthened C: reduced D: eliminated

B: strengthened

When a schedule of reinforcement is fixed, ___________. A: the rule about reinforcement always changes B: the rule about reinforcement never changes C: the rule about reinforcement sometimes changes D: there are no rules about reinforcement

B: the rule about reinforcement never changes

The following is an example of reinforcement: A: A toddler cries, dad gives her a toy, and crying is strengthened. B: A toddler cries, dad gives her a toy, and crying is eliminated. C: A toddler cries, dad takes away her toy, and crying is weakened. D: A toddler cries, dad takes away her toy, and crying is eliminated.

A: A toddler cries, dad gives her a toy, and crying is strengthened.

Which is an example of negative reinforcement? A: Close a window to stop cold air from blowing in; closing the window is strengthened. B: Close a window to stop cold air from blowing in; closing the window is weakened. C: Cover your ears to stop the sound of a fire truck; covering your ears is weakened. D: Play a guitar and everyone cheers; playing a guitar is strengthened.

A: Close a window to stop cold air from blowing in; closing the window is strengthened.

All are examples of secondary reinforcers except: A: Food B: Toys C: Completing a puzzle D: Money

A: Food

The following is an example of a fixed interval schedule of reinforcement: A: For every 60 minutes that Jacob practices the violin, he earns $1 from his parents. B: For every 3 songs Jacob practices on his violin, he earns $1 from his parents. C: Jacob's parents place a dollar bill in his piggy bank when he practices his violin for an average of 8 songs. D: Jacob's parents place a dollar bill in his piggy bank when he plays his violin for an average of 60 minutes each day.

A: For every 60 minutes that Jacob practices the violin, he earns $1 from his parents.
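The fixed interval rule behind this card can be pictured as a small simulation. This is only an illustration, not part of the study set; the class name, times, and dollar reinforcer are assumptions mapped loosely onto the Jacob example (Python):

```python
# Minimal fixed-interval (FI) schedule sketch (illustrative only).
# Rule: the first response after a fixed amount of time has passed since
# the last reinforcer is the one that earns reinforcement.

class FixedIntervalSchedule:
    def __init__(self, interval_seconds):
        self.interval = interval_seconds        # e.g. 60 * 60 for FI 60 minutes
        self.last_reinforcer_time = 0.0

    def record_response(self, now):
        """Return True if this response earns the reinforcer."""
        if now - self.last_reinforcer_time >= self.interval:
            self.last_reinforcer_time = now
            return True                          # e.g. Jacob earns $1
        return False

# Responses at 10, 40, and 65 minutes on an FI 60-minute schedule:
fi = FixedIntervalSchedule(interval_seconds=60 * 60)
for minute in (10, 40, 65):
    print(minute, fi.record_response(minute * 60))
# Only the response at 65 minutes, after the 60-minute interval, is reinforced.
```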

Schedule thinning is important because: A: It matches reinforcement in the natural environment and makes the response resistant to extinction B: It is easier to use C: Most people prefer to earn reinforcement less often D: Research has proved it is preferred by learners

A: It matches reinforcement in the natural environment and makes the response resistant to extinction

Ratio schedules of reinforcement are based on __________. A: A specific number of responses B: A number of consequences C: Time D: The number of antecedents

A: A specific number of responses

Operant behavior is controlled by _________________________. A: antecedents and consequences B: antecedents and behavior C: good behavior and consequences D: teachers and pleasurable things

A: antecedents and consequences

_____________________ acquire reinforcing properties by being paired with ____________. A: Primary reinforcers; secondary reinforcers B: Secondary reinforcers; primary reinforcers C: Negative reinforcers; primary reinforcers D: Secondary reinforcers; positive reinforcers

B: Secondary reinforcers; primary reinforcers

All are examples of primary reinforcers except: A: Food B: Toys C: Water D: Warmth

B: Toys

A token economy uses a symbol or tokens that are earned and can then be exchanged for a __________. A: assessment B: back-up reinforcer C: token board D: None of the above

B: back-up reinforcer
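As a rough illustration of the token-exchange rule (not from the study set; the class name, item names, and token prices are made up), the earn-and-exchange cycle can be sketched in Python:

```python
# Minimal token-economy sketch (illustrative names and prices only).
# Tokens are earned for the target behavior and later exchanged for a
# back-up reinforcer selected by the client.

class TokenEconomy:
    def __init__(self, backup_reinforcers):
        self.backup_reinforcers = backup_reinforcers   # item -> token price
        self.tokens = 0

    def earn_token(self):
        """Deliver one token right after the target behavior occurs."""
        self.tokens += 1

    def exchange(self, item):
        """Trade accumulated tokens for a back-up reinforcer, if affordable."""
        price = self.backup_reinforcers[item]
        if self.tokens >= price:
            self.tokens -= price
            return True
        return False

economy = TokenEconomy({"preferred toy": 5, "extra break": 3})
for _ in range(5):
    economy.earn_token()
print(economy.exchange("preferred toy"))   # True: five tokens exchanged for the toy
```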

Schedule thinning is how you move from a (an) __________ schedule of reinforcement to a (an) ________ schedule of reinforcement. A: fixed; variable B: continuous; intermittent C: intermittent; continuous D: variable; fixed

B: continuous; intermittent
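One way to picture this move from continuous to intermittent reinforcement is a ratio requirement that rises gradually. The step sizes and the mastery criterion below are assumptions for illustration only, not from the study set (Python):

```python
# Minimal schedule-thinning sketch (step sizes are illustrative).
# Start with continuous reinforcement (every response, i.e. FR 1) and
# gradually raise the response requirement toward an intermittent schedule.

requirement = 1                       # continuous reinforcement while teaching the response
thinning_steps = [2, 3, 5, 8, 12]     # gradual increases, never a sudden jump

for next_requirement in thinning_steps:
    # Assumption: you advance only after the learner meets a mastery
    # criterion at the current requirement.
    print(f"Stable at FR {requirement}; thinning to FR {next_requirement}")
    requirement = next_requirement
```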

The following is an example of reinforcement: A: A dog barks, the owner fills the bowl with food and barking is weakened. B: A dog barks, the owner fills the bowl with food and barking is strengthened. C: A dog barks, the owner fills the bowl with food and barking is eliminated. D: A dog barks, the owner fills the bowl with food and barking remains constant.

B: A dog barks, the owner fills the bowl with food and barking is strengthened.

The following is an example of a fixed ratio schedule of reinforcement: A: Every time Sara works for 30 minutes, she takes a break from work. B: Every time Sara types 30 words, she earns a break from work. C: If Sara types between 10-30 words, she earns a break from work. D: If Sara types an average of 30 words, she earns a break from work.

B: Every time Sara types 30 words, she earns a break from work.
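A minimal sketch of the fixed ratio rule in this card (Python; the class name and numbers are illustrative, not from the study set):

```python
# Minimal fixed-ratio (FR) schedule sketch (illustrative only).
# Rule: every Nth response is reinforced, e.g. FR 30 = every 30th response.

class FixedRatioSchedule:
    def __init__(self, ratio):
        self.ratio = ratio
        self.count = 0

    def record_response(self):
        """Return True if this response earns the reinforcer."""
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0              # reset the counter after reinforcement
            return True                 # e.g. Sara earns a break
        return False

fr30 = FixedRatioSchedule(ratio=30)
reinforced = [n for n in range(1, 91) if fr30.record_response()]
print(reinforced)   # [30, 60, 90] -- exactly every 30th word typed
```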

These factors influence the effectiveness of reinforcement: A: Creativity, immediacy, contingency, history B: Immediacy, contingency, history, magnitude, effort C: IQ, contingency, effort, magnitude, motivating operations D: How well the person understands age, history, creativity

B: Immediacy, contingency, history, magnitude, effort

When responses are not reinforced as often, we call this a (an): A: Continuous schedule of reinforcement B: Intermittent schedule of reinforcement C: No reinforcement D: Undesirable reinforcement

B: Intermittent schedule of reinforcement

The two types of reinforcement are: A: Good and bad reinforcement B: Addition and subtraction reinforcement C: Positive and negative reinforcement D: Positive and unpleasant reinforcement

C: Positive and negative reinforcement

A schedule of reinforcement specifies which occurrence of _____________ will be reinforced. A: reinforcement B: food C: behavior D: the continuum

C: behavior

The two types of reinforcers are _________________ . A: first and second B: old and new C: primary and secondary D: pleasant and unpleasant

C: primary and secondary

When a schedule of reinforcement is variable, ___________. A: the rule about reinforcement stays the same B: the rule about reinforcement never changes C: the rule about reinforcement changes based on an average number D: there are no rules about reinforcement

C: the rule about reinforcement changes based on an average number

Benefits of using a token economy include: A: It can increase the time between the target behavior and the delivery of back-up reinforcement. B: It can provide reinforcement without interruption to instruction. C: It can decrease the likelihood of reinforcer satiation. D: All of the above

D: All of the above

The following is an example of a variable ratio schedule of reinforcement: A: Every time Sara works for 30 minutes, she earns a break from work. B: Every time Sara types 30 words, she earns a break from work. C: If Sara works on average for 30 minutes, she earns a break from work. D: If Sara types an average of 30 words, she earns a break from work.

D: If Sara types an average of 30 words, she earns a break from work.
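By contrast with the fixed ratio sketch above, a variable ratio rule reinforces after an unpredictable number of responses that only averages out to the stated value. A minimal sketch follows (Python; the uniform spread is just one simple assumption, not from the study set):

```python
# Minimal variable-ratio (VR) schedule sketch (illustrative only).
# Rule: reinforcement follows an unpredictable number of responses whose
# average equals the stated ratio, e.g. VR 30 = on average every 30th response.
import random

class VariableRatioSchedule:
    def __init__(self, mean_ratio, seed=None):
        self.mean_ratio = mean_ratio
        self.rng = random.Random(seed)
        self.count = 0
        self._set_next_requirement()

    def _set_next_requirement(self):
        # Draw the next requirement around the mean; a uniform spread is one
        # simple choice for illustration.
        self.requirement = self.rng.randint(1, 2 * self.mean_ratio - 1)

    def record_response(self):
        """Return True if this response earns the reinforcer."""
        self.count += 1
        if self.count >= self.requirement:
            self.count = 0
            self._set_next_requirement()
            return True
        return False

vr30 = VariableRatioSchedule(mean_ratio=30, seed=1)
breaks = sum(vr30.record_response() for _ in range(3000))
print(breaks)   # roughly 100 breaks in 3000 responses, about one per 30 on average
```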

The following is an example of a variable interval schedule of reinforcement: A: For every 60 minutes that Jacob practices the violin, he earns $1 from his parents. B: For every 3 songs Jacob practices on his violin, he earns $1 from his parents. C: Jacob's parents place a dollar bill in his piggy bank when he practices his violin for an average of 8 songs. D: Jacob's parents place a dollar bill in his piggy bank when he plays his violin for an average of 60 minutes each day.

D: Jacob's parents place a dollar bill in his piggy bank when he plays his violin for an average of 60 minutes each day.
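The variable interval rule works the same way on the time axis: the required wait changes around an average. A minimal sketch (Python; the class name and the exponential draw are illustrative assumptions, not from the study set):

```python
# Minimal variable-interval (VI) schedule sketch (illustrative only).
# Rule: the first response after an unpredictable amount of time, averaging
# the stated interval (e.g. VI 60 minutes), earns reinforcement.
import random

class VariableIntervalSchedule:
    def __init__(self, mean_interval_seconds, seed=None):
        self.mean_interval = mean_interval_seconds
        self.rng = random.Random(seed)
        self.last_reinforcer_time = 0.0
        self._set_next_interval()

    def _set_next_interval(self):
        # Draw the next required wait around the mean.
        self.interval = self.rng.expovariate(1.0 / self.mean_interval)

    def record_response(self, now):
        """Return True if this response earns the reinforcer."""
        if now - self.last_reinforcer_time >= self.interval:
            self.last_reinforcer_time = now
            self._set_next_interval()
            return True                          # e.g. a dollar in the piggy bank
        return False
```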

The following is an example of reinforcement: A: Wanda makes chili, everyone hates it. B: Wanda makes chili, everyone likes it. C: Wanda makes chili and eats it. D: Wanda makes chili, everyone raves about it, Wanda makes chili again.

D: Wanda makes chili, everyone raves about it, Wanda makes chili again.

When a reinforcer is no longer provided after a response, we call this ___________________. A: reinforcement B: negative reinforcement C: pleasurable D: extinction

D: extinction

Reinforcement is a _______________ in which behavior is followed by some______________ that increases the likelihood that the behavior will occur again. A: consequence; process B: process; antecedent C: behavior; antecedent D: process; consequence

D: process; consequence

Primary reinforcers are important for ______________________ and do not require ________. A: negative reinforcement; teaching B: reinforcement; survival C: learning; teaching D: survival; prior experience

D: survival; prior experience

Preference is:

Different for all people

If you choose to use a consequence that is not desirable to the client, you will...

Not strengthen behavior

True or False: A back-up reinforcer is a primary or secondary reinforcer selected by the client.

True

True or False: Continuous reinforcement is used when teaching a new response.

True

True or False: Fixed ratio and variable ratio schedules of reinforcement are both based on number of responses.

True

True or False: Positive reinforcement and negative reinforcement have the same result. They both strengthen behavior.

True

True or False: To thin an interval schedule, gradually increase the duration of the time interval that must elapse before the reinforcer can be delivered.

True

