Reinforcement
The following is NOT an example of reinforcement
A dog barks when a stranger approaches.
The following is an example of reinforcement
A dog barks, the owner fills the bowl with food, and barking is strengthened.
Ratio schedules of reinforcement are based on
A specific number of responses
The following is an example of reinforcement
A toddler cries, dad gives her a toy, and crying is strengthened
The result of reinforcement is that behavior is ___. a: weakened b: strengthened c: reduced d: eliminated
B. Strengthened
positive reinforcement and negative reinforcement have the same result. They both strengthen bx.
True
operant behavior is controlled by
antecedents and consequences
During ___ reinforcement, behavior is followed by the ___ of a consequence. a: negative; addition b: positive; addition c: positive; subtraction d: positive; elimination
Positive; addition
Reinforcement is a ____ in which bx is followed by some ___ that increases the likelihood that the bx will occur again
Process, consequence
A schedule of reinforcement specifies which occurrences of ___ will be reinforced
Bx
Which is an example of negative reinforcement
Close a window to stop cold air from blowing in; closing the window is strengthened.
Schedule thinning is how you move from a (an) ___ schedule of reinforcement to a (an) ___ schedule of reinforcement. a: fixed; variable b: continuous; intermittent c: intermittent; continuous d: variable; fixed
Continuous; intermittent
The following is an example of a fixed ratio schedule of reinforcement
Every time Sara types 30 words, she earns a break from work
When a reinforcer is no longer provided after a response, we call this ___. a: reinforcement b: negative reinforcement c: pleasurable d: extinction
Extinction
A schedule of reinforcement is a rule specifying which occurrences of reinforcement will be scored
False
All are examples of secondary reinforcers except
Food
The following is an example of a fixed interval schedule of reinforcement
For every 60 minutes that Jacob practices the violin, he earns $1 from his parents
The following is an example of a variable ratio schedule of reinforcement
When Sara types an average of 30 words, she earns a break from work.
These factors influence the effectiveness of reinforcement: a: creativity, immediacy, contingency, history b: immediacy, contingency, history, magnitude, effort c: IQ, contingency, effort, magnitude, motivating operations d: how well the person understands, age, history, creativity
Immediacy, contingency, history, magnitude, effort
When responses are not reinforced as often, we call this a (an): a: continuous schedule of reinforcement b: intermittent schedule of reinforcement c: no reinforcement d: undesirable reinforcement
Intermittent schedule of reinforcement
continuous reinforcement is used when teaching a new response
True
Schedule thinning is important because: a: it matches reinforcement in the natural environment and makes the response resistant to extinction b: it is easier to use c: most people prefer to earn reinforcement less often d: research has proved it is more preferred by learners
It matches reinforcement in the natural environment and makes the response resistant to extinction
The following is an example of a variable interval schedule of reinforcement
Jacob's parents place a dollar bill in his piggy bank when he plays his violin for an average of 60 minutes per day
With extinction, __ responses are reinforced and with continuous reinforcement __ responses are reinforced. a: every; no b: no; every c: some; many d: many; some
No; every
Which is an example of positive reinforcement
Play a guitar and everyone cheers; playing a guitar is strengthened.
___ acquire reinforcing properties by being paired with __. a: primary reinforcers; secondary reinforcers b: secondary reinforcers; primary reinforcers c: negative reinforcers; primary reinforcers d: secondary reinforcers; positive reinforcers
Secondary reinforcers; primary reinforcers
intermittent reinforcement means that ___ responses are reinforced
Some
primary reinforcers are important for ____ and do not require ____.
Survival; prior experience
To thin an interval schedule, gradually increase the duration of the time interval that must elapse before the reinforcer can be delivered
True
fixed ratio and variable ratio schedules of reinforcement are both based on number of responses
True
When a schedule of reinforcement is variable, ___. a: the rule about reinforcement stays the same b: the rule about reinforcement never changes c: the rule about reinforcement changes based on an average number d: there are no rules about reinforcement
The rule about reinforcement changes based on an average number
When a schedule of reinforcement is fixed, ___. a: the rule about reinforcement always changes b: the rule about reinforcement never changes c: the rule about reinforcement sometimes changes d: there are no rules about reinforcement
The rule about reinforcement never changes
Interval schedules of reinforcement are based on
Time
All are examples of primary reinforcers except
Toys
The following is an example of reinforcement
Wanda makes chili, everyone raves about it, Wanda makes chili again
During __ reinforcement, behavior is followed by the __ of a consequence. a: negative; removal b: positive; removal c: positive; subtraction d: positive; elimination
Negative; removal
When thinning a ratio schedule, you should ______________ increase the number of responses required for reinforcement.
gradually
The two types of reinforcement
positive reinforcement and negative reinforcement
Two types of reinforcers
primary and secondary
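The deck's four intermittent schedules combine two distinctions: ratio schedules count responses while interval schedules track elapsed time, and fixed rules never change while variable rules change around an average. Those rules can be sketched as small predicate functions. This is an illustrative sketch only; every function and parameter name here is my own, not from the source material.

```python
import random

def fixed_ratio(n, response_count):
    """FR-n: reinforce every n-th response (e.g. FR-30: every 30 words typed)."""
    return response_count > 0 and response_count % n == 0

def variable_ratio(n, rng=random):
    """VR-n: each response has a 1/n chance, so reinforcement
    arrives after an average of n responses."""
    return rng.random() < 1.0 / n

def fixed_interval(t, elapsed):
    """FI-t: the first response after t time units have elapsed is reinforced."""
    return elapsed >= t

def variable_interval(current_interval, elapsed):
    """VI-t: like FI, but the required interval is re-sampled each time
    so that it only averages t."""
    return elapsed >= current_interval

# FR-30, as in the Sara example: the 30th response earns a break.
assert fixed_ratio(30, 30)
assert not fixed_ratio(30, 29)

# FI-60, as in the Jacob example: reinforcement is available
# once 60 minutes have elapsed.
assert fixed_interval(60, 60)
assert not fixed_interval(60, 45)
```

Note how only the variable schedules involve randomness: thinning a schedule (continuous → intermittent) corresponds to raising `n` or `t` gradually, as the thinning items above describe.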