B-05 Define & Provide Examples of Schedules of Reinforcement - Part 3
A child selling cookies door-to-door will most likely have selling reinforced on
a VR schedule.
Reinforcement provided for the first response after a constant length of time is
fixed interval (FI).
Reinforcement after a set number of responses is
fixed ratio (FR).
A salaried employee gets paid every 2 weeks by direct deposit. This reinforcement schedule is most like
fixed time (FT).
A planned probability that a behavior will be reinforced
is a schedule of reinforcement.
Continuous reinforcement means that reinforcement is
provided after each response.
Compound schedules of reinforcement include
sequences of simple schedules.
An intermittent schedule of reinforcement means that
some, but not all, responses are reinforced.
Reinforcement provided for the first response after a period of time, where the period of time varies around a specific average is
variable interval (VI).
Reinforcement after an average of a certain number of responses is
variable ratio (VR).
Simple schedules of reinforcement include
variable ratio.
You push the unlock button on your car key, and the door unlocks every time. What schedule is this?
CRF
Simple schedules of reinforcement include
(all of the above): fixed ratio, variable ratio, and fixed interval.
Compound schedules of reinforcement include
(all of the above): fixed interval schedules combined with extinction, sequences of simple schedules, and simultaneous schedules.
You are changing channels on your TV and find that if you hit the button too fast, the channel doesn't change; but if you slow down to the rate of once a second, the channels advance with every button press. What schedule is this?
FI