Cooper ch 13
Variable ratio schedule effects
Consistent, steady rates of responding; a quick (high) rate of response.
FR schedule effects
After reinforcement, a postreinforcement pause occurs. After the pause, the ratio requirement is completed with a high rate of response and very little hesitation between responses. The size of the ratio influences both the pause and the rate.
Interval schedule of reinforcement with a limited hold
Reinforcement remains available for a finite time following the elapse of the FI or VI interval. The target response must occur after the interval has elapsed and before the limited hold period ends.
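As an illustration of the limited-hold rule, here is a minimal sketch (not from the text); the function name, its parameters, and the 60 s / 10 s example values are all assumptions.

def reinforce_with_limited_hold(interval_end, limited_hold, response_time):
    """Return True if a response earns reinforcement under a limited hold.
    interval_end: time at which the FI/VI interval elapses
    limited_hold: how long reinforcement stays available after the interval
    response_time: when the target response occurred
    (Names and parameters are illustrative, not from the text.)"""
    # The response must come after the interval elapses...
    if response_time < interval_end:
        return False
    # ...and before the limited-hold window closes.
    return response_time <= interval_end + limited_hold


# Example: a 60 s interval with a 10 s limited hold
print(reinforce_with_limited_hold(60, 10, 65))   # True: response falls within the hold
print(reinforce_with_limited_hold(60, 10, 75))   # False: the hold has expired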
FR schedules of reinforcement
often produce high rates of response
Ratio strain
A behavioral effect associated with abrupt increases in ratio requirements when moving from denser to thinner reinforcement schedules; common effects include avoidance, aggression, and unpredictable pauses or cessation in responding.
fixed ratio
A fixed number of responses must be emitted before reinforcement occurs
Maintenance of behavior
A lasting behavior change
fixed interval scallop effect
A low rate of response occurs early in the interval; responding accelerates as the time at which reinforcement becomes available gets closer.
why does FI produce a pause and scallop effect
Because (a) the learner comes to discriminate the elapse of time, and (b) responses emitted right after a reinforced response are never reinforced.
Fixed interval schedule effects
FI schedules generate slow to moderate rates of response with a pause in responding following reinforcement. Responding begins to accelerate toward the end of the interval.
Variable Ratio
The schedule most resistant to extinction. A schedule in which reinforcement is delivered after a varied (unpredictable) number of responses.
Variable interval schedules
In operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals
fixed interval schedule of reinforcement
In operant conditioning, a reinforcement schedule that reinforces the first response emitted after a specified time has elapsed.
Schedule effects of the VR
Ratio requirements are completed with a very high rate of response and little hesitation between responses. Postreinforcement pauses are not characteristic of the VR schedule. Rate of response is influenced by the size of the ratio requirements.
continuous reinforcement
Reinforcing the desired response every time it occurs. (establish and strengthen behavior)
VI schedule effects
A slow to moderate response rate that is constant and stable, with few or no postreinforcement pauses.
postreinforcement pause
The absence of responding for a period of time following reinforcement; an effect commonly produced by fixed interval (FI) and fixed ratio (FR) schedules of reinforcement.
differential reinforcement of low rates
Uses positive reinforcement to lower the rate of a behavior: reinforcement is delivered only when responding occurs at or below a specified rate (for example, when responses are separated by at least a minimum amount of time).
schedule thinning
Changing a contingency of reinforcement by gradually increasing the response ratio or the duration of the time interval.
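A rough sketch of what thinning an FR schedule could look like in a simulation; the starting ratio, step size, and ceiling are arbitrary assumptions. Small increments are used because abrupt increases risk ratio strain (see above).

def thin_fixed_ratio(start=1, step=1, ceiling=10):
    """Yield a gradually increasing FR requirement, e.g. FR 1 -> FR 2 -> ... -> FR 10.
    All values here are illustrative assumptions, not prescribed criteria."""
    ratio = start
    while ratio <= ceiling:
        yield ratio
        ratio += step   # small steps help avoid ratio strain


for requirement in thin_fixed_ratio():
    print(f"FR {requirement}: deliver reinforcement after {requirement} response(s)")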
intermittent schedules of reinforcement
contingency of reinforcement in which some, but not all, occurrences of the behavior produce reinforcement (Maintain behavior)
Schedules of reinforcement
The rule that determines when and how often reinforcers will be delivered. Four basic types: fixed ratio, variable ratio, fixed interval, and variable interval. Interval schedules are based on the passage of time; ratio schedules are based on the number of responses. Intermittent (partial) reinforcement delivers reinforcers for only some responses, whereas continuous reinforcement delivers one for every response; variable schedules produce responding that is more resistant to extinction.
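Because the four basic schedules are just rules for when a response produces a reinforcer, a small simulation can make the distinction concrete. The sketch below is illustrative only (it is not from Cooper): the class names, the randomization used to vary the VR and VI requirements, and the time units are all assumptions.

import random


class RatioSchedule:
    """Reinforce after a number of responses: fixed (FR) or varying around a mean (VR)."""

    def __init__(self, ratio, variable=False):
        self.ratio, self.variable = ratio, variable
        self.required = self._next_requirement()
        self.count = 0

    def _next_requirement(self):
        # VR: the requirement varies around the mean ratio; FR: always the same number.
        return random.randint(1, 2 * self.ratio - 1) if self.variable else self.ratio

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count, self.required = 0, self._next_requirement()
            return True   # reinforcer delivered
        return False


class IntervalSchedule:
    """Reinforce the first response after a fixed (FI) or varying (VI) amount of time."""

    def __init__(self, interval, variable=False):
        self.interval, self.variable = interval, variable
        self.available_at = self._next_interval()   # when reinforcement first becomes available

    def _next_interval(self):
        return random.uniform(0, 2 * self.interval) if self.variable else self.interval

    def respond(self, now):
        if now >= self.available_at:
            self.available_at = now + self._next_interval()   # timer restarts at reinforcement
            return True   # reinforcer delivered
        return False


fr5 = RatioSchedule(5)                     # FR 5: every 5th response is reinforced
vr5 = RatioSchedule(5, variable=True)      # VR 5: on average every 5th response
fi60 = IntervalSchedule(60)                # FI 60 s: first response after 60 s
print([fr5.respond() for _ in range(10)])  # True on responses 5 and 10
print(fi60.respond(30), fi60.respond(61))  # False (too early), then True

Note how the interval classes simply do not reinforce responses made before the interval elapses; that extinction of early responses is what produces the FI pause and scallop described above.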
Ratio schedule of reinforcement
A partial reinforcement schedule in which the organism is reinforced based upon the number of instances of the desired behavior. There can be fixed ratio schedules or variable ratio schedules.
variable schedule of reinforcement
A partial schedule of reinforcement in which the reinforcer is delivered after either varying numbers of responses (a variable ratio schedule), or varying lengths of time (a variable interval schedule).
Interval schedule of reinforcement
A pattern of reinforcement in which a reinforcer is received if the desired response occurs (at least once) after a specified length of time has elapsed since the last reinforcement.
fixed schedule of reinforcement
A schedule in which reinforcement occurs after a consistent, or fixed, number of responses (fixed ratio), or after a consistent amount of time (fixed interval).
differential reinforcement of high rates
Reinforcing only those responses within a response class that meet a specific criterion along some dimension(s) (i.e., frequency, topography, duration, latency, or magnitude) and placing all other responses in the class on extinction; for DRH, the criterion is a high rate of responding.
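The same criterion idea can be sketched for rate-based differential reinforcement; the 5-responses-per-60-seconds criterion below is an arbitrary assumption used only to contrast DRH (reinforce at or above the criterion) with DRL (reinforce at or below it).

def meets_rate_criterion(response_times, window=60, criterion=5, high=True):
    """Count responses inside the observation window and compare to the criterion.
    DRH reinforces at or above the criterion; DRL reinforces at or below it.
    (Window and criterion values are illustrative assumptions.)"""
    count = len([t for t in response_times if t <= window])
    return count >= criterion if high else count <= criterion


times = [5, 12, 20, 33, 47, 58]                  # six responses in the first 60 s
print(meets_rate_criterion(times, high=True))    # DRH 5/60 s: criterion met -> reinforce
print(meets_rate_criterion(times, high=False))   # DRL 5/60 s: too many responses -> withhold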
Tic-tac-toe VR procedure
1. Establish a maximum number for the individual or group. The larger the maximum number selected, the greater the odds against meeting the contingency: 1 chance out of 100 is less likely than 1 chance out of 20.
2. The teacher gives the individual or the group a tic-tac-toe grid.
3. Students fill in each square of the grid with a number no greater than the maximum number.
4. The teacher fills a container with numbered slips of paper (numbers no higher than the maximum number); each number should be included several times.
5. Contingent on the occurrence of the target behavior, students withdraw one slip of paper from the container. If the number on the slip corresponds with one of the numbers on the grid, the student marks out that number on the grid.
6. The reinforcer is delivered when students have marked out three numbers in a row.
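The procedure above can also be expressed as a short simulation; everything below is a sketch under assumptions (a 3 x 3 grid, a maximum number of 20, three copies of each slip, and randomly simulated student choices), not part of the original procedure.

import random

MAX_NUMBER = 20     # step 1: maximum number established for the group (assumed value)
GRID_SIZE = 3       # steps 2-3: a standard 3 x 3 tic-tac-toe grid

# Step 3: students fill each square with a number no greater than the maximum
# (students' choices are simulated randomly here).
grid = [[random.randint(1, MAX_NUMBER) for _ in range(GRID_SIZE)]
        for _ in range(GRID_SIZE)]
marked = [[False] * GRID_SIZE for _ in range(GRID_SIZE)]

# Step 4: the container holds slips numbered up to the maximum, each included several times.
slips = [n for n in range(1, MAX_NUMBER + 1) for _ in range(3)]
random.shuffle(slips)


def draw_slip():
    """Step 5: contingent on the target behavior, draw one slip and mark matching squares."""
    slip = slips.pop()
    for r in range(GRID_SIZE):
        for c in range(GRID_SIZE):
            if grid[r][c] == slip:
                marked[r][c] = True
    return slip


def three_in_a_row():
    """Step 6: the reinforcer is delivered once three marked squares form a line."""
    lines = [[(r, c) for c in range(3)] for r in range(3)]                 # rows
    lines += [[(r, c) for r in range(3)] for c in range(3)]                # columns
    lines += [[(i, i) for i in range(3)], [(i, 2 - i) for i in range(3)]]  # diagonals
    return any(all(marked[r][c] for r, c in line) for line in lines)


# Each occurrence of the target behavior earns one draw until three in a row are marked.
while slips and not three_in_a_row():
    draw_slip()
print("Deliver the reinforcer!" if three_in_a_row() else "Contingency not yet met.")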