Schedules of Reinforcement, Contingency Contracting, Token Economies, and Self-Management

Variable Ratio Schedule of Reinforcement

- The response ratio or time requirement can change from one reinforcement to the next.
- Variable Ratio 4 (VR 4): on average, every 4th correct response is reinforced.
- Variable Interval 2 min (VI 2): the first correct response after an average elapsed time of 2 minutes is reinforced.
- Example: slot machines operate on a variable ratio schedule of reinforcement. Many behaviors we exhibit daily are on variable schedules of reinforcement (ex. praise from the boss for a job well done at work, promotions, raising your hand in class...).
- Use in applied settings: often implemented without a planned and systematic approach (the reinforcer is delivered by chance, hit or miss), but this is not an effective use of VR schedules.
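
A minimal Python sketch of the VR logic described above (illustrative only; the class name and the uniform draw used to vary the requirement are assumptions, not a prescribed procedure): the next ratio requirement is drawn at random so that it averages out to the schedule value, and reinforcement is delivered whenever the current requirement is met.

```python
import random

class VariableRatioSchedule:
    """Minimal VR sketch: reinforce after a varying number of correct
    responses whose long-run average equals the schedule value (e.g., VR 4)."""

    def __init__(self, mean_ratio):
        self.mean_ratio = mean_ratio
        self._set_next_requirement()

    def _set_next_requirement(self):
        # Draw the next requirement uniformly from 1..(2*mean - 1) so the
        # average requirement equals mean_ratio (an illustrative choice).
        self.requirement = random.randint(1, 2 * self.mean_ratio - 1)
        self.count = 0

    def record_response(self):
        """Return True if this correct response produces reinforcement."""
        self.count += 1
        if self.count >= self.requirement:
            self._set_next_requirement()
            return True
        return False

# Simulate 20 correct responses on a VR 4 schedule.
vr4 = VariableRatioSchedule(mean_ratio=4)
reinforced = [i for i in range(1, 21) if vr4.record_response()]
print("Responses that produced reinforcement:", reinforced)
```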

Progressive Schedules of Reinforcement

- Systematically thins each successive reinforcement opportunity independent of the participant's behavior.
- Two types: 1) Progressive Ratio (PR) schedules of reinforcement and 2) Progressive Interval (PI) schedules of reinforcement.
- Progressive schedules of reinforcement provide an assessment procedure for identifying reinforcers that will maintain treatment effects across increasing schedule requirements.
- During the session, progressive schedules are typically thinned to the "breaking point," the requirement at which the participant stops responding. Comparing the breaking points and the corresponding number of responses associated with each reinforcer can identify relative reinforcement effects.
- This is important because, in an applied setting, we want to identify a schedule of reinforcement that will maintain the desired behavior while also thinning the schedule as much as possible so that it resembles what happens in the real world.
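
A minimal sketch of how a progressive ratio assessment can be summarized (the step size, the quitting rule, and the reinforcer names are hypothetical): the ratio requirement grows after each reinforcer, and the breaking point is the last requirement completed before responding stops.

```python
def progressive_ratio_breakpoint(step, max_run_tolerated):
    """PR sketch: the ratio requirement increases by `step` after each
    reinforcer; responding is assumed to stop once a single ratio run
    would exceed `max_run_tolerated` responses (a stand-in for how much
    behavior the reinforcer can sustain)."""
    requirement = step
    reinforcers_earned = 0
    breaking_point = 0
    while requirement <= max_run_tolerated:
        reinforcers_earned += 1
        breaking_point = requirement   # last requirement actually completed
        requirement += step            # thin the schedule for the next run
    return breaking_point, reinforcers_earned

# Compare two hypothetical reinforcers by their breaking points.
for name, tolerance in [("preferred snack", 40), ("sticker", 12)]:
    bp, earned = progressive_ratio_breakpoint(step=4, max_run_tolerated=tolerance)
    print(f"{name}: breaking point at PR {bp}, reinforcers earned: {earned}")
```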

Explanation of FI Graph

- The learner begins to discriminate the elapse of time: responses emitted right after a reinforced response are never reinforced. Extinction during the early part of the interval might account for the postreinforcement pause.
- Both fixed interval and fixed ratio schedules produce this postreinforcement pause. However, responses under an FR schedule are emitted at a consistent rate until the ratio requirement is completed, whereas responses under an FI schedule begin at a slow rate and accelerate toward the end of the interval.

Fixed Schedule

- The response ratio or the time requirement remains constant - Fixed Ratio 4 (FR 4): reinforcement is delivered after every 4th correct response - Fixed Interval 2 min (FI 2): reinforcement is delivered for the first correct response after 2 minutes have elapsed
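
For comparison with the VR sketch above, here is a minimal Python sketch of the two fixed schedules (the class names and the use of seconds rather than minutes are assumptions for the example):

```python
import time

class FixedRatio:
    """FR n: every nth correct response produces reinforcement."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def record_response(self):
        self.count += 1
        if self.count == self.n:
            self.count = 0            # ratio met; start counting the next run
            return True
        return False

class FixedInterval:
    """FI t: the first correct response after t seconds have elapsed
    (since the last reinforcer) produces reinforcement."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.interval_start = time.monotonic()

    def record_response(self):
        if time.monotonic() - self.interval_start >= self.seconds:
            self.interval_start = time.monotonic()   # reinforced; next interval begins
            return True
        return False    # responses before the interval elapses go unreinforced

fr4 = FixedRatio(4)          # FR 4: every 4th correct response
fi2 = FixedInterval(120)     # FI 2 min, expressed in seconds
```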

Compound Schedules: (2) Discriminative Schedules of Reinforcement: Multiple Schedules of Reinforcement

- There is one class of behavior (example: hand raising in class), a discriminative stimulus for each contingency in effect (Teacher A and Teacher B), and different conditions for reinforcement (Teacher A reinforces with a smile and "Nice job" when a hand is raised and a correct answer is given; Teacher B challenges the answer).
- In short: one class of behavior (ex. hand raising, saying hi), a cue for the contingency in effect, and a different schedule of reinforcement associated with each cue.
- The SD is present as long as its schedule is in effect.

Limited Hold

- Can be added to an interval schedule.
- Reinforcement remains available for a small period of time following the elapse of the FI or VI interval.
- The participant loses the opportunity to receive reinforcement if the target response does not occur within the time limit.
- Ex. an FI 3-min schedule with a 30-sec limited hold: the first correct response following the elapse of 3 minutes is reinforced, but only if it occurs within 30 seconds after the end of the 3-min interval.
- Limited holds typically do not change the overall response characteristics of FI and VI schedules, beyond a possible increased rate of response.
- Example: reinforcing in-seat behavior on a VI schedule with a limited hold means there is only a small window after the interval elapses in which the student can receive the reinforcement, which encourages prompt responding. A limited hold also restricts how long a person can access the reinforcer (ex. store promotional ads: you can get a free bagel after you buy 10 bagels, but this free bagel is only available until 12/31/30).
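
A minimal extension of the FI sketch above that adds a limited hold (values in seconds; the handling of a missed window is simplified for the sketch - the next interval is treated as starting once the miss is detected):

```python
import time

class FixedIntervalLimitedHold:
    """FI schedule with a limited hold (LH): once the interval elapses,
    reinforcement stays available only for `hold` seconds."""

    def __init__(self, interval, hold):
        self.interval = interval
        self.hold = hold
        self.interval_start = time.monotonic()

    def record_response(self):
        elapsed = time.monotonic() - self.interval_start
        if self.interval <= elapsed <= self.interval + self.hold:
            self.interval_start = time.monotonic()   # reinforced within the hold window
            return True
        if elapsed > self.interval + self.hold:
            # Window missed: that opportunity is lost; restart the interval
            # (a simplification for the sketch).
            self.interval_start = time.monotonic()
        return False

# FI 3 min with a 30 s limited hold, as in the example above.
schedule = FixedIntervalLimitedHold(interval=180, hold=30)
```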

Fixed Ratio and Schedule Effects

- Consistency of performance: produces a typical pattern of responding; a postreinforcement pause follows reinforcement.
  * The size of the ratio influences the duration of the postreinforcement pause.
  * After the first response of the ratio requirement, the participant completes the required responses with little hesitation between responses; then, following delivery of reinforcement, there is a postreinforcement pause (the participant does not respond for a period of time following the reinforcement).
- High rates of response:
  * The larger the ratio requirement, the higher the rate of response.
  * Example: if the ratio requirement is 50 jumping jacks before you can have a Powerade, you may do them faster than if you only had to do 25 to get the Powerade.

Fixed Interval (FI)

- The first correct response following a fixed duration of time is reinforced.
- Elapse of time alone is not sufficient for reinforcer delivery.
  * Ex: FI 4 min - the first response emitted after the 4-minute mark is reinforced.
  * The reinforcer becomes available once the fixed time interval has elapsed, and it remains available until the first response.
  * This schedule is rarely found operating in the real world; we don't encounter many fixed interval schedules. Mail delivery example (mail delivered at noon each day; only trips out to the mailbox after noon are reinforced - in Cooper).
  * In applied settings it is easy to set up this schedule, and using a timer is helpful. Example: targeting conversational exchanges, the first exchange made after 2 minutes is reinforced.

Variable Interval (VI) Schedule of Reinforcement

- The first correct response following the elapse of variable durations of time is reinforced - The schedule value specifies the "average" amount of time between reinforcement opportunities - The intervals between reinforcement deliveries vary at random.

Intermittent Schedules of Reinforcement (INT)

- Intermittent schedules of reinforcement fall in between continuous reinforcement and extinction. - Used to strengthen established behaviors. - Assists in progression to naturally occurring reinforcement

Variable Interval (VI) Graph Explained

- It is unpredictable when reinforcement will be delivered. Pop quizzes are a good example: because when they will happen is unpredictable, they lead to a steady rate of studying throughout the course.
- Example: Isaac's appropriate mealtime behavior (using utensils).
- Variable interval schedules produce low to moderate rates of responding. The length of the interval may affect responding: the longer the interval, the lower the overall rate of response.
- This schedule can also be used when providing noncontingent reinforcement.

Maintenance of Behavior

- Maintenance: use continuous reinforcement to teach and then fade to intermittent reinforcement. Don't cut positive reinforcement cold turkey!
- The goal is to eventually have the behavior exhibited and reinforced by naturally occurring activities or stimuli.
- Ex. exercising because it is enjoyable and you like it, rather than to earn a freeze pop when you are finished. Intermittent reinforcement is usually necessary for the progression to naturally occurring reinforcement.

Premack Principle

- Making the opportunity to engage in a behavior that occurs at a relatively high free operant rate contingent on the occurrence of low frequency behavior will function as reinforcement for the low frequency behavior - Informally known as "Grandma's law"

Compound Schedules: (1) Concurrent Schedule of Reinforcement

- Occurs when: (a) two or more contingencies of reinforcement (b) operate independently and simultaneously (c) for two or more behaviors.
- Advantage: often gives the individual a choice.
- Uses:
  1) Reinforcer assessment: two choices presented, two contingencies of reinforcement, two behaviors.
  2) Behavioral/academic intervention and teaching: arranging two or more reinforcers for the participant to choose from, contingent upon the occurrence of a target behavior.
  * Example: you want to increase social interaction at lunch or dinner time. You provide one reinforcer for eating the meal seated alone at a desk and a different reinforcer for sitting with peers at the dinner table. By changing the magnitude or quality of the reinforcer for sitting at the table, you should see the student allocate more of their eating time to the table with their peers rather than eating alone.
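
A rough structural sketch of a concurrent arrangement (the behavior names are taken from the example above, but the use of two different FR values to stand in for a richer versus leaner contingency is an illustrative assumption; the card's example varies reinforcer quality/magnitude instead): two contingencies are in effect at the same time, and only the behavior the learner chooses advances toward its reinforcer.

```python
# Two independent contingencies in effect simultaneously for two behaviors.
schedules = {
    "eat alone at desk": {"ratio": 5, "count": 0},   # leaner contingency
    "eat at peer table": {"ratio": 2, "count": 0},   # richer contingency
}

def record_choice(behavior):
    """Record one occurrence of the chosen behavior; both contingencies
    stay in effect, but only the chosen one is contacted."""
    s = schedules[behavior]
    s["count"] += 1
    if s["count"] >= s["ratio"]:
        s["count"] = 0
        return f"reinforcer delivered for '{behavior}'"
    return "no reinforcer yet"

# The richer contingency for sitting with peers should capture more responding.
for _ in range(6):
    print(record_choice("eat at peer table"))
```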

Basic Schedules of Reinforcement (Ratio and Interval)

- Ratio schedules require a number of responses before one response produces reinforcement - Interval schedules require an elapse of time before a response produces reinforcement. * Reinforcement is contingent only on the occurrence of one response after the required time has elapsed. Ex: you are reinforcing correct answers on a 3-min fixed interval schedule; 3 minutes go by and the next correct answer is reinforced. You do not provide reinforcement at the 2-min mark, but at the first correct answer after 3 minutes have elapsed.

Continuous Reinforcement (CRF)

- Reinforcement for every occurrence of a behavior - Some behaviors naturally produce this schedule - Advantageous for skill acquisition

Thinning Intermittent Reinforcement

- Two methods commonly used: 1) gradually increasing the response ratio or the duration of the time interval, and 2) providing instructions such as rules, directions, and signs to communicate the schedule of reinforcement (Example: FI 3-min LH 30-sec).
  * Changes to the schedule should be small while thinning.
  * Instructions may enhance the effectiveness of interventions when participants are told what performances produce reinforcement.
- RATIO STRAIN: a result of abrupt increases in ratio requirements, i.e., moving from denser to thinner reinforcement schedules.
  * Behavioral characteristics include avoidance, aggression, and unpredictable pauses in responding.
  * When you see these behavior changes and suspect ratio strain, reduce the ratio requirement, then gradually thin the ratio requirement again after recovering the behavior.
  * Ratio strain will also occur when the ratio becomes so large that the reinforcement cannot maintain the response level (ex. shoveling a long driveway) or the response requirement exceeds the participant's physiological capabilities.
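
A minimal decision-rule sketch of thinning with a check for ratio strain (the step size and the session-by-session logic are assumptions, not a prescribed protocol):

```python
def next_requirement(current_ratio, responding_stable, step=1):
    """Raise the ratio requirement in small steps only while responding
    stays stable; if ratio strain is suspected (pausing, avoidance,
    aggression), back off and recover the behavior before thinning again."""
    if responding_stable:
        return current_ratio + step        # small increase, e.g., FR 3 -> FR 4
    return max(1, current_ratio - step)    # suspected ratio strain: reduce

ratio = 3
for stable in [True, True, False, True]:   # hypothetical session outcomes
    ratio = next_requirement(ratio, stable)
    print("Next session requirement: FR", ratio)
```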

3 Types of Compound Schedules of Reinforcement

1) Concurrent 2) Discriminative 3) Nondiscriminative

Concurrent Performances: Formalizing the Matching Law

1) Concurrent ratio schedules = responses are allocated primarily to the ratio that produces the higher rate of reinforcement * With conc VR VR or conc FR FR arrangements, participants are sensitive to the ratio schedules and tend to maximize reinforcement by responding primarily on the ratio that produces the higher rate of reinforcement 2) Concurrent interval schedules = responses are not exclusive to the richer schedule * Participants distribute their responding across the two schedules to match or approximate the proportion of reinforcement that is actually obtained on each independent schedule.

Variable Interval (VI) Schedule Effects

1) Consistency of Performance 1A) Constant, stable rate of response 1B) Typically produces few hesitations between responses 2) Rate of responding 2A) Low to moderate rate of response 2B) The larger the average interval, the lower the overall rate of response

Fixed Interval (FI) Schedule Effects

1) Consistency of Performance 1A) Produces a postreinforcement pause in responding during the early part of the interval. 1B) An accelerated rate of response is seen toward the end of the interval; this pattern is called the FI scallop. 2) Rate of Response 2A) Slow to moderate rate of response 2B) The larger the fixed interval requirement, the longer the postreinforcement pause and the lower the overall rate of response. Example: study habits - college students studying for an exam or preparing a final project get the date a few months in advance, but the prep and studying do not begin at the start of the course; they begin a few weeks before the due date, and the studying and prep responses accelerate as the exam date gets closer.

Schedules of Reinforcement

1) Continuous Reinforcement 2) Intermittent Reinforcement 3) Extinction

4 types of basic reinforcement schedule

1) Fixed Ratio 2) Variable Ratio 3) Fixed Interval 4) Variable Interval

Variable Ratio and Schedule Effects

1) Schedule Effects 1A) Consistency of Performance a) Produce consistent, steady rates of response b) Do not produce a postreinforcement pause 2) Rate of Response 2A) Quick rate of response 2B) The larger the ratio requirement, the quicker the rate of response

Compound Schedules of Reinforcement

Combined elements of: 1) Continuous reinforcement (CRF) 2) Intermittent schedules of reinforcement (FR, VR, FI, VI) 3) Differential reinforcement of various rates of responding (DRH, DRL) 4) Extinction (EXT)

Compound Schedules of Reinforcement: Matching Law

Matching Law = Rate of responding typically is proportional to the rate of reinforcement received from each choice alternative
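
In its classic two-alternative form (Herrnstein's equation), the matching law can be written as follows, where B stands for the responses emitted on an alternative and R for the reinforcers obtained from it:

```latex
% Proportion of responses on alternative 1 matches the proportion of
% reinforcers obtained on alternative 1.
\[
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
\]
% B_1, B_2 = responses emitted on each alternative;
% R_1, R_2 = reinforcers obtained from each alternative.
```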

Variations of Intermittent Schedules

Reinforcing differential rates of responding: 1) Differential reinforcement of high rates (DRH) 2) Differential reinforcement of low rates (DRL)

