Schedules of Reinforcement


Concurrent schedules of reinforcement

All of the different schedules of reinforcement that are in effect for a person's behavior at the same time

A

The term "limited hold" is best described as: A) a limit on the amount of time the individual has to respond after a reinforcer becomes available B) a limit on the number of behaviors a person must exhibit to access reinforcers C) a limit on the amount of time before reinforcers become available D) none of the above

Continuous Reinforcement

an arrangement in which every correct response (behavior) is reinforced

C

Caryn praises her daughter every fifth time she brings home an A on an assignment. What schedule of reinforcement is this? A) fixed interval B) variable interval C) fixed ratio D) variable ratio

B

A student in a special education classroom receives a reinforcer (token) after every five correct responses on her math worksheet. This is an example of what schedule of reinforcement? A) continuous reinforcement B) fixed ratio C) variable ratio D) fixed interval

Interval Schedule with Limited Hold

Good Behavior Game: if the timer goes off while the kids are engaging in cooperative behavior, they can earn a reinforcer (5 minutes of TV); if the timer goes off while the kids are fighting, they lose 5 minutes. The limited hold here is 0 seconds, meaning the behavior must be occurring right when the timer goes off.

Concurrent Factors

Immediacy of reinforcement, magnitude of reinforcement, the types of schedules that are operating, and response effort

B

In a ____________ reinforcement schedule, the reinforcer is delivered for the first response following an average of X amount of time. A) fixed interval B) variable interval C) fixed ratio D) variable ratio

C

In which schedule of reinforcement does a reinforcer get delivered when a response occurs after a period of time? A) Ratio Schedule B) Continuous Reinforcement Schedule C) Interval schedule D) Discontinuous

C

Nicole's teacher praises her for every math problem that she completes. This is an example of: A) intermittent B) fixed C) continuous D) interval

Limited Hold

a finite amount of time after a reinforcer becomes available during which a response will produce it (it creates a deadline). EX: waiting for a bus. You know the bus comes at relatively fixed intervals (every 12 minutes), but the reinforcer (getting on the bus) is only available for a limited amount of time; if you are not there before the hold expires, you do not get the reinforcer. Useful when you want to produce ratio-like behavior but don't have time to count behaviors.

Matching Law

the response rate on each alternative in a concurrent schedule is proportional to the rate of reinforcement for that activity relative to the rates of reinforcement for the other concurrent activities. In other words: which activity is going to provide the most reinforcement?
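
For reference, the standard two-alternative form of this relationship (the card above states it only in words) can be written as:

\[ \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2} \]

where \(B_1\) and \(B_2\) are the response rates on the two concurrent schedules and \(R_1\) and \(R_2\) are the rates of reinforcement obtained from them. Using hypothetical numbers with the study-versus-TV example from the Concurrent Schedules card: if studying produces reinforcement 3 times per hour and TV produces it once per hour, the matching law predicts about \(3/(3+1) = 75\%\) of responding will be allocated to studying.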

Schedule of Reinforcement

rule specifying which occurrences of a given behavior, if any, will be reinforced

Fixed Ratio

Delivery of reinforcement is based on the number of responses emitted, and that number is the same every single time: reinforcement occurs after a specific number of responses are emitted. EX: If Jan gets 2 math problems correct, she gets reinforcement; you would then gradually increase the schedule.

Problems with Simple Interval Schedules

Rarely used in behavior modification. They produce long post-reinforcement pauses, generate lower rates of responding, and require you to continuously monitor the behavior after the interval has expired.

Intermittent Reinforcement

an arrangement in which a behavior is reinforced only occasionally rather than every time it occurs. ex: raising a hand

Fixed Interval

provides reinforcement for the first correct response made after a specified time interval; an amount of time must elapse before reinforcement becomes available. EX: praise a child every five minutes for working on their math problems. Both conditions are required: the interval must elapse and the person has to emit the behavior. People learn to pause after each reinforcement and begin to respond again only as the end of the interval approaches, so there is a gradual increase in the rate of responding, with responding occurring at a high rate just before reinforcement becomes available and no responding for some time after the reinforcement. This tends to produce a scallop-shaped response pattern. There is a post-reinforcement pause, which is longer when the interval is larger. EX: watching TV and flipping to another channel during the commercials; your responding increases in frequency as you get closer to the reinforcement time (the show coming back on), assuming the commercials are the same length.

B

A telemarketer has to make a certain number of calls before a sale is made. However, the telemarketer does not know the exact number that will be required in order to make a sale. This is an example of a _________ schedule of reinforcement. A) fixed ratio B) variable ratio C) fixed interval D) variable interval

Fixed Ratio

After a response is reinforced, no responding occurs for a period of time; responding then occurs at a high, steady rate until the next reinforcer is delivered. Produces rapid, steady response rates as long as the number of responses required is not too large. There is a post-reinforcement pause after the reinforcer is delivered; the higher the ratio value, the longer the pause. The schedule is resistant to extinction, but you might experience ratio strain (a deterioration of responding, typically when the requirement is increased too quickly). ex: a pigeon pecks at a disk 4 times and gets a food pellet

A

All the reinforcement schedules that are in effect for a person's behavior at one time are referred to as __________ schedules of reinforcement A) concurrent B) overlapping C) simultaneous D) mixed

Interval Schedules

Fixed Interval and Variable Interval. The reinforcer is delivered for the first response after a period of time has passed.

Ratio Schedules

Fixed Ratio and Variable Ratio. The reinforcer is delivered after a certain number of responses have been emitted.

Concurrent Schedules

In most situations, we have the option of behaving in a variety of ways, and each of those behaviors is likely reinforced on a different schedule of reinforcement. EX: At night, I have the option to 1) study for my classes, 2) watch TV, 3) mess around on the Internet, or 4) go to bed early.

Variable Interval

Reinforcement is available after a variable amount of time has elapsed; the length of the interval varies around an average. Responses on this schedule tend to occur at a moderate but steady rate, with little to no post-reinforcement pause. EX: If you call a friend and the line is busy, your redialing is likely on this type of schedule. You don't know when your friend will be off the phone (when reinforcement becomes available to you), so you call again every few minutes.

Variable Ratio

Reinforcement is provided after a variable number of correct responses; the number varies around an average. So the reinforcement may come after an average of 10 responses, but on any given occasion may come after 1, 3, 19, or 17. Produces a high rate of responding with almost no post-reinforcement pausing. Ex: gambling on slot machines; asking someone out. The schedule is very resistant to extinction but tricky to implement.
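
As a quick arithmetic check on the card's example, the listed ratio requirements really do average out to a VR 10 schedule:

\[ \frac{1 + 3 + 19 + 17}{4} = \frac{40}{4} = 10 \]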

Why Are Intermittent Schedules Better?

The reinforcer remains effective longer because satiation takes place more slowly; behaviors tend to take longer to extinguish; individuals work more consistently; and behavior is more persistent in the maintenance phase after reinforcers have been faded.

B

Tom's supervisor tells him that for every 10 cars Tom sells he will get a $500 bonus. This is an example of a _________ reinforcement schedule. A) fixed interval B) fixed ratio C) variable interval D) variable ratio

C

Which schedule of reinforcement produces the highest rate of responding: A) fixed ratio B) fixed interval C) variable ratio D) variable interval

