ABA Exam 2


What does the rat learn to do when the tone is presented?

After a number of instances in which the tone is presented just before the shock, the rat starts to jump to the other side of the chamber as soon as it hears the tone. The tone is the warning stimulus; the rat avoids the shock by jumping to the other side as soon as the warning stimulus is presented.

Concurrent Schedules of Reinforcement

All of the schedules of reinforcement that are in effect for a person's behaviors at one time. Concurrent schedules of reinforcement (and punishment) for the different response options at a particular time influence the probability that a particular behavior will occur at that time.

Positive Punishment (EO and AO)

Any event or condition that enhances the aversiveness of a stimulus event makes that event a more effective punisher (EO), whereas events that minimize the aversiveness of a stimulus event make it less effective as a punisher (AO). For example, some drugs (e.g., morphine) minimize the effectiveness of a painful stimulus as a punisher. Other drugs (e.g., alcohol) may reduce the effectiveness of social stimuli (e.g., peer disapproval) as punishers.

Operant Behavior

Behavior that is strengthened through the process of reinforcement - behavior that is controlled by its consequences. An operant behavior acts on the environment to produce a consequence and, in turn, is controlled by, or occurs again in the future as a result of, its immediate consequence.

Extinction Influences Ex.

if you put money into a vending machine and push the button, you always get the item you want. This is a case of continuous reinforcement, and the decrease in behavior during extinction would be fairly rapid. You would not continue to put money into a vending machine if you no longer got the item you paid for; the lack of reinforcement would be immediately apparent. Contrast this with what happens when you put money into a slot machine or a video gambling machine.

Punisher (Aversive Stimulus)

is a consequence that makes a particular behavior less likely to occur in the future; the stimulus that follows the behavior and results in a decrease in the future probability of the behavior.

Satiation

when a person has recently consumed a large amount of a particular reinforcer (such as food or water) or has had substantial exposure to a reinforcing stimulus. As a result, these reinforcers are less potent at that time. For example, your favorite music may be less reinforcing if you have listened to it for the past 5 hours. Likewise, adult attention may be less reinforcing to a child who has just received substantial one-on-one attention from a teacher. Although substantial exposure to, or consumption of, a reinforcer decreases the effectiveness of a reinforcer, the effects of satiation diminish over time. The longer it has been since the reinforcer was consumed, the more powerful the reinforcer becomes.

Variable Ratio (VR) Schedule

•Reinforcer after X number of responses on the average •Produces a high and steady rate / no post-reinforcement pause. Delivery of a reinforcer is based on the number of responses that occur, but in this case the number of responses needed for reinforcement varies each time, around an average number. That is, a reinforcer is delivered after an average of X responses.
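
A minimal sketch in Python (hypothetical class and method names, not from the text) of the counting rule a VR schedule follows: the number of responses required is redrawn around the average after each reinforcer, so the next reinforcer is unpredictable.

```python
import random

# Minimal sketch, not a behavior-analytic tool: how a VR schedule decides when to
# deliver a reinforcer. Class and method names are hypothetical.

class VariableRatioSchedule:
    def __init__(self, average_ratio):
        self.average_ratio = average_ratio        # e.g., VR 20 -> 20 responses on average
        self.required = self._draw_requirement()  # responses needed this time
        self.count = 0                            # responses since the last reinforcer

    def _draw_requirement(self):
        # Vary the requirement around the average (here, roughly +/- 50%).
        low = max(1, round(self.average_ratio * 0.5))
        high = round(self.average_ratio * 1.5)
        return random.randint(low, high)

    def record_response(self):
        """Return True if this response produces the reinforcer."""
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = self._draw_requirement()  # next requirement is unpredictable
            return True
        return False

vr20 = VariableRatioSchedule(20)
reinforced = [vr20.record_response() for _ in range(100)]  # roughly 5 reinforcers per 100 responses
```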

Fixed Interval (FI)

•Reinforcer for the first response after X amount of time - time interval does not change •Produces a low rate of responding that increases at the end of the interval •Rarely used in behavior modification. The interval of time is fixed, or stays the same each time. For example, in a fixed interval 20-second (FI 20-second) schedule of reinforcement, the first response that occurs after 20 seconds has elapsed results in the reinforcer. Responses that occur before the 20 seconds are not reinforced; they have no effect on the subsequent delivery of the reinforcer (i.e., they don't make it come any sooner). Once the 20 seconds has elapsed, the reinforcer is available, and the first response that occurs is reinforced. Then, 20 seconds later, the reinforcer is available again, and the first response that occurs produces the reinforcer.
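
A minimal sketch in Python (hypothetical class and method names, not from the text) of the timing rule an FI schedule follows: only the first response after the interval has elapsed is reinforced, and earlier responses have no effect.

```python
import time

# Minimal sketch, not a behavior-analytic tool: how an FI schedule decides when to
# deliver a reinforcer. Class and method names are hypothetical.

class FixedIntervalSchedule:
    def __init__(self, interval_seconds):
        self.interval = interval_seconds            # e.g., FI 20-second -> 20
        self.interval_start = time.monotonic()

    def record_response(self):
        """Return True only for the first response after the interval has elapsed."""
        if time.monotonic() - self.interval_start >= self.interval:
            self.interval_start = time.monotonic()  # the next interval starts at reinforcement
            return True
        return False                                # early responses have no effect
```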

Ex. of Abolishing Operations

Food is not likely to be reinforcing right after a person just finished a large meal. Having just eaten a large meal is an AO that makes food less reinforcing at that time and makes the behavior of getting food less likely to occur. Water or other drinks are not likely to be reinforcing for someone who just drank a substantial amount of water. Drinking a large amount of water makes water less reinforcing at that time and makes the behavior of getting water less likely to occur. These events are called abolishing operations because they (a) decrease or abolish the effectiveness of a reinforcer at a particular time or in a particular situation and (b) make the behavior that results in that reinforcer less likely to occur.

Extinction Influences

Two important factors influence the extinction process: the reinforcement schedule before extinction and the occurrence of reinforcement after extinction. The reinforcement schedule partly determines whether extinction results in a rapid decrease in the behavior or a more gradual decrease. The reinforcement schedule before extinction: -With continuous reinforcement, behavior decreases rapidly once the reinforcement is terminated. -With intermittent reinforcement, behavior often decreases more gradually once the reinforcement is terminated; intermittent reinforcement produces resistance to extinction, in which the behavior persists once extinction is implemented. The occurrence of reinforcement after extinction: -If the behavior is reinforced again after extinction has begun, it takes longer for the behavior to decrease.

Punisher Ex.

"Juan teases and hits his sisters until they cry. His mother scolds him and spanks him each time he teases or hits his sisters. Although Juan stops teasing and hitting his sisters at the moment that his mother scolds him and spanks him, he continues to engage in these aggressive and disruptive behaviors with his sisters day after day." The scolding and spanking do not function as punishers. They have not resulted in a decrease in Juan's problem behavior over time. This example actually illustrates positive reinforcement. Juan's behavior (teasing and hitting) results in the presentation of a consequence (scolding and spanking by his mother and crying by his sisters), and the outcome is that Juan continues to engage in the behavior day after day. These are the three parts of the definition of positive reinforcement. This raises an important point about the definition of punishment. You cannot define punishment by whether the consequence appears unfavorable, unpleasant, or aversive. You can conclude that a particular consequence is punishing only if the behavior decreases in the future. In Juan's case, scolding and spanking appear to be unfavorable consequences, but he continues to hit and tease his sisters. If the scolding and spanking functioned as a punisher, Juan would stop hitting and teasing his sisters over time. When we define punishment (or reinforcement) according to whether the behavior decreases (or increases) in the future as a result of the consequences, we are adopting a functional definition. One other point to consider is whether a behavior decreases or stops only at the time the consequence is administered, or whether the behavior decreases in the future. Juan stopped hitting his sisters at the time that he received a spanking from his mother, but he did not stop hitting his sisters in the future. Some parents continue to scold or spank their children because it puts an immediate stop to the problem behavior, even though their scolding and spanking do not make the child's problem behavior less likely to occur in the future. The parents believe they are using punishment. However, if the behavior continues to occur in the future, the scolding and spanking do not function as punishers and may actually function as reinforcers.

Motivating operations have two effects:

(a) they alter the value of a reinforcer and (b) they make the behavior that produces that reinforcer more or less likely to occur at that time. ■ An EO makes a reinforcer more potent and makes a behavior that produces the reinforcer more likely. ■ An AO makes a reinforcer less potent and makes a behavior that produces that reinforcer less likely.

Extinction

1. A behavior that has been previously reinforced 2. No longer results in the reinforcing consequences 3. and, therefore, the behavior stops occurring in the future. *When a behavior stops occurring because it is no longer reinforced, we say that the behavior has undergone extinction or that the behavior has been extinguished. *BUT, as long as a behavior is reinforced, even intermittently, it will continue to occur.

Punishment

1. The occurrence of a behavior 2. Is followed immediately by a consequence 3. The behavior is less likely to occur in the future (decrease in the future probability of the behavior)

Positive Punishment

1. The occurrence of a behavior 2. is followed by the presentation of an aversive stimulus 3. and as a result, the behavior is less likely to occur in the future. •Following the behavior •A stimulus (punisher) is applied or presented •The behavior is less likely to occur in the future Ex.: •6-month-old infant with rumination •Used a small amount of lemon juice as the punisher •As a result, rumination immediately decreased and the infant gained weight

Negative Punishment

1. The occurrence of a behavior 2. is followed by the removal of a reinforcing stimulus 3. and as a result, the behavior is less likely to occur in the future. •Following the behavior •A stimulus (reinforcer) is withdrawn or removed •The behavior is less likely to occur in the future

Escape and Avoidance Ex.

A child might run away or hide from a parent who is about to spank the child. Sometimes people learn to lie to avoid punishment, or learn to avoid the person who delivers the punishing stimulus. When implementing a punishment procedure, you have to be careful that inappropriate escape and avoidance behaviors do not develop.

Distinguishing between escape and avoidance behavior.

A laboratory rat is placed in an experimental chamber that has two sides separated by a barrier; the rat can jump over the barrier to get from one side to the other. On the floor of the chamber is an electric grid that can be used to deliver a shock to one side or the other. Whenever the shock is presented on the right side of the chamber, the rat jumps to the left side, thus escaping from the shock. Jumping to the left side of the chamber is escape behavior because the rat escapes from an aversive stimulus (the shock). When the shock is applied to the left side, the rat jumps to the right side. The rat learns this escape behavior rather quickly and jumps to the other side of the chamber as soon as the shock is applied. In the avoidance situation, a tone is presented just before the shock is applied. (Rats have better hearing than vision.)

Token

A neutral stimulus such as a plastic poker chip or a small square piece of colored cardboard can be used as a conditioned reinforcer (or token) to modify human behavior in a token reinforcement program. In a token reinforcement program, the token is presented to the person after a desirable behavior, and later the person exchanges the token for other reinforcers (backup reinforcers).
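
A minimal sketch in Python (hypothetical names and token prices, not a clinical protocol) of the earn-and-exchange logic of a token program: tokens accumulate after each desirable behavior and are later traded for backup reinforcers.

```python
# Minimal sketch, not a clinical protocol: the earn-and-exchange logic of a token
# reinforcement program. Names and token prices are hypothetical.

class TokenProgram:
    def __init__(self, backup_prices):
        self.tokens = 0
        self.backup_prices = backup_prices        # e.g., {"snack": 3, "soft drink": 2}

    def deliver_token(self):
        """Called each time the desirable behavior occurs."""
        self.tokens += 1

    def exchange(self, item):
        """Trade tokens for a backup reinforcer if enough have been earned."""
        cost = self.backup_prices[item]
        if self.tokens >= cost:
            self.tokens -= cost
            return item
        return None

program = TokenProgram({"snack": 3, "soft drink": 2})
for _ in range(4):
    program.deliver_token()        # four desirable behaviors earn four tokens
print(program.exchange("snack"))   # "snack" is delivered; one token remains
```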

Conditioned Punisher

A previously neutral stimulus that is paired with an established punisher becomes a punisher itself.

Fixed Interval (FI) Ex.

An FI 30-minute schedule would be in effect if the supervisor came by once every 30 minutes and gave Paul a token for the first response (packaging a part) that occurred. The number of parts that Paul packaged throughout the 30 minutes would be irrelevant. The supervisor would provide the token (reinforcer) for the first part that she saw Paul package after the 30-minute interval. This is different from an FR or VR schedule, in which Paul gets a token for the number of parts he packages. In an FI schedule, only one response is needed for reinforcement, but it must occur after the interval. What Ferster and Skinner (1957) found is that FI schedules of reinforcement produced a certain pattern of responding: The pigeon made an increasing number of responses near the end of the interval, up until the reinforcer was delivered. After that, there was a pause in responding; as the end of the interval approached, the pigeon again started responding more quickly until the reinforcer was delivered. We might expect to see the same pattern of behavior with Paul in the factory. After he receives the token from the supervisor and the supervisor walks away (to observe other workers), Paul may slow down or stop working for a while, and then start working again as the end of the 30 minutes approaches. Because he receives a token for packaging a part only after the 30-minute interval has ended, his behavior of packaging parts naturally starts to occur more frequently as the end of the interval approaches. Because he never receives a token for packaging parts during the 30-minute interval, his behavior naturally starts to occur less frequently in the early part of the interval.

When the behavior produces a reinforcing consequence through direct contact with the physical environment, the process is automatic reinforcement.

An example of automatic positive reinforcement would be if you went to the kitchen and got the chips for yourself. An example of automatic negative reinforcement would be if you got the remote and turned down the volume on the TV yourself. In both cases, the reinforcing consequence was not produced by another person.

When a behavior produces a reinforcing consequence through the actions of another person, the process is social reinforcement.

An example of social positive reinforcement might involve asking your roommate to bring you the bag of chips. An example of social negative reinforcement might involve asking your roommate to turn down the TV when it is too loud. In both cases, the consequence of the behavior was produced through the actions of another person.

Stimulus

An object or event that can be detected by one of the senses, and thus has the potential to influence the person (stimuli is the plural form of the word stimulus). The object or event may be a feature of the physical environment or the social environment (the behavior of the person or of others).

Novel Behaviors

Behaviors that do not typically occur in a particular situation. A characteristic of an extinction burst is that novel behaviors may occur for a brief period when a behavior is no longer reinforced.

Time-out from Positive Reinforcement and Response Cost (Negative Punishment)

Both involve the loss of a reinforcing stimulus or activity after the occurrence of a problem behavior. Time out: -The person is removed from a reinforcing situation for a brief period of time after the problem behavior occurs. Response Cost: -Contingent on a behavior, a specified amount of reinforcer is removed.

Time-Out (Negative Punishment) Ex.

Clark, Rowbury, Baer, and Baer (1973) used time-out to decrease aggressive and disruptive behavior in an 8-year-old girl with Down syndrome. In time-out, the person is removed from a reinforcing situation for a brief period after the problem behavior occurs. Each time the girl engaged in the problem behavior in the classroom, she had to sit by herself in a small time-out room for 3 minutes. As a result of time-out, her problem behaviors decreased immediately. Through the use of time-out, the problem behavior was followed by the loss of access to attention (social reinforcement) from the teacher and other reinforcers in the classroom.

Fixed Ratio (FR) Schedule Ex.

Consider the example of Paul, a 26-year-old adult with severe intellectual disability who works in a factory packaging parts for shipment. As the parts come by on a conveyor belt, Paul picks them up and puts them into boxes. Paul's supervisor delivers a token (conditioned reinforcer) after every 20 parts that Paul packages. This is an example of an FR 20. At lunch and after work, Paul exchanges his tokens for backup reinforcers (e.g., snacks or soft drinks). An FR schedule could be used in a school setting by giving students reinforcers (such as stars, stickers, or good marks) for correctly completing a fixed number of problems or other academic tasks. Piece-rate pay in a factory, in which workers get paid a specified amount of money for a fixed number of responses (e.g., $5 for every 12 parts assembled), is also an example of an FR schedule.

Positive Punishment Ex.

Corte, Wolf, and Locke (1971) helped institutionalized adolescents with intellectual disabilities decrease self-injurious behavior by using punishment. One subject slapped herself in the face. Each time she did so, the researchers immediately applied a brief electric shock with a handheld shock device. (Although the shock was painful, it did not harm the girl.) As a result of this procedure, the number of times she slapped herself in the face each hour decreased immediately from 300-400 to almost zero. (Electric shock is rarely, if ever, used as a punisher today because of ethical concerns. This study is cited to illustrate the basic principle of positive punishment, not to support the use of electric shock as a punisher.) This is an example of positive punishment because the painful stimulus was presented each time the girl slapped her face, and the behavior decreased as a result. Sajwaj, Libet, and Agras (1974) also used positive punishment to decrease life-threatening rumination behavior in a 6-month-old infant. Rumination in infants involves repeatedly regurgitating food into the mouth and swallowing it again. It can result in dehydration, malnutrition, and even death. In this study, each time the infant engaged in rumination, the researchers squirted a small amount of lemon juice into her mouth. As a result, the rumination behavior immediately decreased, and the infant began to gain weight.

Negatively Reinforcing Punishment Ex.

Dr. Hopkins hated it when her students talked in class while she was teaching. Whenever someone talked in class, Dr. Hopkins stopped teaching and stared at the student with her meanest look. When she did this, the student immediately stopped talking in class. As a result, Dr. Hopkins's behavior of staring at students was reinforced by the termination of the students' talking in class. Dr. Hopkins used the stare frequently, and she was known all over the university for it.

Extinction Ex.

Each evening when Greg gets home from work, he goes into his apartment building through the emergency exit because that door is close to his apartment and he doesn't have to walk all the way around to the front door. The apartment manager doesn't want people to use this door except in emergencies, so she installs a new lock on the door. That day, when Greg gets home from work, he turns the doorknob but the door doesn't open. He turns the knob again, but nothing happens. He starts turning the knob harder and pulling harder on the door, but still nothing happens. Eventually he stops and walks to the front door. Greg tries the door again the next couple of days when he gets home from work, but still it will not open. Finally, he quits trying to go in through the emergency door.

Continuous Reinforcement

Each response is followed by the reinforcer

Reinforcer Ex.

Ex. The child cried at night when her parents put her to bed. The child's crying was an operant behavior. The reinforcer for her crying was the parents' attention. Because crying at night resulted in this immediate consequence (reinforcer), the child's crying was strengthened: She was more likely to cry at night in the future.

Schedules of Reinforcement

Fixed Ratio, Variable Ratio, Fixed Interval, Variable Interval

Ex. of Establishing Operations

Food is a more powerful reinforcer for a person who hasn't eaten recently. Not having eaten in a while is an EO that makes food more reinforcing at that time and makes the behavior of getting food more likely to occur. Likewise, water is a more potent reinforcer for someone who has not had a drink all day or who just ran 6 miles. Water or other beverages are more reinforcing when a person just ate a large amount of salty popcorn than when a person did not. (That is why some bars give you free salty popcorn.) In these examples, going without food or water (deprivation), running 6 miles, and eating salty popcorn are events called establishing operations because they (a) increase the effectiveness of a reinforcer at a particular time or in a particular situation and (b) make the behavior that results in that reinforcer more likely to occur.

Contingency

For punishment to be most effective, the punishing stimulus should occur every time the behavior occurs. When the response produces the consequence and the consequence does not occur unless the response occurs first, we say that a contingency exists between the response and the consequence. When a contingency exists, the consequence is more likely to reinforce the response (e.g., see Borrero, Vollmer & Wright, 2002). Consider the example of turning the key in your ignition to start your car. This is an example of contingency: Every time you turn the key, the car starts. The behavior of turning the key is reinforced by the engine starting. If the engine started only sometimes when you turned the key, and if it started sometimes when you did not turn the key, the behavior of turning the key in this particular car would not be strengthened very much. A person is more likely to repeat a behavior when it results in a consistent reinforcing consequence. That is, a behavior is strengthened when a reinforcer is contingent on the behavior (when the reinforcer occurs only if the behavior occurs).

Reinforcement can involve the addition of a reinforcer (positive reinforcement) or the removal of an aversive stimulus (negative reinforcement) following the behavior.

In both cases, the behavior is strengthened. For both positive and negative reinforcement, the behavior may produce a consequence through the actions of another person or through direct contact with the physical environment.

B. F. Skinner conducted numerous studies on the principle of reinforcement in laboratory animals such as rats and pigeons.

In experiments with rats, Skinner placed the animal in an experimental chamber and delivered a pellet of food each time the rat pressed a lever located on one of the walls of the chamber. At first, the rat explored the box by moving around, sniffing, climbing up on its hind legs, and so on. When it happened to press the lever with one of its paws, the device automatically delivered a pellet of food to an opening in the wall. Each time the hungry rat pressed the lever, it received a pellet of food. Thus, the rat was more likely to press the lever each time it was placed in the chamber. This one behavior, pressing the lever, was strengthened because when it occurred, it was immediately followed by the receipt of food. The behavior of pressing the lever increased in frequency relative to all the other behaviors the rat had exhibited when put in the chamber.

Aversive Stimulus (Negative Reinforcer)

In negative reinforcement, the stimulus that is removed or avoided after the behavior is called an aversive stimulus. Often is seen as something unpleasant, painful, or annoying that a person will try to get away from or avoid.

Positive Reinforcer Vs. Negative Reinforcer

In positive reinforcement, a response produces a stimulus (a positive reinforcer), whereas in negative reinforcement, a response removes or prevents the occurrence of a stimulus (an aversive stimulus). In both cases, the behavior is more likely to occur in the future. The mother's behavior of buying her child candy results in termination of the child's tantrum (an aversive stimulus is removed). As a result, the mother is more likely to buy her child candy when he tantrums in a store. This is an example of negative reinforcement. On the other hand, when the child tantrums, he gets candy (a positive reinforcer is presented). As a result, he is more likely to tantrum in the store. This is an example of positive reinforcement.

Earliest demonstration of reinforcement

was reported by Thorndike in 1911

Negative Punishment Ex.

Johnny interrupts his parents and the behavior is reinforced by his parents' attention. (They scold him each time he interrupts.) In this case, extinction would involve withholding the parents' attention each time Johnny interrupts. Negative punishment would involve the loss of some other reinforcer—such as allowance money or the opportunity to watch TV—each time he interrupted. Both procedures would result in a decrease in the frequency of interrupting.

Intermittent Reinforcement

Not every response is followed by a reinforcer. Putting money into the slot machine is only occasionally reinforced by hitting the jackpot and winning money from the machine. If the machine was broken and never again produced a jackpot (no reinforcement), you might put many more coins into the machine before finally giving up. It takes longer for the gambling behavior to stop because it is more difficult to determine that there is no longer reinforcement for the behavior. Intermittent reinforcement before extinction thus produces greater resistance to extinction.

Positive Reinforcer

Often is seen as something pleasant, desirable, or valuable that a person will try to get.

Spontaneous Recovery Ex.

Once in a while, Amanda may cry at night long after extinction, but if she gets no attention for the crying, it will not occur often or for very long. However, if spontaneous recovery occurs and the behavior is now reinforced, the effect of extinction will be lost. For example, Greg may still try occasionally to open the back door to his apartment building. If the door happens to open one day, his behavior of using that door will be reinforced, and he will be more likely to try to use that door again. Finding the door open occasionally would be an example of intermittent reinforcement, which would increase behavioral persistence or resistance to extinction in the future.

Response

One instance of a behavior

Procedures of Extinction

The behavior decreases and stops occurring. Procedurally, however, extinction is slightly different in the two cases. If a behavior is positively reinforced, a consequence is applied or added after the behavior. Therefore, extinction of a positively reinforced behavior involves withholding the consequence that was previously delivered after the behavior. To put it another way, when the behavior no longer results in the delivery of the reinforcing consequence, the behavior no longer occurs.

Reinforcement vs. Punishment

Reinforcement increases behavior, Punishment decreases behavior.

Fixed Ratio (FR) Schedule

•Reinforcer after X number of responses - the number does not change •Produces a high rate / post-reinforcement pause. A specific or fixed number of responses must occur before the reinforcer is delivered. That is, a reinforcer is delivered after a certain number of responses. For example, in a fixed ratio 5 (FR 5) schedule, the reinforcer follows every fifth response. In an FR schedule, the number of responses needed before the reinforcer is delivered does not change.
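
A minimal sketch in Python (hypothetical class and method names, not from the text) of the counting rule an FR schedule follows: every Nth response produces the reinforcer, and the count then resets.

```python
# Minimal sketch, not a behavior-analytic tool: how an FR schedule decides when to
# deliver a reinforcer. Class and method names are hypothetical.

class FixedRatioSchedule:
    def __init__(self, ratio):
        self.ratio = ratio       # e.g., FR 5 -> every 5th response is reinforced
        self.count = 0           # responses since the last reinforcer

    def record_response(self):
        """Return True if this response produces the reinforcer."""
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0       # the count resets after reinforcement
            return True
        return False

fr5 = FixedRatioSchedule(5)
print([fr5.record_response() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]
```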

Negatively Reinforced Behaviors Ex.

Shandra has an 11 P.M. curfew. If she comes in later than 11 P.M., her parents scold her, lecture her, and ground her for a week. Because the parents go to bed at 10 P.M., they do not know what time their daughter comes home. They ask her the next morning, and if she came home after 11 P.M., she lies and tells them she was home earlier. Lying is negatively reinforced by the avoidance of aversive consequences from her parents. Extinction of lying would occur if lying no longer helped her to avoid aversive consequences. Thus, if a parent were awake in bed and knew when Shandra came home, she would not avoid aversive consequences by lying. As a result, she would quit lying when she got home late.

Conditioned Reinforcers Ex.

•Sight, sound, and scent of parents •Parents' smile, tone of voice, attention, praise •Types of toys, TV shows, music, clothes, activities •Grades, positive evaluations •Accomplishments (social, physical) •Money •Others

Motivating Operations (MOs)

Some events can make a particular consequence more or less reinforcing at some times than at other times. There are two types of MOs: establishing operations and abolishing operations.

Negative reinforcement is not punishment

Some people confuse negative reinforcement and punishment (see Chapter 6). They are not the same. Negative reinforcement (like positive reinforcement) increases or strengthens a behavior. Punishment, in contrast, decreases or weakens a behavior. The confusion comes from the use of the word negative in negative reinforcement. In this context, the word negative does not mean bad or unpleasant, but simply refers to the removal (subtraction) of the stimulus after the behavior.

Law of Effect

States that a behavior that produces a favorable effect on the environment will be more likely to be repeated in the future. Thorndike placed a hungry cat in a cage and put food outside of the cage where the cat could see it. Thorndike rigged the cage so that a door would open if the cat hit a lever with its paw. The cat was clawing and biting the bars of the cage, reaching its paws through the openings between the bars, and trying to squeeze through the opening. Eventually, the cat accidentally hit the lever, the door opened, and the cat got out of the cage and ate the food. Each time Thorndike put the hungry cat inside the cage it took less time for the cat to hit the lever that opened the door. Eventually, the cat hit the lever with its paw as soon as Thorndike put it in the cage.

Generalized Conditioned Punisher Ex.

Stimuli that are associated with the loss of reinforcers may become conditioned punishers. A parking ticket or a speeding ticket is associated with the loss of money (paying a fine), so the ticket is a conditioned punisher for many people. In reality, whether speeding tickets or parking tickets function as conditioned punishers depends on a number of factors, including the schedule of punishment (how likely is it that you will get caught speeding?) and the magnitude of the punishing stimulus (how big is the fine?). These and other factors that influence the effectiveness of punishment are discussed later in this chapter. A warning from a parent may become a conditioned punisher if it has been paired with the loss of reinforcers such as allowance money, privileges, or preferred activities. As a result, when a child misbehaves and the parent gives the child a warning, the child may be less likely to engage in the same misbehavior in the future. A facial expression or look of disapproval may be a conditioned punisher when it is associated with the loss of attention or approval from an important person (such as a parent or teacher). A facial expression may also be associated with an aversive event such as a scolding or a spanking, and thus may function as a conditioned punisher.

Misconceptions about Extinction Ex.

Suppose that a child runs from the table whenever he is told to eat his vegetables, and the outcome is that he does not eat his vegetables. If the parents ignore this behavior, it will not stop. Running from the table is reinforced by escape from eating the vegetables. Ignoring the behavior does not withhold this reinforcer and, therefore, does not function as extinction.

AO Ex.

Suppose that your friend had some tickets for events at an amusement park you were about to attend. If you were told that the tickets had expired and were no longer being accepted, the reinforcing value of the tickets would be lost and you would be less likely to ask your friend for the tickets. Sunshine probably is not aversive for most people, but when a person has a bad sunburn, escape from the heat of the sun is more reinforcing. Therefore, the bad sunburn is an establishing operation that makes staying indoors or sitting in the shade more reinforcing because these behaviors avoid or terminate the heat of the sun (aversive stimulus). On the other hand, applying sunscreen may be an abolishing operation that decreases the aversiveness of being in the sunshine and makes escape from the sun less reinforcing.

Backup Reinforcers

Tangible objects, activities, or privileges that serve as reinforcers and that can be purchased with tokens.

Escape Behavior (Negative Reinforcement)

The behavior results in the termination of (escape from) the aversive stimulus and the behavior is strengthened. The occurrence of the behavior results in the termination of an aversive stimulus that was already present when the behavior occurred. That is, the person escapes from the aversive stimulus by engaging in a particular behavior, and that behavior is strengthened.

Reinforcer

The consequence (stimulus or event) that follows operant behavior & strengthens operant behavior. A stimulus or event that increases the future probability of a behavior when it occurs contingent on the occurrence of the behavior.

Premack Principle Ex.

The Premack principle operates when parents require their fourth-grade son to complete his homework before he can go outside to play with his friends. The opportunity to play (a high-probability behavior) after the completion of the homework (a low-probability behavior) reinforces the behavior of doing homework; that is, it makes it more likely that the child will complete his homework. The principle can also be applied as punishment: a 6-year-old boy with a developmental disability was helped to stop engaging in aggressive behavior. Each time the boy hit someone in the classroom, he was required to stand up and sit down on the floor ten times in a row. This punishment procedure, called contingent exercise, resulted in an immediate decrease in the hitting behavior.

Extinction of Negatively Reinforced Behaviors

The aversive stimulus is no longer removed after the behavior. If a behavior is negatively reinforced, the behavior results in the removal or avoidance of an aversive stimulus. Extinction of a negatively reinforced behavior therefore involves eliminating the escape or avoidance that was reinforcing the behavior. When the behavior no longer results in escape from or avoidance of an aversive stimulus, the behavior eventually stops.

Spontaneous Recovery

The natural tendency for the behavior to occur again in situations that are similar to those in which it occurred before extinction.

Extinction of Positively Reinforced Behaviors

The positive reinforcer is no longer delivered after the behavior.

Positive Punishment vs. Negative Punishment

They are both types of punishment, therefore, they both weaken behavior. The only difference is whether a stimulus is added (positive punishment) or removed (negative punishment) following the behavior. Think of positive as a plus or addition (+) sign and negative as a minus or subtraction (-) sign. In (+) punishment, you add a stimulus (an aversive stimulus) after the behavior. In (-) punishment, you subtract or take away a stimulus (a reinforcer) after the behavior. If you think of positive and negative in terms of adding or subtracting a stimulus after the behavior, the distinction should be clearer.

Consequences

The stimulus or event occurring immediately after a behavior.

Variable Ratio (VR) Schedule Ex.

The supervisor could reinforce his work performance on a VR 20 schedule by delivering a token after an average of 20 parts that Paul packages. Sometimes the number of responses needed would be less than 20 and sometimes more than 20. The number of responses needed for any particular token delivery would not be predictable to Paul, in contrast with the FR 20 schedule, where the token is provided after every 20 responses (packaged parts). Another common example of a VR schedule is the slot machine found in casinos. The response of putting a coin in the machine and pulling the handle is reinforced on a VR schedule. The gambler never knows how many responses are needed for a jackpot (the reinforcer). However, the more responses the gambler makes, the more likely a jackpot is (because a VR schedule is based on number of responses, not on time or some other factor). Therefore, the VR schedule in a slot machine produces high, steady rates of responding. Of course, the casino makes sure that the VR schedule is such that gamblers put more money in the machine than the machine pays out as reinforcers. One other example of a VR schedule can be found in the salesperson who must make calls (in person or on the phone) to sell products. The number of calls that must occur before a sale (the reinforcer) occurs is variable. The more calls the salesperson makes, the more likely it is that a sale will result. However, which call will result in a sale is unpredictable.

Variable Interval (VI) Schedule Ex.

Using a VI 30-minute schedule, the supervisor would come around at unpredictable intervals of time (e.g., after 5 minutes, 22 minutes, 45 minutes, 36 minutes) and give Paul a token for the first part that she saw Paul package. The various intervals of time would average 30 minutes. The reinforcer (token) would be given for the first response after the interval. On a VI 30-minute schedule, Paul probably would package parts more steadily throughout the day. The slowing down and speeding up of his work rate observed on the FI 30-minute schedule would not occur because the length of the intervals is unpredictable.

Misconceptions about Extinction

A common misconception is that using extinction simply means ignoring the behavior; in fact, extinction means removing the reinforcer for a behavior. Ignoring the problem behavior functions as extinction only if attention is the reinforcer.

Reinforcement vs. Punishment Ex.

When Kathy reached over the fence, this behavior was followed immediately by the presentation of an aversive stimulus (the dog bit her). The dog's bite served as a punisher: Kathy was less likely to reach over the fence in the future. However, when Kathy pulled her hand back quickly, she terminated the dog bite. Because pulling her hand back removed the pain of being bitten, this behavior was strengthened. This is an example of negative reinforcement. As you can see, when the dog bite was presented after one behavior, the behavior was weakened; when the dog bite was removed after another behavior, that behavior was strengthened.

Extinction Burst Ex.

When Mark pushes the on button on the remote control for his TV set and it does not turn on the TV (because the batteries are dead), he pushes it longer (increased duration) and harder (increased intensity) before he finally gives up. His behavior of pushing the on button was not reinforced by the TV turning on; therefore, he quit trying, but not until he tried pushing it longer and harder (extinction burst). Each night, 4-year-old Amanda cried at bedtime for 10-15 minutes, and her parents came to her room and talked to her until she fell asleep. By doing so, her parents were accidentally reinforcing her crying. After talking to a Board Certified Behavior Analyst, the parents decided not to go into her room or talk to her when she cried at bedtime. The first night, she cried for 25 minutes before falling asleep. By the end of the week, she no longer cried at bedtime at all. When they stopped going to her room after she cried, the parents were using extinction. The increase in crying duration the first night is an extinction burst. When Amanda screams and cries louder, her parents may come into the room and give her the attention she wasn't getting for simply crying. The extinction burst is not necessarily a conscious process, however. Amanda probably is not thinking, "I'll cry louder, scream, and hit my pillow to get my parents' attention." The extinction burst is simply a natural characteristic of an extinction situation.

Immediacy

When a punishing stimulus immediately follows a behavior, or when the loss of a reinforcer occurs immediately after the behavior, the behavior is more likely to be weakened. That is, for punishment to be most effective, the consequence must follow the behavior immediately. As the delay between the behavior and the consequence increases, the effectiveness of the consequence as a punisher decreases. Ex. A student makes a sarcastic comment in class and the teacher immediately gives her an angry look. As a result, the student is less likely to make a sarcastic comment in class. If the teacher had given the student an angry look 30 minutes after the student made the sarcastic comment, the look would not function as a punisher for the behavior of making sarcastic comments. Instead, the teacher's angry look probably would have functioned as a punisher for whatever behavior the student had engaged in immediately before the look.

Automatic Reinforcement

When the behavior produces a reinforcing consequence through direct contact with the physical environment

Social Reinforcement

When the behavior produces a reinforcing consequence through the actions of another person

Misconceptions about Punishment

Whenever behavior analysts speak of punishment, they are referring to a process in which the consequence of a behavior results in a future decrease in the occurrence of that behavior. This is quite different from what most people think of as punishment. In general usage, punishment can mean many different things, most of them unpleasant. Many people define punishment as something meted out to a person who has committed a crime or other inappropriate behavior. Authority figures such as governments, police, churches, or parents impose punishment to inhibit inappropriate behavior—that is, to keep people from breaking laws or rules. Punishment may involve prison time, a death sentence, fines, the threat of going to hell, spanking, or scolding. However, the everyday meaning of punishment is quite different from the technical definition of punishment used in behavior modification. People who are unfamiliar with the technical definition of punishment may believe that the use of punishment in behavior modification is wrong or dangerous.

Premack Principle

States that when a person is made to engage in a low-probability behavior contingent on a high-probability behavior, the high-probability behavior will decrease in frequency (the punishment application of the principle). That is, if, after engaging in a problem behavior, a person has to do something he or she doesn't want to do, the person will be less likely to engage in the problem behavior in the future. Applied to reinforcement, one type of positive reinforcement involves the opportunity to engage in a high-probability behavior (a preferred behavior) as a consequence for a low-probability behavior (a less-preferred behavior), to increase the low-probability behavior.

EO Ex.

You have just bought a new table for your computer and printer. When you read the assembly instructions and discover that you need a screwdriver to assemble it, this increases the value of a screwdriver as a reinforcer at that time. As a result, you are more likely to go look for a screwdriver. Searching for a screwdriver is strengthened by finding it and successfully assembling the table. Similarly, pennies are not potent reinforcers for most people. However, if you were told that there was a copper shortage and that pennies were now worth 50 cents apiece, the reinforcing value of pennies would increase, and you would be more likely to engage in behavior that resulted in obtaining more pennies. Sunshine probably is not aversive for most people, but when a person has a bad sunburn, escape from the heat of the sun is more reinforcing. Therefore, the bad sunburn is an establishing operation that makes staying indoors or sitting in the shade more reinforcing because these behaviors avoid or terminate the heat of the sun (aversive stimulus).

Concurrent Operant

a number of different behaviors or response options are concurrently available for the person. For example, raising his hand in class and making animal noises are concurrent operants for a first grader. Each is likely to be reinforced by teacher attention on some schedule of reinforcement.

Extinction (Ferster and Skinner 1957)

demonstrated the principle of extinction with laboratory animals. When the pigeon in the experimental chamber no longer received food as a reinforcer for pecking the key, the pigeon's keypecking behavior stopped. When the laboratory rat no longer received food pellets for pressing the lever, the lever-pressing behavior decreased and eventually stopped.

Negative Punishment (EO and AO)

deprivation is an EO that makes the loss of reinforcers more effective as a punisher and satiation is an AO that makes the loss of reinforcers less effective as a punisher. For example, telling a child who misbehaves at the dinner table that dessert will be taken away will: (a) be a more effective punisher if the child has not eaten any dessert yet and is still hungry (EO), (b) be a less effective punisher if the child has had two or three helpings of the dessert already and is no longer hungry (AO). Losing allowance money for misbehavior will: (a) be a more effective punisher if the child has no other money and plans to buy a toy with the allowance money (EO), (b) be a less effective punisher if the child has recently received money from other sources (AO).

Novel Behavior Ex

when Amanda's parents no longer reinforced her crying at night, she cried longer and louder (increased duration and intensity), but she also screamed and hit her pillow (novel behaviors). In the first example, Rae not only pushed the button on the coffee machine repeatedly when the coffee didn't come out, but she also pushed the coin return button and shook the machine (novel behaviors)

The longer the delay between the response and the consequence,

the less effective the consequence will be, because the contiguity or connection between the two is weakened. If the time between the response and the consequence becomes too long and there is no contiguity, the consequence will not have an effect on the behavior. For example, if you wanted to teach your dog to sit on command and you gave the dog a treat 5 minutes after it performed the behavior, the treat would not function as a reinforcer for sitting. In this case, the delay would be too long. Rather, the treat would function as a reinforcer for whatever behavior the dog engaged in immediately before receiving the treat (probably begging, which is the behavior usually reinforced with treats). Conversely, if you gave the dog a treat immediately after it sat, the treat would reinforce sitting behavior, and the dog would be more likely to sit in the future when given the corresponding command. Likewise, if you tell a joke and people laugh, you are more likely to repeat the joke in the future. If you don't get immediate laughs, you will be less likely to tell the joke in the future.

Abolishing Operation (AO)

makes a reinforcer less potent (it abolishes or decreases the effectiveness of a reinforcer)

Establishing Operation (EO)

makes a reinforcer more potent (it establishes the effectiveness of a reinforcer)

Deprivation

is a type of establishing operation that increases the effectiveness of most unconditioned reinforcers and some conditioned reinforcers. A particular reinforcer (such as food or water) is more powerful if a person has gone without it for some time. For example, attention may be a more powerful reinforcer for a child who has gone without attention for a period of time.

Avoidance Behavior (Negative Reinforcement)

the behavior results in the prevention of (avoidance of) the aversive stimulus and the behavior is strengthened. The occurrence of the behavior prevents an aversive stimulus from occurring. That is, the person avoids the aversive stimulus by engaging in a particular behavior, and that behavior is strengthened. In an avoidance situation, a warning stimulus often signals the occurrence of an aversive stimulus, and the person engages in an avoidance behavior when this warning stimulus is present. Both escape and avoidance are types of negative reinforcement; therefore, both result in an increase in the rate of the behavior that terminated or avoided the aversive stimulus.

Effects of Reinforcement on behavior

• Increase in frequency • Increase in duration • Increase in intensity • Increase in quickness (decrease in latency)

Conditioned Reinforcers

•A previously neutral stimulus - repeatedly paired with an established reinforcer (an unconditioned or conditioned reinforcer) - will function as a reinforcer. A conditioned reinforcer (also called a secondary reinforcer) is a stimulus that was once neutral (a neutral stimulus does not currently function as a reinforcer; i.e., it does not influence the behavior that it follows) but became established as a reinforcer by being paired with an unconditioned reinforcer or an already established conditioned reinforcer. For example, a parent's attention is a conditioned reinforcer for most children because attention is paired with the delivery of food, warmth, and other reinforcers many times in the course of a young child's life. Money is perhaps the most common conditioned reinforcer.

Positive Reinforcement

•Behavior is followed by the presentation of a stimulus (a reinforcer) and the behavior is strengthened. 1. The occurrence of a behavior 2. is followed by the addition of a stimulus (a reinforcer) or an increase in the intensity of a stimulus, 3. which results in the strengthening of the behavior

Negative Reinforcement

•Behavior is followed by the removal of a stimulus (a punisher / aversive stimulus) and the behavior is strengthened. 1. The occurrence of a behavior 2. is followed by the removal of a stimulus (an aversive stimulus) or a decrease in the intensity of a stimulus, 3. which results in the strengthening of the behavior

Unconditioned Reinforcers

•Biologically determined - survival value for the individual •Food, water, human contact (warmth), oxygen, sexual contact, escape from cold, heat, pain, extreme levels of stimulation These are unconditioned reinforcers because they function as reinforcers the first time they are presented to most human beings; no prior experience with these stimuli is needed for them to function as reinforcers. Unconditioned reinforcers sometimes are called primary reinforcers. These stimuli are unconditioned reinforcers because they have biological importance.

Unconditioned Punishers

•Events that have biological importance •Punishers that require no conditioning to be effective Ex. Extreme heat or cold, extreme levels of auditory or visual stimulation, or any painful stimulus (e.g., from electric shock, a sharp object, or a forceful blow) naturally weakens the behavior that produces it.

Factors that Influence Reinforcement

•Immediacy •Consistency (contingency) •Motivating operations -EO vs. AO •Individual differences •Intensity of the stimulus

Reinforcement

•Is a basic principle of behavior •Was established by Skinner in laboratory research and over 40 years of human research •Is a component of many behavior modification procedures 1. The occurrence of a behavior 2. Results immediately in a consequence 3. The behavior is strengthened (more likely to occur again in the future in similar circumstances) •Present ----> Behavior is followed by a consequence •Future ----> Behavior is more likely to occur ** Functional definition of reinforcement

Generalized Conditioned Reinforcer

•Paired with a wide variety of other reinforcers •Money, praise, tokens Money is a generalized conditioned reinforcer because it is paired with (exchanged for) an almost unlimited variety of reinforcers. As a result, money is a powerful reinforcer that is less likely to diminish in value (to become satiated) when it is accumulated. That is, satiation (losing value as a reinforcer) is less likely to occur for generalized reinforcers such as money. Tokens used in a token economy are another example of a generalized conditioned reinforcer because they are exchanged for various other backup reinforcers. As a result, people can accumulate tokens without rapid satiation. Praise is also a generalized conditioned reinforcer because praise is paired with numerous other reinforcers across a person's lifetime.

Variable Interval (VI) Schedule

•Reinforcer for the first response after X amount of time on the average •Produces low but steady rate of behavior •Rarely used in behavior modification. The reinforcer is delivered for the first response that occurs after an interval of time has elapsed. The difference is that in a VI schedule, each time interval is a different length. The interval varies around an average time. For example, in a variable interval 20-second (VI 20-second) schedule, sometimes the interval is more than 20 seconds and other times it is less than 20 seconds. The interval length is not predictable each time, but the average length is 20 seconds. Ferster and Skinner (1957) investigated various VI schedules of reinforcement. They found that the pattern of responding on a VI schedule was different from that on an FI schedule. On the VI schedule, the pigeon's behavior (pecking the key) occurred at a steady rate, whereas on the FI schedule, the frequency decreased in the early part of the interval and increased near the end of the interval. Because the length of the interval—and thus the availability of the reinforcer—was unpredictable in a VI schedule, this off-and-on pattern of responding did not develop.
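
A minimal sketch in Python (hypothetical class and method names, not from the text) of the timing rule a VI schedule follows: like FI, only the first response after the interval is reinforced, but each interval length is drawn around the average, which is why the moment of reinforcer availability is unpredictable.

```python
import random
import time

# Minimal sketch, not a behavior-analytic tool: how a VI schedule decides when to
# deliver a reinforcer. Class and method names are hypothetical.

class VariableIntervalSchedule:
    def __init__(self, average_interval_seconds):
        self.average = average_interval_seconds    # e.g., VI 20-second -> 20
        self.interval = self._draw_interval()
        self.interval_start = time.monotonic()

    def _draw_interval(self):
        # Vary the interval around the average (here, uniformly +/- 50%).
        return random.uniform(self.average * 0.5, self.average * 1.5)

    def record_response(self):
        """Return True only for the first response after the current interval has elapsed."""
        if time.monotonic() - self.interval_start >= self.interval:
            self.interval = self._draw_interval()  # the next interval length is unpredictable
            self.interval_start = time.monotonic()
            return True
        return False
```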

Extinction Burst

•The behavior may briefly increase in frequency, duration, or intensity before it ultimately stops. •Novel behaviors (behaviors that do not typically occur in a particular situation) may occur for a brief period when a behavior is no longer reinforced. •Emotional responses or aggressive behavior may occur.

