Operant Conditioning

Consequent Stimuli

Stimulus events that follow behavior; they are produced by, or occur as a consequence of, the behavior

Matching Law

The relative rate of responding on a particular response alternative equals the relative rate of reinforcement for that response alternative
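In symbols (Herrnstein's matching law), where B1 and B2 are the response rates on the two alternatives and R1 and R2 are the reinforcement rates obtained on them:
\[ \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2} \]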

Fixed Ratio (FR)

a fixed number of responses must be made before the reinforcer is delivered - High and steady response rate with postreinforcement pauses - Larger ratios → longer pauses

Negative Punishment

when a certain desired stimulus is removed after a particular undesired behavior is exhibited, resulting in the behavior happening less often in the future
• Process by which response-contingent REMOVAL of a stimulus decreases the probability of behavior
• If the participant performs the response, the stimulus is removed
• If the participant does not perform the response, the stimulus is not removed
• Example:
o Stay out past curfew → lose car privileges
o Doesn't stay out past curfew → doesn't lose car privileges

Operant Conditioning

learning from the consequences of our behavior

fact

When thinking about reinforcement, always remember that the end result is to try to increase the behavior, whereas punishment procedures are used to decrease behavior. For positive reinforcement, think of it as adding something positive in order to increase a response. For negative reinforcement, think of it as taking something negative away in order to increase a response.

Variable Interval Schedule (VI)

reinforcement is provided for the first response that occurs after a variable amount of time from the last reinforcement - Responding is relatively constant - Little, if any, postreinforcement pauses

Fixed Interval Schedule (FI)

reinforcer is delivered for the first response that occurs after a fixed amount of time following the last reinforcer - Responding following the postreinforcement pause is very slow - Close to the end of the interval, the response rate increases

Three Term Contingency

- Also referred to as the ABCs of behavior (antecedent-behavior-consequence)
- Conceptual system for classifying behavior in relation to antecedent and consequent stimuli
- Illustrates how behavior is elicited by the environment and how the consequences of behavior can affect its future occurrence
EX:
Antecedent: Traffic light turns red → Behavior: Press foot onto brakes → Consequence: Stop at intersection
Antecedent: Teacher flicks lights on and off → Behavior: Students lower their voices → Consequence: Class is quiet

Contributions of the Premack principle

1) Reinforcers can be seen as responses rather than as stimuli
2) Greatly expanded the range of things investigators started to use as reinforcers - ANY activity can serve as a reinforcer, provided it is more likely than the instrumental response

Magazine training

a stimulus is repeatedly paired with the reinforcer so that the participant learns to go and get the reinforcer when the stimulus is presented

Variable Ratio (VR)

the number of responses necessary to produce reinforcement varies from trial to trial. - Higher response rate than on an FR schedule - No postreinforcement pauses

Four types of contingencies

- Positive Reinforcement
- Negative Reinforcement
- Positive Punishment
- Negative Punishment
Negative reinforcement should not be thought of as a punishment procedure. With negative reinforcement, you are increasing a behavior, whereas with punishment, you are decreasing a behavior.

Premack Principle

- A principle of reinforcement which states that an opportunity to engage in more probable behaviors (or activities) will reinforce less probable behaviors (or activities). For example, if a child enjoys playing computer games (more probable) and avoids completing math problems (less probable), we might allow her to play the computer after (contingent upon) completing 15 math problems
- Suggests that if a person wants to perform a given activity, the person will perform a less desirable activity to get at the more desirable activity; that is, activities may themselves be reinforcers
- Experiment with rats: Rats were placed in a box containing a running wheel equipped with a brake and a drinking spout that could be extended into the side of the wheel housing while the brake was applied. In the contingency phases, the rat was required to engage in a specified amount of one activity in order to gain access to the other activity

Interval Schedules

A response is reinforced only if it occurs after a set amount of time following the last reinforcement - Fixed Interval Schedule (FI) - Variable Interval Schedule (VI)

Antecedent Stimuli

Environmental event that comes before the behavior of interest

Premack (1965)

One of his earliest studies was conducted with young children. He gave the children two response alternatives, eating candy or playing a pinball machine, and determined which of these behaviors was more probable for each child. Some of the children preferred one activity, some the other. Eating was the reinforcing response, and playing pinball served as the instrumental response; that is, the children had to play pinball in order to eat candy. Only the children who preferred eating candy over playing pinball showed a reinforcement effect.

Ratio Schedules

Reinforcement depends only on the NUMBER of responses the organism has performed - Continuous Reinforcement (CRF) - Fixed Ratio (FR) - Variable Ratio (VR)

fact

With punishment, always remember that the end result is to try to decrease the undesired behavior. Positive punishment involves adding a negative consequence after an undesired behavior is emitted to decrease future responses. Negative punishment includes taking away a certain desired item after the undesired behavior happens in order to decrease future responses.

Equipotentiality hypothesis

all areas of the brain are equally able to perform a task; in learning terms, the related claim that all stimuli should be equally capable of supporting conditioning

Edward L. Thorndike

believed learning occurred through trial and error
• Puzzle Box: He placed a cat in the puzzle box, from which it was encouraged to escape to reach a scrap of fish placed outside. Thorndike would put a cat into the box and time how long it took to escape. The cats experimented with different ways to escape the puzzle box and reach the fish. Eventually they would stumble upon the lever which opened the cage. When a cat had escaped, it was put in again, and once more the time it took to escape was noted. In successive trials the cats would learn that pressing the lever had favorable consequences, and they would adopt this behavior, becoming increasingly quick at pressing the lever.
• Learning curve: the rate of a person's progress in gaining experience or new skills, i.e. a graph displaying how long the person/animal took to do a task the first time, second, third, and so on.
• Law of Effect: If a response in the presence of a stimulus is followed by a satisfying event, the association between the stimulus and the response will be strengthened; if the response is followed by an annoying event, the association will be weakened.

Primary Reinforcers

events that are innately (naturally) reinforcing, i.e. no learning is necessary for them to be reinforcing. Ex: water when we are thirsty or food when we are hungry

Secondary Reinforcers

events that are reinforcing because they have been associated with some other reinforcer; refers to a stimulus that gains reinforcing properties because it is associated with a primary reinforcer. Under this conception, money is a secondary reinforcer because having it allows greater access to primary reinforcers such as food, clothing, etc. Ex: money for people; grades, applause

Continuous Reinforcement (CRF)

every occurrence of the instrumental response results in the delivery of the reinforcer - Organisms typically respond at a steady but moderate rate

Schedule of Reinforcement

method of delivering reinforcement dependent on numerical and/or temporal dimensions of behavior
• Response-reinforcer arrangements:
- Certain number of responses
- Period of elapsed time
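A minimal Python sketch (illustrative only, not from the course material) of how the two basic arrangements decide when a reinforcer is delivered: a ratio schedule counts responses, while an interval schedule tracks the time elapsed since the last reinforcer.

import time

class FixedRatio:
    """Illustrative sketch: deliver a reinforcer on every `ratio`-th response (e.g., FR 5)."""
    def __init__(self, ratio):
        self.ratio = ratio
        self.responses = 0

    def respond(self):
        self.responses += 1
        if self.responses >= self.ratio:
            self.responses = 0
            return True   # reinforcer delivered
        return False

class FixedInterval:
    """Illustrative sketch: reinforce the first response made after `interval`
    seconds have passed since the last reinforcer."""
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforcer = time.monotonic()

    def respond(self):
        now = time.monotonic()
        if now - self.last_reinforcer >= self.interval:
            self.last_reinforcer = now
            return True   # reinforcer delivered
        return False

# Example: FR 5 reinforces the 5th, 10th, ... response.
fr5 = FixedRatio(5)
print([fr5.respond() for _ in range(10)])
# -> [False, False, False, False, True, False, False, False, False, True]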

Punishment

process of delivering punishers (aversive stimuli) contingent on behavior; the administration of an aversive stimulus to reduce or eliminate unwanted behavior. It can be either physical or nonphysical

Extinction

the disappearance of a previously learned behavior when the behavior is not reinforced
• Nonreinforcement of a response that was previously reinforced
• Behavioral effects
- Extinction burst: will often occur when the extinction procedure has just begun. This usually consists of a sudden and temporary increase in the response's frequency, followed by the eventual decline and extinction of the behavior targeted for elimination
- Spontaneous recovery
• Emotional effects
- Frustration

Stimulus generalization

the organism responds in a similar fashion to two or more stimuli

Learning Curve

the rate of a person's progress in gaining experience or new skills, i.e. a graph displaying how long the person/animal took to do (or learn) a task the first time, second, third, and so on.

spontaneous recovery

the reappearance of a response after its extinction has been followed by a period of rest

Reinforcement

used to help increase the probability that a specific behavior will occur in the future by delivering a stimulus immediately after a response/behavior is exhibited. In positive reinforcement, for example, something is added that motivates the individual to increase the likelihood of engaging in that behavior again. Reinforcement is the process of delivering reinforcers contingent on behavior; more generally, the process of encouraging or establishing a belief or pattern of behavior, especially by encouragement or reward.

extinction burst

will often occur when the extinction procedure has just begun. This usually consists of a sudden and temporary increase in the response's frequency, followed by the eventual decline and extinction of the behavior targeted for elimination

Positive punishment

• Process by which behavior is reduced or eliminated by the response-contingent PRESENTATION of an aversive stimulus; works by presenting a negative consequence after an undesired behavior is exhibited, making the behavior less likely to happen in the future
• If the participant performs the response, the aversive stimulus is presented
• If the participant does not perform the response, the aversive event is not presented
EX:
A child picks his nose during class (behavior) and the teacher reprimands him (negative stimulus) in front of his classmates
A child grabs a toy from another child (behavior) and is sent to time out (negative stimulus)

Reinforcers

• Any consequence that causes the preceding behavior to increase. The increase may be in intensity, frequency, magnitude or some other quality
• Stimuli that strengthen a behavior (increases its probability)
• Reinforcement: process of delivering these stimuli contingent on behavior
- Primary Reinforcers: events that are innately reinforcing; ex: water when we are thirsty or food when we are hungry
- Secondary Reinforcers: events that are reinforcing because they have been associated with some other reinforcer. Ex: money for people; grades, applause

Negative Reinforcement

• When a certain stimulus (usually an aversive stimulus) is removed after a particular behavior is exhibited. The likelihood of the particular behavior occurring again in the future is increased because of removing/avoiding the negative consequence
• Process by which response-contingent REMOVAL of a stimulus increases the probability of the behavior
• If the participant performs the response, the aversive stimulus is terminated or prevented from occurring
• If the participant does not perform the response, the aversive event is presented
• Response terminates the aversive event - Escape
Example:
o Open umbrella → stop getting rained on
o Don't open umbrella → get rained on
o Take medicine → eliminate headache
o Don't take medicine → doesn't eliminate headache
• Response prevents the aversive event - Avoidance
Example:
o Not going out (weather channel) → avoid getting rained on
o Does go out (doesn't look at weather channel) → gets rained on
Other examples: Bob does the dishes (behavior) in order to avoid his mother nagging (negative stimulus). Natalie can get up from the dinner table (negative stimulus) when she eats 2 bites of her broccoli (behavior).

Stimulus Control

• Extent to which an operant exhibits stimulus generalization and discrimination
• If an organism responds in one way in the presence of one stimulus and in a different way in the presence of another stimulus, it is possible to conclude that its behavior has come under the control of the stimuli involved
• If an organism does not discriminate between two stimuli, its behavior is not under the control of those cues
• Stimulus generalization: the organism responds in a similar fashion to two or more stimuli

Positive Reinforcement

• Process by which response-contingent PRESENTATION of a stimulus increases the probability of the behavior
• Presenting a motivating/reinforcing stimulus to the person after the desired behavior is exhibited, making the behavior more likely to happen in the future
• A stimulus which increases the frequency of a particular behavior using pleasant rewards
• The offering of desirable effects or consequences for a behavior with the intention of increasing the chance of that behavior being repeated in the future
• If the participant performs the response, it receives the reinforcing stimulus
• If the participant does not perform the response, it does not receive the reinforcing stimulus
• Example:
o If a child puts her toys away → cookie
o If a child doesn't put her toys away → no cookie
o Study hard for PSY 311 exam → A
o Don't study hard for PSY 311 exam → no A

Shaping

Shaping involves 2 complementary tactics:
1) Reinforcement of successive approximations to the required response
2) Nonreinforcement of earlier response forms
This is a behavioral term that refers to gradually molding or training an organism to perform a specific response (behavior) by reinforcing any responses that are similar to the desired response. For example, a researcher can use this to train a rat to press a lever during an experiment (since rats are not born with the instinct to press a lever in a cage during an experiment). To start, the researcher may reward the rat when it makes any movement at all in the direction of the lever. Then, the rat has to actually take a step toward the lever to get rewarded. Then, it has to go over to the lever to get rewarded (remember, it will not receive any reward for doing the earlier behaviors now...it must make a more advanced move by going over to the lever), and so on until only pressing the lever will produce reward. The rat's behavior was "shaped" to get it to press the lever.

Punishers

• Stimuli that weaken a behavior (decreases its probability)
• Decreases the probability of a behavior happening
• Punishment: process of delivering these stimuli contingent on behavior; the administration of an aversive stimulus to reduce or eliminate unwanted behavior. It can be either physical or nonphysical

Concurrent Schedules

• The participant (subject) can choose any one of two or more simple reinforcement schedules that are available simultaneously
• Allow for the measurement of choice between simple schedule alternatives
• Herrnstein (1961)
- Presented pigeons with different concurrent variable interval schedules
- Pigeons distributed their responses in a manner that matched the rate of reinforcement
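As a worked illustration of matching (the numbers here are hypothetical, not Herrnstein's data): if one key pays off on a VI schedule yielding about 60 reinforcers per hour and a second key yields about 20 per hour, the relative reinforcement rate for the first key is 60 / (60 + 20) = 0.75, so the matching law predicts roughly 75% of the pigeon's responses will be made on that key.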

