Learning Study Guide


1. Describe the following 6 side effects of extinction:

* Extinction burst - a temporary increase in the behavior (the opposite of what you want to happen in extinction)
* Increase in variability - trying lots of different behaviors in an attempt to get the reward
* Emotional behavior - emotional reactions are common side effects of extinction
* Aggression - the frustration and anger (the emotional piece) can lead to outbursts of behavioral aggression
* Resurgence - as a last-ditch effort, the organism starts trying anything at all that has worked in the past
* Depression - a depressed, low-activity state can follow when the behavior no longer produces reinforcement

10. What is the difference between primary and secondary punishers?

* Primary - events we are born to dislike (loud noises, shock, etc.)
* Secondary - events we learn to dislike through association with primary punishers

9. What is the difference between intrinsic and extrinsic punishment?

* Intrinsic - the behavior itself is punishing
* Extrinsic - an aversive stimulus follows the unwanted behavior

5. What is the One-Process Theory?

* It is not the reduction of fear that is reinforcing
* It is the reduction in the number of shocks (the overall rate of aversive stimulation) that is reinforcing

20. Explain the Premack Principle. Define an LPB and an HPB. How are these behaviors used as consequences in an operant conditioning experiment?

- Behaviors are reinforcers too
- HPB - high-probability behaviors; LPB - low-probability behaviors
- You can increase an LPB by giving an HPB as the reward. Ex. If you eat your vegetables, then you can have dessert.

19. Explain Hull's Drive Reduction Theory.

- Perform a behavior to reduce a physiological drive
- Ex. Go to a restaurant to reduce the hunger drive
- Behavior is not motivated by drive reduction alone
- Incentive motivation - motivation we get from some property of the reward, not from some internal state

13. What is shaping? How did we learn about shaping in class?

- The process of rewarding baby steps.
- The gradual creation of a new behavior through reinforcement of successive approximations of the end behavior.

a. What factor is associated with the partial reinforcement effect?
b. What IS the partial reinforcement effect?

- Schedule of reinforcement
- Partial Reinforcement Effect - behavior maintained by intermittent reinforcement will extinguish more slowly than behavior maintained on a continuous schedule.

1. Describe the puzzle box experiment and Thorndike's findings from this experiment.

Edward Thorndike studied how cats learned to open a door in a contraption called a puzzle box. Only one behavior would open the door to allow the cat to escape the box and reach a food treat. Every time the cat escaped, it was put back in to do it all over again. Over time, the cat got faster and faster at escaping. In other words, the cat learned exactly which behavior worked to let it out of the box. Observing how the cats learned to perform certain behaviors and stop performing others led Thorndike to propose what he called the "Law of Effect".

6. How is animal escape/avoidance behavior different from human phobic behavior?

1. CS vs. US
* Animals: perform an operant response to escape an aversive stimulus (US)
* Humans: avoid the CS (not just the US)
2. Duration of training
* Animals: it takes many trials to condition avoidance behavior
* Humans: will sometimes avoid an object/situation after only one exposure

2. What is the difference between avoidance and escape behavior?

1. Escape behavior
* an operant response removes the organism from an ongoing aversive situation
2. Avoidance behavior
* an operant response removes the organism from a situation that will become aversive

7. What three criteria would need to be met in animals if the behavior was to be considered phobic?

1. The fear response is established after only ONE pairing of the CS and US
2. The animal must avoid the CS as well as the US
3. The animal must successfully avoid the CS on 100% of subsequent trials

4. Describe 2 criticisms of Mowrer's theory and what has been proposed to "fix" those problems.

1. The animals persist in their fear behavior even when nothing bad has happened for ages (it doesn't seem to extinguish).
* Proposed fix: the Anxiety Conservation Hypothesis - exposure to the CS without shock is too brief for extinction to occur.
2. Some critics say the fear behavior continues even after the fear is gone.
* Proposed fix: outward (measurable) fear may diminish, but inward fear still exists.

13. Under what conditions might punishment be effective?

1. When it is immediate
2. When it is consistent
3. When you start with a more severe punishment and then back off
4. Negative punishment works better than positive punishment
5. When accompanied by an explanation
6. When accompanied by reinforcement for 'good' behavior

12. Name 3 potential benefits of using punishment.

1. May increase social behavior
* it's natural to act affiliative after getting in trouble
2. May improve mood
* may distract a distraught child
3. May increase attention to the environment
* ex. getting yelled at for walking into the street

11. Discuss several problems with using punishment.

1. Only tells you what NOT to do
2. The person delivering the punishment becomes a discriminative stimulus for punishment
3. The subject might learn to avoid the person who gives the punishment rather than eliminate the unwanted behavior
4. Punishment usually elicits a strong emotional response
5. Punishment sometimes elicits aggressive behavior
6. Might be teaching the subject that punishment is an appropriate way to control behavior
7. Punishment usually stops the unwanted behavior immediately, so the use of punishment gets reinforced

5. Describe the ABC's of operant conditioning.

1. Response - the behavior that the organism performs
2. Consequence - can be appetitive or aversive and comes after the behavior
3. (Sometimes) Discriminative stimulus - the antecedent; a signal that tells the organism that a consequence is available if the behavior is performed

2. We talked about several factors that affected resistance to extinction:

1. Schedule of Reinforcement - VR is most resistant because it is hard to distinguish between an empty trial and an extinction trial. Partial Reinforcement Effect - behavior maintained by intermittent reinforcement will extinguish more slowly than behavior maintained on a continuous schedule.
2. History of Reinforcement - the more reinforcers an animal has received for a behavior, the harder it will be to extinguish.
3. Magnitude of Reinforcer - small rewards are easier to extinguish than large ones; non-preferred rewards are easier to extinguish than preferred rewards.
4. Degree of Deprivation - a hungry animal is harder to extinguish.
5. Previous Experience with Extinction - if a behavior has already been extinguished before, there is prior knowledge to draw upon in the current situation, so it is easier to extinguish an "experienced" individual than a novice.
6. Distinctive Signal for Extinction - it is easier to extinguish if there is a cue that tells the organism that extinction is in effect.

14. Describe Seligman & Maier's experiment on learned helplessness.

Three groups of dogs.
Phase 1:
1. Inescapable shock - dogs got shocked no matter what
2. Escapable shock - dogs got shocked but could stop it by pushing a button with their nose
3. No-shock control - dogs were never shocked
Phase 2: dogs were put in a shuttle box
* dogs could escape by jumping over a barrier
* shock was signaled by 10 seconds of darkness
Results: the escapable-shock and no-shock control dogs jumped over the barrier; the inescapable-shock dogs just lay down and whined (they never even tried to jump over the barrier).
Why did this happen? Two ideas:
1. The dogs gave up because they didn't think there was anything they could do.
2. The dogs were too emotionally distraught to pay attention and learn something new.

16. Shaping is an example of what type of complex schedule of reinforcement?

Adjusting Schedule

15. In conjunctive schedules of reinforcement, is reinforcement given after each part of the schedule is completed or only at the end?

Only at the end, after all components of the schedule have been completed.

17. How do chained schedules differ from conjunctive schedules?

Chained schedules - the component schedules must be completed in a certain order.
Conjunctive schedules - 2 or more simple schedules combined; the components can be completed in any order.

1. What is the difference between continuous and intermittent schedules of reinforcement?

Continuous - the behavior is rewarded every time.
Intermittent - the behavior is rewarded only some of the time.

8. We discussed two ways of getting rid of unwanted behaviors.

DRO - Differential Reinforcement of Other behaviors - reinforce any behavior other than the one you are trying to get rid of. Ex. To stop siblings from fighting, reward anything they do that isn't fighting.
DRI - Differential Reinforcement of Incompatible behaviors - reinforce a different behavior that makes it impossible to do the unwanted behavior. Ex. To stop siblings from fighting, reward them for playing a game together, which can't be done while fighting.

10. What is a DRH schedule of reinforcement?

Differential Reinforcement of High rates of responding - reinforcement is given only for responding quickly (at a high rate); slow responding goes unrewarded.

11. A DRL?

Differential Reinforcement of Low rates of responding - reinforcement is given only for responding slowly (at a low rate).

12. A DRP?

Differential Reinforcement of Paced responding - reinforcement is given for responding at a set pace, neither too fast nor too slow.

9. When does the peak-shift effect occur - during generalization or discrimination?

Discrimination - the highest responding is not at the trained stimulus (S+), but is shifted slightly away from the S-.

6. Having more experience with extinction makes it easier or harder to extinguish another behavior?

Easier

7. How do discriminative stimuli play a role in extinction?

Easier to extinguish if there is a cue that tells the organism that extinction is in effect

2. What is the law of effect (who proposed it and what does it say)?

Edward Thorndike: Any behavior that is followed by pleasant consequences is likely to be repeated, and any behavior followed by unpleasant consequences is likely to be stopped. For example, if you answer a question in class by calling out your answer and the teacher throws a rock at you for it, you will likely stop yelling out your answers. On the other hand, if you answer a question in class and the teacher tosses you a piece of candy, you will be more likely to answer questions in the future.

9. What is the difference between duration (FD, VD) and interval schedules (FI, VI)?

Fixed Duration - if you perform the behavior continuously for a certain amount of time, you get the reward.
Variable Duration - if the behavior continues for an average amount of time, you get the reward.
(In interval schedules, by contrast, the behavior does not have to be performed continuously; reinforcement is given for the first response after the time has elapsed.)

5. Of the four intermittent schedules we discussed, which is associated with a scallop?

Fixed Interval - responding increases as the time of reward gets closer, producing a scalloped pattern.

What is the difference between a fixed interval and a variable interval schedule?

Fixed Interval - reinforcement comes after a certain amount of time has elapsed.
Variable Interval - reinforcement comes after an average amount of time has elapsed.

4. What is the difference between a fixed and variable schedule?

Fixed Ratio - reinforcement is provided after a specific number of responses have been made.
Variable Ratio - reinforcement is provided after an average number of responses have been made.

6. Which is associated with a post-reinforcement pause?

Fixed Ratio. Ex. After winning a game, practice is cancelled.

14. What is the difference between fixed or variable time schedules and fixed or variable interval schedules? What type of schedule is a fixed time schedule?

Fixed Time schedule - reinforcement comes after a certain amount of time regardless of whether the organism has responded.
Variable Time schedule - reinforcement comes after an average amount of time regardless of whether the organism has responded.
Both are non-contingent schedules, so a fixed time schedule is a non-contingent schedule.

18. What is the difference between forward-chaining and backward-chaining? Which typically works better? Why?

Forward chaining: tube (food) -> tube + hurdle (food) -> tube + hurdle + weave (food)
Backward chaining: weave (food) -> hurdle + weave (food) -> tube + hurdle + weave (food)
Backward chaining typically works better because the organism knows when the reinforcement is coming.

3. How does Skinner's operant chamber differ from Thorndike's puzzle box?

Skinner respected what Thorndike did but thought there weren't enough experimental controls, so he set out to study behavior in a controlled laboratory setting. His boxes are called operant chambers (but have also been referred to as Skinner boxes). The boxes could be controlled by a computer to deliver either food or a shock (a pleasant or an unpleasant consequence) to see how those consequences would affect behavior.
He wanted more control over the study of behavior:
- Free operant procedure: if you respond, you get fed
He believed all behavior could be divided into 2 categories:
- Respondent behavior (classical conditioning)
- Operant behavior (operant conditioning)

5. Is it easier to extinguish a behavior in a deprived or full organism? What principle does this example represent?

A full (satiated) organism. This illustrates the Degree of Deprivation principle.

3. Is it easier or harder to extinguish a behavior that has a long history of reinforcement?

Harder

4. Is it easier or harder to extinguish a behavior that has been rewarded with a large (rather than small) reward?

Harder

10. Is immediate or delayed reinforcement better? Why?

Immediate. With delayed reinforcement, other behaviors can occur between the target behavior and the reward, so the connection between the behavior and its consequence is weaker and the wrong behavior may get reinforced.

8. Describe Stampfl's experiment on animal phobic behavior (use a drawing if necessary).

In this experiment, the researchers used a long alleyway. One end was painted black and one end was painted white. Rats are nocturnal (active at night), so they tend to prefer the dark (black) side. The rats were first given the opportunity just to explore, and all of them ended up staying primarily at the black end of the maze.

Once the exploration phase was over, the rats learned that the black end is where they receive a shock. This happened only once: the rats were placed in the start box, they walked to the dark end, and they received a shock. Notice that the walls leading up to the end box were painted black. These black painted walls are the CS - the cue that says shock is coming.

Next the experimenters activated a conveyor belt. Rats were put in the start box and the conveyor belt started moving, bringing them ever closer to the black box where they had received the one shock. The researchers were interested in what the rats would do when they reached the black painted walls (the cue, or CS, for shock). When the rats got to the black walls, they turned around and ran away. The conveyor belt kept moving forward, so they kept running backwards.

Next the researchers installed photobeams set to stop the conveyor belt. If a rat ran backwards through a photobeam, the belt stopped for 3 minutes (allowing the rat to avoid the shock chamber). Rats readily learned to break the photobeams to stop the belt. Importantly, they did not wait until they reached the black painted walls to run back and break the beam; as soon as they passed it, they turned around and ran back. They were now avoiding both the black chamber AND the black painted walls (the CS). The researchers then increased the number of times the rats had to break the photobeam to 10. The rats would ride the belt just past the photobeam, turn around, break it, and repeat 10 times - quite efficient behavior.

Evaluating the rats' behavior, it DOES meet the criteria for human-like phobic behavior. The rats got only one shock. They avoided the area where the shock occurred AND the cue for shock (the black walls). They also never had to relearn - they continued avoiding the walls and the shock 100% of the time even though they were never shocked again (no extinction). Some rats continued for more than 1,000 trials, at which point the researchers finally stopped the experiment.

3. Schedules based on the amount of time that has passed are considered ________?

Interval Schedules

12. What is the difference between intrinsic and extrinsic reinforcement?

Intrinsic - you feel good about doing the behavior itself.
Extrinsic - an external reward follows the behavior.

12. What is the difference between negative contrast and positive contrast? Are these examples of behavioral or anticipatory contrast?

Negative contrast - responding to the second schedule decreases when the first schedule gets better.
Positive contrast - responding to the second schedule increases when the first schedule gets worse.
Both are types of behavioral contrast.

14. How is discrimination training used in animals?

One application is targeting. Targeting, in animal training, means that the animal pays attention, looking for a specific stimulus that signals it should respond. One example is a dolphin watching an underwater TV screen; its job is to press a paddle whenever it sees a specific picture. Every time it sees the "right" picture, it hits the paddle. This work, called vigilance, mimics the work that air traffic controllers or radar specialists do - staring at a screen looking for an unusual blip. Another example is search and rescue dogs. There are different types of search and rescue dogs - some are trained to find survivors while others are trained to find cadavers. They are trained to target (signal) when they find the specific object they are looking for.

9. Draw the box with all 4 contingencies and what they mean. Come up with an example for each of the 4 types of consequences (contingency = consequence here).

Positive Reinforcement - something appetitive is added to increase a behavior. Ex. getting a treat.
Negative Reinforcement - something aversive is removed to increase a behavior. Ex. taking a shower means the bad smell goes away.
Positive Punishment - something aversive is added to decrease a behavior. Ex. getting yelled at for walking into the street.
Negative Punishment - something appetitive is removed to decrease a behavior. Ex. losing a privilege.

11. What is the difference between a primary and secondary reinforcer?

Primary - one that is inherently reinforcing (food, water, sex).
Secondary - one that is reinforcing because it has been associated with a primary reinforcer (money, power).

2. Schedules based on the number of responses an organism makes are considered ______?

Ratio Schedules

6. What 2 classes of consequences are there? How does each class affect behavior?

Reinforcers - increase the likelihood of a behavior occurring again Punishers - decrease the likelihood of a behavior occurring again

13. The three schedules in #10, 11, and 12 are all examples of what type of schedule?

Response Rate schedules

15. How is discrimination training related to studying?

Studying would also be more effective if it were put under stimulus control. If you had a set environment in which to study - without distractions - it would be easier to get into the act of studying and maintain focus. Often students check their phone or computer for social media, chat with friends, listen to music, get snacks, or try to sunbathe while studying. All of these divide your attention. If you always study at a desk, without distractions, you are setting the stage for a focused study session. Over time, the environmental cues will be associated with studying and it will become easier and easier to focus.

13. What is delayed matching to sample used to study?

Memory. Delayed matching to sample is used to study short-term (working) memory in animals.

7. How do you extinguish an operant behavior?

The easiest way to do this is to stop rewarding the behavior; it will naturally weaken and disappear. For example, if a child is throwing a temper tantrum in a store, your first inclination might be to give them what they want to quiet them down or to tell them to behave. The best way to get rid of the behavior is to simply ignore it - do NOT reward the behavior and it will disappear (it might take a while and be very embarrassing in the meantime).

16. How is discrimination training related to sleeping?

The idea here is that you want your bed to be associated with sleep. If you make the bed the cue for sleep, it will be easier to fall asleep each evening. If, however, you use your bed for homework, watching TV, chatting with friends, reading, etc., then sleep is not under stimulus control - you are using that stimulus as a cue for lots and lots of activities, not just sleep. It's harder for your body to understand that it needs to go to sleep if there is no spot that is consistently associated with sleep.

4. What 2 classes of behavior did Skinner believe there were? Describe each.

The responses in classical conditioning he called respondent behaviors (because you respond to a stimulus). Behaviors that get modified by their consequences he called operant behaviors (because the individual OPERATES on the environment).

1. Describe what a shuttle box is and how it is used.

This was originally studied with dogs in an experiment called the shuttle box experiment. The enclosure the dog is in is called a shuttle box. The dog can jump from side to side over a barrier. The floor is metal and hooked up to a shock generator, and experimenters can deliver shock to the left and right sides of the box independently. The dogs in the initial experiment learned to jump over the barrier when they felt a shock in order to get to the side that was shock free. This is escape behavior. In later trials, experimenters paired a light with the shock (a little classical conditioning, where the light predicts shock). The dogs saw the light and learned to jump over the barrier to the safe side before the shock was ever delivered. This is avoidance behavior.

8. Antecedents are often called discriminative stimuli. Describe how discriminative stimuli factor into operant conditioning for animals and humans.

These are cues that tell us whether reinforcers are available.
Animals:
* If the light is on: lever press -> food
* If the light is off: lever press -> nothing
Humans:
* If Jen is in the room: tell a joke -> laugh
* If Jen is not in the room: tell a joke -> no laugh

11. Does the response change come before or after the rf change in anticipatory contrast?

Before. In anticipatory contrast, the response rate varies inversely with an upcoming (anticipated) change in the rate of reinforcement, so the response change comes before the reinforcement change.

3. What two processes are involved in Mowrer's Two-Process Theory?

Two processes are involved in avoidance:
1. Classical conditioning: light + shock -> fear, so eventually light -> fear
2. Operant conditioning: light + climbing over the barrier -> reduced fear (the avoidance response is negatively reinforced by fear reduction)

8. Which has the lowest rate of responding?

Variable Interval

7. Which has the highest rate of responding?

Variable Ratio

10. Does the response change come before or after the rf change in behavioral contrast?

After. Changing the rate of rf in one schedule causes a change in responding on the other schedule, so the response change follows the reinforcement change.

