Learning and Conditioning Final
What is the three-term contingency? Define discriminative stimulus. What is an example of a three-term operant contingency?
-ABC of operant conditioning: antecedent -> behavior -> consequence -a discriminative stimulus is a stimulus in whose presence a particular response is reinforced; it signals that reinforcement is available -example: desks, chalkboard, teacher (antecedent) -> raise hand to speak (behavior) -> teacher calls on you to speak (consequence)
When responding in the presence of one stimulus changes as a result of changes in the reinforcement conditions during another stimulus
Behavioral contrast
The finding that there is no conditioning to a stimulus if it is presented with a previously conditioned stimulus
Blocking
A reduction in responding seen following trials in which the CS, but not the US, is presented
Extinction
True or False: A critical feature of a feedback system is that the output does not influence the input
False
True or False: By altering the order of the list of words, Ebbinghaus demonstrated the effects of contrast
False
True or False: Skinner argued that intervening variables were key to understanding psychology
False
True or False: Watson argued that unobservable events were key to studying psychology
False
True or False: When talking about a reflex, we say the stimulus is elicited by the response
False
Behaving similarly in different situations
Generalization
The finding that after extinction, the response is re-learned at a faster rate
Rapid reacquisition
_____ experimented with cats escaping from puzzle boxes, illustrating the principle he called the Law of _____
Thorndike, Effect
True or False: "Hot-cold" demonstrates the basic associationist principle of contrast
True
True or False: A weakness of the associationist account is that children appear to learn complex ideas before having a grasp of simpler ones
True
True or False: An ethologist studies animal behavior in the "natural" environment
True
True or False: Intervening variables are unobservable, internal events, typically invoked to explain some behavioral process
True
True or False: One of the three basic principles of associationism is contiguity
True
A stimulus that elicits a response without prior learning
Unconditional stimulus
The principle that the noticeable difference between stimuli is proportional to the sizes of the stimuli
Weber's Law
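In formula form (standard notation, not from the card), where I is the stimulus intensity, ΔI is the just-noticeable difference, and k is a constant:
\[ \frac{\Delta I}{I} = k \]
So, with hypothetical numbers, if a 1 g change is just noticeable against a 10 g weight, roughly a 10 g change is needed against a 100 g weight.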
Miller and Dollard argued that _____ was an _____ response
imitation, operant
A _____ is a verbal operant that specifies its _____
mand, reinforcer
2x2 matrix: Peter sees his boss and starts working and his boss doesn't say anything to him
negative reinforcement
2x2 matrix: Phil has a splinter in his finger, he finds a tweezers and pulls it out, increasing the chance he'll do the same in the future
negative reinforcement
2x2 matrix: You're on Facebook and your prof walks toward you, you click on your prep guide and your prof doesn't say anything to you, increasing the chances that you'll close Facebook when he's nearby
negative reinforcement
2x2 matrix: You put your hand on a hot stove, burn it, and are less likely to put your hand on a hot stove in the future
punishment
2x2 matrix: You walk into the bathroom and your uncle Bob is sitting on the toilet and says "whoops, I'll be done here in a second," and you never enter the bathroom again
punishment
"looking backward" and remembering what has already happened
retrospective coding
_____ is often defined as ______ of successive approximations toward a desired behavior
shaping, reinforcement
How did the experimenters use negative punishment in their procedure?
-shock-aversion therapy -Antabuse therapy (nausea, vomiting)
A ____ is a verbal operant that is occasioned by a non-verbal ____
tact, stimulus
Holding limited information for a short period of time
working memory
The pattern of responding on a Fixed Ratio schedule has been described as "______" while a Fixed Interval schedule produces a pattern described as a "______"
break-run, scallop
Behaving differently in different situations
discrimination
An _____ is a verbal operant _____ by other verbal behavior
echoic, occasioned
A _____ reinforcer is effective because it has been repeatedly paired with a _____ reinforcer
conditioned (secondary), primary
What is a reinforcement schedule? Describe the four basic reinforcement schedules. What does a typical behavior pattern of the four schedules look like on a cumulative record? Be sure that you are able to describe the critical components of a cumulative record.
-a reinforcement schedule is a rule that states under what conditions a reinforcer will be delivered
-fixed ratio (FR): the reinforcer is delivered after a fixed number of responses; produces a "stop-and-go" (break-run) pattern, with a post-reinforcement pause after each reinforcer
-variable ratio (VR): the number of responses required varies unpredictably around an average; responding is rapid and fairly steady, and the post-reinforcement pause is usually short
-fixed interval (FI): the reinforcer is delivered for the first response after a fixed amount of time; produces the "scallop" pattern: after the post-reinforcement pause the subject responds slowly, then faster and faster as the interval elapses
-variable interval (VI): the amount of time between available reinforcers varies unpredictably; produces a steady, moderate response rate with no long pause after reinforcers
-critical components of a cumulative record: time on the x-axis and cumulative responses on the y-axis, so the slope of the line shows the response rate
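To make the four rules concrete, here is a minimal Python sketch (illustrative only, not from the course); the function name and the exact random distributions are assumptions chosen just to show how fixed vs. variable requirements differ:

```python
import random

def next_requirement(schedule, value):
    """Return the requirement for the next reinforcer.

    FR/FI use the same fixed value every time; VR/VI draw a new value
    that varies around the mean (value) after each reinforcer.
    """
    if schedule == "FR":   # fixed ratio: a fixed number of responses
        return value
    if schedule == "VR":   # variable ratio: required responses vary around the mean
        return random.randint(1, 2 * value - 1)
    if schedule == "FI":   # fixed interval: a fixed time (e.g., seconds)
        return value
    if schedule == "VI":   # variable interval: required time varies around the mean
        return random.uniform(0, 2 * value)
    raise ValueError(f"unknown schedule: {schedule}")

# Ratio schedules (FR, VR) deliver the reinforcer when the response COUNT since
# the last reinforcer reaches the requirement; interval schedules (FI, VI)
# deliver it for the FIRST response made after the required TIME has elapsed.
```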
How does income affect consumption? Does this depend on the type of reinforcer or commodity?
-with more income, consumers buy fewer cheap or discounted items and more expensive ones -with a reduction in income, consumers start buying cheaper alternatives -so the effect depends on the commodity: demand for cheaper, less-preferred items falls as income rises, while demand for preferred, more expensive items rises
Describe long-delay taste aversion learning. What other examples are provided of biological constraints on classical conditioning? Why were these issues seen as problematic for general-principle learning theories?
-Garcia experiment: rats were given flavored water they had never tasted before, along with injections that made them sick; they associated the taste of the water with the sickness and avoided the water, even though the illness came long after drinking
-biological constraints: the rats' genetic makeup prepares them to associate sickness with something they ate, which improves their chances of survival; tastes are readily associated with illness, whereas lights and sounds are more readily associated with pain (shock), not the reverse
-problem for general-principle learning theories: these theories assume any CS (a noise, a pain, a taste) can be associated with any US equally well, but biology makes taste-sickness associations far more likely
-it also violates the principle of contiguity: these theories hold that learning is unlikely with delays of more than a few seconds, yet animals associate sickness with something they ate hours before
Describe Premack's principle. This is also known as "Grandma's rule." How does Premack's principle illustrate the relativity of reinforcement? How is it used?
-Grandma's rule: "you have to eat your vegetables before you can have dessert"
-a low-probability behavior must be performed before access to a high-probability behavior
-the high-probability behavior serves as a reinforcer for the low-probability behavior: once the subject performs the low-probability behavior, it gets access to the high-probability behavior, which increases the low-probability behavior in the future
-this illustrates the relativity of reinforcement: whether an activity reinforces depends on its probability relative to the other activity, as in Premack's rat experiment with drinking water and running
Describe the role of the wives in this experiment. What happened to marital situations after the beginning of this program?
-the wives participated in marital and family counseling with their husbands, listing specific activities that each spouse agreed to perform to make the other happy
-couples in the community-reinforcement group stayed married, even those who had discussed divorce before treatment
-two of the four couples in the control group permanently separated or divorced
What is the matching law? Describe Herrnstein's experiment on matching and discuss three ways that behavior can deviate from perfect matching.
-the matching law: the proportion of responses allocated to an alternative equals the proportion of reinforcement obtained from that alternative
-Herrnstein's experiment: pigeons in a chamber with two response keys, each on its own concurrent schedule; response proportions matched reinforcement proportions
-undermatching: response proportions are consistently less extreme than reinforcement proportions
-bias: the subject consistently spends more time on one alternative than matching predicts
-overmatching: response proportions are more extreme than reinforcement proportions
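In equation form (standard notation, not from the card), with B_1, B_2 the responses and R_1, R_2 the reinforcers obtained on the two alternatives:
\[ \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2} \]
The three deviations are usually summarized with the generalized matching law, where a < 1 is undermatching, a > 1 is overmatching, and b ≠ 1 is bias:
\[ \log\frac{B_1}{B_2} = a\,\log\frac{R_1}{R_2} + \log b \]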
What was Watson's primary criticism of psychology during his time? Describe Watson's version of behaviorism. How did Skinner alter Watson's theory? What was Skinner primarily critical of? How did Miller respond to Skinner's criticism? Can you think of your own example of an intervening variable as explanatory? Is it useful or dangerous? How so?
-Watson: psychology treated the mind as a "black box"; psychology should deal only with observable events, because psychology should be a science and science deals only with observable events
-Skinner: intervening variables are not useful because they do not improve our ability to predict behavior
-Miller: intervening variables are useful when several independent and dependent variables are involved
-example: "thirsty" rats; we don't know that they are actually thirsty, we infer it from the water deprivation and from the lever pressing to get water
-dangerous because we can easily fool ourselves into thinking we have found the cause of a behavior when it is really a hypothetical, unobservable entity
Describe the conditioned suppression paradigm. Why is it called "suppression"? How is it measured? Why do psychologists use conditioned suppression? Why is it also called the conditioned emotional response (CER) procedure? Can you think of any examples of CER at work in everyday life?
-conditioned suppression: the subject stops whatever behavior it is performing when the CS is presented
-rats in a chamber receive a shock; the CS is presented to signal that the shock is coming, and the rats stop what they are doing
-it is measured as a reduction in the ongoing behavior; in the rat case, lever pressing
-psychologists use it because conditioning takes place in very few trials (fewer than 10); in some cases there is significant suppression to the CS after a single pairing
-it is also called the CER procedure because the suppression reflects an emotional (fear) response to the aversive stimulus, as if the animal is "bracing itself"
-everyday example: freezing up at your desk when you hear your boss's footsteps approaching
What is a "learning set"? How does a discrimination reversal illustrate the concept of a learning set? How have discrimination reversals been used in the laboratory setting? What might these procedures help diagnose?
-a learning set is an improvement in the rate of learning across a series of discrimination problems, which occurs even though the positive and negative stimuli are different from one problem to the next ("learning to learn": transfer from problem to problem)
-discrimination reversal: the roles of the S+ and S- are switched; over time subjects learn that the roles sometimes reverse and adjust more quickly
-early in the experiment, reversals are performed incorrectly for several trials; as the experiment goes on, eventually only one trial is needed
-these procedures can help diagnose brain damage and psychological disorders
Summarize and describe two-factor theory. Why was it necessary? How does the Sidman avoidance procedure shed light on two-factor theory? What was proposed to account for performance in the Sidman procedure?
-two-factor theory: both classical (Pavlovian) conditioning and operant conditioning are necessary for avoidance responses; fear of the shock signal is classically conditioned, and the avoidance response is operantly reinforced by the reduction of that fear
-it was needed because, without the classically conditioned fear of the CS for shock, the dog would not have learned to escape; neither factor works without the other
-Sidman's avoidance procedure at first seems to pose a problem for two-factor theory, because the rats avoided many shocks even though there was no external signal before the shocks
-two-factor theorists proposed that although there is no external stimulus, the passage of time since the last response can serve as the CS
Explain how the matching law can account for the behavior on a single VI schedule as the rate of reinforcement is increased to higher and higher levels.
-the matching law can be applied by assuming there are always "built-in" reinforcers for non-pecking behaviors (grooming, exploring, resting)
-these built-in reinforcers were projected to equal about 30 food reinforcers per hour
-as reinforcement for pecking is raised higher and higher, pecking takes a larger share of behavior and the non-pecking behaviors become less frequent, so response rate rises toward a maximum
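The single-schedule form of the matching law is Herrnstein's hyperbola (standard notation, not from the card), where B is the response rate, R is the obtained rate of reinforcement, R_e is the rate of the alternative "built-in" reinforcers, and k is the maximum possible response rate:
\[ B = \frac{kR}{R + R_e} \]
As R grows much larger than R_e, B approaches k, which is why response rate levels off as the reinforcement rate is pushed higher and higher.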
What happens to the value of reinforcers as delays are introduced? Discuss some treatment implications of delayed reinforcers.
-delayed consequences are less effective in changing behavior than consequences delivered immediately after the target response
-example: cigarette smoking is followed by immediate reinforcers, while the reinforcers for not smoking are greater in the long run but so far off that they exert little control; treatments therefore need to arrange more immediate consequences for the desired behavior
Describe the experiment by Hunt and Azrin. Who were the participants in the study? What was their problem? What supports did Hunt and Azrin provide their participants?
-the study evaluated the effectiveness of community-reinforcement counseling
-participants were hospitalized alcoholics suffering withdrawal symptoms
-supports provided: housing, a didactic program, and other hospital services
-patients were matched in pairs, and a coin flip decided which member of each pair received the community-reinforcement counseling
What are some treatment implications of price and elasticity of demand?
-example: heroin, for which demand is relatively inelastic -law enforcement raises the effective price of the drug (monetary cost plus legal risk) -but because demand is inelastic, raising the price does little to reduce consumption
What were the results of the experiment? How did the researchers measure success? One participant in the control group did very well and one participant in the experimental group did poorly. What did the authors note about these cases?
-the mean percentage of time spent drinking, unemployed, away from home, and institutionalized was more than double in the control group compared with the community-reinforcement group
-the poorly performing experimental-group patient had an IQ of 70, was single, lived with an alcoholic father, held a low-status job, did not attend the social club, and lacked transportation
-the successful control-group member was the highest-functioning participant in the group, had a well-established family and job, and regularly participated in AA
Summarize the main differences between the matching law, optimization theory, and momentary maximizing theory. What has research found about the strengths and weaknesses of these competing theories?
-optimization theory: behavior is distributed across alternatives so as to maximize the total amount of reinforcement obtained
-matching law: response proportions EQUAL reinforcement proportions
-momentary maximizing theory: IN THE MOMENT, the animal chooses whichever alternative has the highest value at that instant
Describe the Jack and Jill experiment. What did it show? Why was this experiment important?
-two pigeons, one speaker and one listener, in a box with a plexiglass divider
-listener: pecks "What color?"; speaker: looks behind a curtain, sees a light, translates the color of the light into one of the symbols on the buttons the listener can see, and pecks the correct symbol button
-listener: pecks "Thank you," which activates the speaker's feeder, then translates the symbol back into a color and pecks the appropriate color key, which activates his own feeder
-accuracy was 90% or better
-important because neither bird could get food without the other; they were "talking" to each other
-follow-up work removed the plexiglass divider and had a single pigeon perform the whole exchange on its own (talking to itself)
The authors suggest that alcohol is a reinforcer. What contributes to the reinforcing value of alcohol according to the authors?
-alcohol produces a pleasant, relaxing state -its taste -social reinforcement from family for drinking -over time, more and more alcohol is required to keep producing those feelings and to avoid withdrawal
What happens to consumption when unit price increases? What happens to spending when unit price increases? How is it that spending can increase when price increases?
-when unit price goes up, consumption goes down
-spending, however, can go up when price goes up
-this happens when demand is relatively inelastic: the consumer spends more in an effort to maintain consumption at a level approximating what was consumed before the price increase, so consumption falls proportionally less than the price rises
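A worked example with hypothetical numbers (not from the card), using spending = price × consumption:
\[ \$1 \times 100 \text{ units} = \$100 \quad\longrightarrow\quad \$2 \times 70 \text{ units} = \$140 \]
Here the price doubles but consumption falls by only 30% (inelastic demand), so spending rises; if consumption had instead fallen to 40 units (elastic demand), spending would have dropped to \$80.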
What is Bandura's theory of imitation? What critique of Miller and Dollard's theory does Bandura offer? Why do we need an additional concept to explain imitation?
-Bandura: reinforcement is not necessary for learning new behaviors through observation, but the expectation of reinforcement is essential for the performance of those behaviors
-four factors determine whether imitation will occur: 1) attentional processes (the observer has to be paying attention), 2) retentional processes (the observer must retain the information), 3) motor reproductive processes (the observer needs the appropriate motor skills), 4) incentive and motivational processes
-critique of Miller and Dollard: their theory of generalized imitation makes no provision for distinguishing between the learning and the performance of imitative behaviors
-both theories can account for the results in different ways; whether we need the additional concept comes down to terminology and to how much we are willing to speculate about processes we cannot observe directly
How do alternative sources of reinforcement affect treatment of psychiatric disorders? Describe an example where manipulation of an alternative source of reinforcement affects a treatment outcome.
-alternative reinforcers can substitute for the reinforcer produced by the target behavior (e.g., TV as a substitute for exercise)
-treatment: remove as many complementary reinforcers from the environment as possible and replace them with substitutes for the problematic behavior
-example: someone trying to quit smoking should also avoid drinking, since drinking is a complement that makes smoking more likely
What is a simultaneous discrimination? What is a successive discrimination? Can you provide real life examples of these processes?
-simultaneous discrimination: the two stimuli to be discriminated are presented at the same time (e.g., picking the right color crayon from the box) -successive discrimination: the stimuli are presented at different times (e.g., recognizing the right aisle at the grocery store as you pass each one)
Give some examples of fixed action patterns. How do you know they are fixed action patterns?
-examples: a squirrel burying nuts (in one experiment, a squirrel raised in isolation from birth still knew what to do with a nut as an adult), the peacock courtship dance
-how we know they are fixed action patterns: they are unique to the species, all members of the species perform them, they are triggered by a releasing stimulus and must be completed from start to end once begun, and they are innate
Describe the Ainslie-Rachlin theory, and use an everyday example to show how it accounts for the reversals in preference that occur in self-control choices.
-Ainslie-Rachlin theory: the value of a reinforcer decreases as the delay between making a choice and receiving the reinforcer increases, and the decrease follows a steep (hyperbolic) curve
-because the curves are steepest just before delivery, the value of a small, immediate reward can temporarily overtake the value of a larger, delayed reward, producing reversals of preference
-example: the value of getting a good grade in the class versus an extra hour of sleep; the night before, you prefer studying (the larger, later reward), but when the alarm goes off the extra hour of sleep (the smaller, sooner reward) looms larger and your preference reverses
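The delay curve is usually written as a hyperbola (standard form, not from the card), where V is the present value of a reward of amount A delivered after delay D, and k measures how steeply the individual discounts:
\[ V = \frac{A}{1 + kD} \]
Because the hyperbola rises sharply as D approaches zero, the value of the smaller-sooner reward can cross above the value of the larger-later reward shortly before the smaller one becomes available, which is the preference reversal.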
What is the tragedy of the commons? Give a few modern-day examples of this problem and describe some strategies that can be used to overcome the problem
-Acting on short-term interests, people make choices that are detrimental to society as a whole
-classic example: a grassy commons where everybody can let their cows graze freely; the arrangement can work for decades or centuries as long as there aren't too many cows
-Hardin argued the tragedy is inevitable because it is to every individual herder's benefit to maximize his own herd
-so each herder decides to add one more cow, and then another, and then another, until the commons is ruined
The development of an unpleasant conditioned response to stimuli associated with an undesirable behavior
Aversive counterconditioning
Conditional responses that act in the opposite direction of the URs
Compensatory responses
Discrimination between classes of stimuli and generalization within classes of stimuli
Concept formation
The finding that when learning in a particular situation, recall will be better in that same situation
Context shift effect
The Partial Reinforcement Extinction Effect (PREE) describes a situation where responding more rapidly extinguishes after ______ reinforcement when compared to responding after ______ reinforcement
Continuous, Intermittent
A teaching technique beginning with stimuli that are very easy to discriminate and slowly proceeding to ones that are more difficult
Errorless discrimination
Why is it important to have a nontreatment control group in your experiment?
It is important so you can compare the treated participants with people who received no treatment and measure how effective the treatment itself was, rather than improvement that would have occurred anyway
_______ fire not only when you perform some activity, but also when you ______ someone else performing that behavior
Mirror neurons, observe
2x2 matrix: Road runner speeds, has money taken away and he's less likely to speed in the future
Omission
The proposal that during Pavlovian conditioning the CS comes to substitute for the US
Stimulus substitution theory
2x2 matrix: Billy wisecracks to his mom, gets slapped and is less likely to wisecrack in the future
punishment
2x2 matrix: Albert raises his hand in class, offers a profound insight and his instructor says "wow, that was an awesome point", increasing the likelihood that he comments in the future
positive reinforcement
2x2 matrix: Katy argues with Russell, and then they make-up, increasing the likelihood of arguing in the future
positive reinforcement
2x2 matrix: Sam clicks FB during class, sees a funny message from her friend which increases the chances she'll open FB in the future
positive reinforcement
Describe several techniques a person could use to help him avoid eating foods that are high in fat and cholesterol.
-precommitment: don't buy the unhealthy snacks in the first place; make a shopping list and stick to it; buy foods that take time to prepare, so the delay decreases their value
-self-reinforcement: allow yourself to watch your favorite TV show only on days you skip dessert
-visualize the body you are striving for before eating
What are some of the economic variables that affect consumer behavior?
price, alternative resources of reinforcement, discounting delayed consequences, and income
When previously learned material impairs new learning
proactive interference