UCLA Psych 110 Final: Experiments


Akins

Control by contextual cues: a study of sexual conditioning used contextual cues as a signal for sexual reinforcement, in much the same way that a discrete CS might be used. Male domesticated quail served in the experiment, which was conducted in an apparatus made up of two adjacent compartments. Before the start of the conditioning trials, the birds were allowed to move back and forth between the two compartments to determine their baseline preference. The nonpreferred compartment was then designated to serve as the CS. Conditioning trials consisted of placing a male bird in its CS context for 5 min, at which point a sexually receptive female was placed with it for another 5 min. These birds received exposure to the CS context paired with sexual reinforcement. Birds in the control group received access to a female in their home cages 2 hours before being placed in the CS context, making the CS and US presentations unpaired. Preference tests were conducted after the 5th and 10th conditioning trials. Results: The paired and unpaired groups showed similarly low preference for the CS compartment at the outset of the experiment, and low preference persisted in the control group. The birds that received the CS context paired with sexual reinforcement came to prefer that context; the association of contextual cues with sexual reinforcement increased preference for those cues. This illustrates that contextual cues can come to control behavior if they serve as a signal for a US or a reinforcer.

Consummatory-Response Theory (Sheffield, 1954)

Species-typical consummatory responses (e.g., eating, drinking) are the critical feature of reinforcers. "Reinforcer responses" are any behaviors that complete the sequence. This approach studies the consummatory response, not the reinforcer stimulus. Consummatory responses are special because they occur in the presence of a reinforcer: eating behaviors such as chewing and swallowing can occur for any reinforcing food item. A problem with this approach in instrumental conditioning is that instrumental behaviors are typically low-probability, unusual behaviors. Bar pressing for food is unusual and not consummatory; pecking for food is more typical but still not consummatory.

Vladimir Bechterev (1913)

Studied associative learning in humans. Signaled avoidance procedure: a trial starts with the presentation of a CS. If no specific response is made, an aversive US occurs a specific time after CS onset. If a response (the avoidance response) is made after the onset of the CS but before the US occurs, the US is omitted. Bechterev thus did not use a standard classical conditioning procedure, but this went unnoticed until a study by Brogden, Lipman, & Culler (1938).

Kamin (1957)

Four groups of rats in a shuttlebox. Group 1: avoids shock & terminates the signal (CS). Group 2: avoids shock, but the signal (CS) remains on. Group 3: receives shock & terminates the signal (CS). Group 4: receives shock, and the signal (CS) remains on. Results showed support for two-process theory: Group 1 showed the most avoidance responding.

Rescorla 2004

A study involving goal tracking as the conditioned response illustrates this. During the initial acquisition phase, a noise CS was paired with the presentation of food in a cup. As conditioning progressed, the rats came to poke their nose in the food cup when the CS was presented. Extinction was then introduced, with eight CS-alone trials conducted during each daily session. Results: During each extinction session, responding was highest during the first two trials and then declined. When the rats were returned to the experimental chamber for the next session, responding was higher than it had been at the end of the preceding session, indicating spontaneous recovery. However, the degree of recovery progressively declined with repeated extinction/test sessions. Acquisition creates the expectation that the US will occur. That expectation is violated when the US is omitted in extinction, and the error is corrected by reduced responding on subsequent extinction trials. Compounding two conditioned stimuli increases the resulting error when the trial ends without the reinforcer, inducing a larger correction and a greater reduction of responding (similar to Rescorla-Wagner). An entirely different outcome occurs if the extinction cue is compounded with a conditioned inhibitor during extinction training: interference with, rather than facilitation of, the extinction process.
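
The error-correction account above can be sketched numerically. The snippet below is an illustrative Rescorla-Wagner-style calculation, not the study's actual model; the learning rate (alpha_beta = 0.2) and starting associative strengths are assumptions chosen for clarity.

```python
# Rescorla-Wagner error correction during extinction (illustrative values).
# V is associative strength; on each no-US trial lambda = 0, so the
# prediction error (lambda - V_total) is negative and V decreases.

def rw_extinction_trial(v_total, lam=0.0, alpha_beta=0.2):
    """Change in associative strength on one extinction trial."""
    return alpha_beta * (lam - v_total)

# Extinguishing a single CS starting at V = 1.0:
v_single = 1.0
v_single += rw_extinction_trial(v_single)   # error = -1.0, V drops to 0.8

# Extinguishing two CSs in compound, each starting at V = 1.0: the summed
# prediction (2.0) makes the error larger, so each CS loses more strength
# per trial than in single-CS extinction.
v_a = v_b = 1.0
delta = rw_extinction_trial(v_a + v_b)      # error = -2.0
v_a += delta
v_b += delta

print(v_single)  # 0.8
print(v_a)       # 0.6 (larger correction under compounding)
```

The same arithmetic shows why compounding with a conditioned inhibitor interferes with extinction: an inhibitor's negative V shrinks the summed prediction and hence the error.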

Herman and Azrin

Availability of multiple reinforcers: a study of adult male smokers. The participants were seated facing two response levers. Pressing either lever was reinforced with a cigarette on a VI schedule. Once lever pressing occurred at a stable rate, responding on one of the levers was punished by a brief obnoxious noise. In one experimental condition, only one response lever was available during the punishment phase. In another condition, both response levers were available, but only one of them was punished. When the punished response was the only way to obtain cigarettes, punishment produced a moderate suppression of behavior. By contrast, when the alternative response lever was available, responding on the punished lever ceased altogether. The availability of an alternative response greatly increased the suppressive effects of punishment.

Ch 10

Avoidance and punishment

Seligman and Johnson (1973)

The avoidance response is driven not by S-R expectancies but by expectancies about response-outcome (R-O) contingencies. During avoidance learning, subjects develop two R-O expectancies: if they perform the avoidance response, no shock will occur; and if they do not perform the avoidance response, shock will occur. This theory explains the effectiveness of US avoidance in supporting learning. It also explains the persistence of avoidance after fear is extinguished, because avoidance is assumed to be driven by R-O expectancies rather than being reinforced by fear reduction.

Lovibond and colleagues (2008)

College students received conditioning with three different stimuli, designated A, B, and C. The stimuli were colored blocks presented on a computer screen. The US was shock to the index finger at an intensity that was uncomfortable but not painful. On trials with Stimulus A, an avoidance conditioning procedure was in effect: Stimulus A was presented for 5 seconds, followed by shock 10 seconds later (A+). If the participant pressed the correct button during the CS, shock was omitted on that trial. Stimulus B received only Pavlovian conditioning as a comparison: each presentation of B was followed by shock (B+) without the opportunity to avoid. Stimulus C was a control stimulus and was never followed by the shock (C-). To track the effects of these procedures, the participants were asked to rate their expectation that shock would occur, and their skin conductance responses were recorded as an index of fear. Ratings of shock expectancy were obtained during the 10-second delay between the CS and the scheduled US. Results: Fear was always low for Stimulus C, because it never resulted in shock. Fear increased across trials for the Pavlovian stimulus B, which ended in shock on each trial (B+). Fear decreased across trials for the avoidance stimulus (A+). The changes in fear to Stimuli A and B were paralleled by changes in expectancy of shock: shock expectancy increased across trials for the Pavlovian Stimulus B but decreased for the avoidance Stimulus A.

Lejuez 1998

College students served as the participants, and air enriched with CO2 was the aversive US. CO2 rather than shock was used because the investigators wanted to produce symptoms related to panic attacks: CO2 inhalation produces respiratory distress, increased heart rate, and dizziness similar to what occurs during a panic attack. During the experiment, the students wore a mask that usually provided room air. To deliver the aversive stimulus, the room air was switched to 20% CO2 for 25 seconds. Results: Response rates were higher during the avoidance sessions than during the control sessions. As the participants acquired the avoidance response, the number of CO2 presentations they received declined. These behavior changes and their consequences occurred even though the CO2 presentations were not signaled by an explicit warning stimulus. The safe period produced by each response (the R-S interval) has to be longer than the interval between shocks that would occur without responding (the S-S interval).
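
The R-S/S-S condition in the last sentence can be made concrete. The intervals below are illustrative numbers, not parameters from the study.

```python
# Free-operant (Sidman) avoidance timing, illustrative numbers only.
# Without responding, an aversive event is scheduled every ss seconds;
# each response starts a fresh safe period of rs seconds.

def time_to_next_event(responded, ss=10.0, rs=20.0):
    """Seconds until the next scheduled aversive event from this moment."""
    return rs if responded else ss

print(time_to_next_event(False))  # 10.0 (no response)
print(time_to_next_event(True))   # 20.0 (a response postpones the event)
```

A response only "buys" extra safety when rs > ss; if the R-S interval were shorter than the S-S interval, responding would not postpone anything, and avoidance would not be learned.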

Rasmussen and Newland

College students worked on a concurrent schedule that involved clicking on moving targets on a computer screen. Two different targets were available at the same time, and clicking on each was reinforced according to a different VI schedule. The reinforcer was gaining money and the punisher was losing money. After responding stabilized on a concurrent schedule that involved only reinforcement in each component, a punishment contingency was added to one of the components. Each participant was tested on nine variations of the concurrent schedules, and the results were analyzed using the generalized matching law, with special emphasis on the bias and sensitivity parameters. Results: Imposing a punishment procedure in one component of the concurrent schedule created a large bias in favor of responding on the unpunished alternative. The punishment contingency also caused a reduction in sensitivity to relative reinforcement rates. Most interestingly, punishment was three times more effective in changing response preference than reinforcement; the authors concluded that losing a penny is three times more punishing than earning the same penny is reinforcing.
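
The generalized matching law used in this analysis can be sketched as follows. The parameter values here are illustrative, not the study's fitted estimates.

```python
# Generalized matching law: B1/B2 = b * (R1/R2)**s, where B is behavior
# (response rate), R is reinforcement rate, b is bias toward alternative 1,
# and s is sensitivity to relative reinforcement rate.

def behavior_ratio(r1, r2, bias=1.0, sensitivity=1.0):
    """Predicted ratio of responding on alternative 1 vs. alternative 2."""
    return bias * (r1 / r2) ** sensitivity

# With no bias and perfect sensitivity, behavior matches reinforcement:
print(behavior_ratio(60, 30))  # 2.0

# Punishing alternative 1 shows up as a bias against it (b < 1) and as
# reduced sensitivity (s < 1), the two effects the study reported:
print(behavior_ratio(60, 30, bias=0.25, sensitivity=0.5))  # ~0.35
```

With bias and sensitivity both depressed, the richer-reinforcement alternative can end up with the smaller share of responding, which is how punishment overrides relative reinforcement rate in this framework.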

Belke and Hancock (2003)

Compared lever pressing on a fixed-interval 30-second schedule reinforced by either sucrose or the opportunity to run in a wheel for 15 seconds. In different phases of the experiment, the rats were tested with different concentrations of the sucrose reinforcer: lever pressing on the FI 30-second schedule was reinforced by wheel running or by sucrose concentrations ranging from 0 to 10 percent. The rate of lever pressing was measured in successive 5-second periods of the FI 30-second schedule. Results: As expected with a fixed-interval schedule, response rates increased closer to the end of the 30-second period. Wheel running as the reinforcer was just as effective as 2.5% sucrose. Wheel running was more effective than 0% sucrose, but at a sucrose concentration of 10%, responding for sucrose exceeded responding for running.

Goodall (1984)

Compared lever-press responding in rats in the presence of two different stimuli (a tone and a light). One of the stimuli was used with a punishment procedure (the PUN cue) and the other with a conditioned suppression procedure (the CER cue). Lever pressing was always reinforced on a VI 60-second food reinforcement schedule. Once baseline responding was well established, the PUN cue and the CER cue were presented periodically. During the PUN cue, the rats received a brief shock every third lever press; punishment was delivered on an FR 3 schedule. Each CER trial was yoked to the preceding punishment trial, so that the rats received the same number and distribution of shocks during the CER cue as they got during the immediately preceding PUN cue. Shocks during the CER cue were always delivered independent of lever-press behavior. Results: Given the brief and mild shocks that were used, not much suppression of behavior was evident during the CER stimulus. By contrast, the same number and distribution of shocks substantially suppressed responding during the punishment stimulus. Delivering shocks contingent on an instrumental response is more effective in suppressing that response than delivering the aversive stimulus independent of behavior.

Harris et al. (2008)

Compound stimuli; positive and negative patterning. Four different CSs were used in the experiment: a noise, a tone, a flashing light, and a steady light. Each CS presentation lasted 30 seconds, and reinforced trials ended with the delivery of food into a cup. Conditioned responding was nosing the food cup during the CS. The assignment of the auditory and visual cues was arranged so that each compound stimulus (AB and CD) was made up of one auditory and one visual cue. Training sessions consisted of six types of trials (A-, B-, AB+, C+, D+, CD-) intermixed. Results: The rats were able to solve both discrimination problems. With the positive patterning procedure, they learned to respond when A and B were presented together but not when each was presented alone. In the negative patterning procedure, they learned to respond to C and D alone but to withhold responding when C and D were presented together. The rats learned to respond to the combination of two cues in a manner that cannot be attributed to the sum of their responses to the individual cues, consistent with the interpretation that the stimulus configurations created by AB and CD acquired unique control over conditioned responding. The learning was not fast (especially in the case of negative patterning), but it was clearly evident.

Theories of Punishment

Conditioned Emotional Response Theory of Punishment (Estes, 1944): conditioned suppression involves suppression of ongoing behavior elicited by a stimulus associated with aversive stimulation. Behavioral suppression occurs primarily because the fear-conditioned stimulus elicits freezing, which interferes with other activities. Avoidance Theory of Punishment: punishment is a form of avoidance behavior. Punishment and the Negative Law of Effect: positive reinforcement and punishment involve symmetrically opposite processes (historical theory), or the punisher and reinforcer have equal but opposite effects (more current theory), or punishment is more effective than reinforcement (Rasmussen and Newland, 2008).

Brogden, Lipman, & Culler (1938)

Directly compared classical conditioning to avoidance learning. Guinea pigs in a running wheel; CS = tone, US = shock. The shock stimulated the guinea pigs to run in the wheel (UR). Group 1 (classical conditioning): CS-US pairing on every trial. Group 2 (avoidance learning): the US followed the CS unless the guinea pigs ran. Results: The avoidance group quickly learned to make the conditioned response and was responding on 100% of the trials within 8 days of training. With this high level of responding, these guinea pigs managed to avoid all scheduled shocks. The classical conditioning group never achieved this high level of performance. The results showed that avoidance conditioning is different from standard classical conditioning and ushered in years of research on instrumental avoidance.

Mechanisms of the partial reinforcement extinction effect

Discrimination Hypothesis: the introduction of extinction is easier to detect after continuous reinforcement than after partial reinforcement. (However, the learning produced by partial reinforcement is long-lasting.) Frustration Theory: persistence in extinction results from learning something counterintuitive, namely to continue responding when you expect to be nonreinforced or frustrated. Sequential Theory: assumes individuals can remember whether they were reinforced for performing the instrumental response in the recent past; nonreward becomes a cue for performing the instrumental response.

Schaal

Discrimination training focused on interoceptive cues; the study compared the strength of stimulus control by the interoceptive cues of cocaine before and after discrimination training. During generalization tests, the pigeons received no drug or various doses of cocaine ranging from 0.3 to 5.6 mg (responding was not reinforced during test sessions). Results: the generalization gradient as a function of drug dose was fairly flat, indicative of weak stimulus control. During the next phase, the pigeons were trained to discriminate cocaine from the absence of the drug. Some sessions were preceded by an injection of cocaine (S+), as before, and pecking was reinforced; during other sessions, cocaine was not administered (S-) and pecking was not reinforced. The pigeons learned this discrimination, responding strongly during S+ sessions and much less during S- sessions. Once the discrimination was established, generalization tests were conducted as before. Results: the generalization gradient was now much steeper, showing much stronger control by the internal drug stimuli. The greatest level of responding occurred when a pigeon was tested with the dose of cocaine that had been used during reinforced sessions. Virtually no responding occurred during sessions with no drug or with just 0.3 or 1 mg of cocaine, and responding also declined somewhat when the test dose was 5.6 mg, which exceeded the training dose. Discrimination training increased stimulus control by the internal sensations created by cocaine.

Kearns et al. (2005)

Drug discrimination study. Experimental group: tone/clicker (S+) with lever pressing on a VI schedule for cocaine; light (S-) with no reinforcement. Control group: tone/light with lever pressing on a VI schedule for cocaine on half of the trials; light only on the other half of the trials. Summation testing: tone alone versus the tone/light combination. Both groups showed vigorous responding to the tone. Adding the light to the tone did not disrupt responding in the control group but produced a profound suppression of lever pressing in the experimental group. The suppression in the experimental group shows that a stimulus that is a signal for nonreinforcement (S-) in a discrimination procedure acquires active inhibitory properties, as predicted by Spence.

Azrin

Effects of schedules of punishment (FR punishment): pecking was reinforced with food on a VI schedule and punished on an FR schedule while the VI reinforcement schedule remained in effect. Results: When every response was shocked (FR 1 punishment), key pecking ceased entirely. Higher FR punishment schedules allowed more responses to go unpunished, and higher rates of responding occurred when higher FR punishment schedules were used. Remarkably, some suppression of behavior was observed even when only every 1,000th response was followed by shock. Punishment is more effective when it is delivered on a continuous rather than an intermittent schedule. Note how this differs from the use of reinforcers, where partial reinforcement produces more persistent behavior.

Balleine

Evidence of R-O associations: devaluing the reinforcer after conditioning (reinforcer devaluation) makes the reinforcer less attractive. R-O associations are involved in instrumental drug-seeking behavior. R-O mechanisms predominate in free-operant situations, whereas S-R mechanisms are activated when drug taking is a response to drug-related cues.

Jenkins and Harrison

Examined how auditory stimuli that differ in pitch can come to control the pecking behavior of pigeons reinforced with food. When pigeons are reinforced with food, visual cues ordinarily exert stronger stimulus control than auditory cues, but Jenkins and Harrison found that with the proper training procedures the behavior of pigeons can come under the control of auditory cues. They evaluated the effects of three different training procedures. In all three procedures, a 1000-cps tone was present when pecking a response key was reinforced with access to food on a variable-interval schedule. One group of pigeons received a discrimination training procedure in which the 1000-cps tone served as the S+ and the absence of the tone served as the S-; pecking was reinforced on trials when the S+ was present but not when the tone was off (S-). A second group also received discrimination training; the 1000-cps tone again served as the S+, but the S- was a 950-cps tone. The control group received no discrimination training. Each group was then tested for pecking in the presence of tones of various frequencies to see how precisely pecking was controlled by pitch. The control group responded nearly equally in the presence of all of the test stimuli; the pitch of the tones did not control their behavior, and they acted tone deaf. Each of the other two training procedures produced more stimulus control by pitch. The steepest generalization gradient, and hence the strongest stimulus control, was observed in birds trained with the 1000-cps tone as S+ and the 950-cps tone as S-. Pigeons that had received discrimination training between the 1000-cps tone (S+) and the absence of tones (S-) showed an intermediate degree of stimulus control by tonal frequency.

Ch 9

Extinction

Campolattaro, Schnitker, and Freeman (classical conditioning)

Eyeblink conditioning in lab rats. A low-pitched tone and a high-pitched tone served as the CSs. On half of the trials, one of the tones (A+) was paired with the US; on the remaining trials, the other tone (B-) was presented without the US. By the 15th session, the rats responded to A more than 85 percent of the time. Responding to B also increased at first, but not as rapidly, and by the end of the experiment the data showed clear differential responding to the two tones. Results: This pattern is typical of discrimination training in which the reinforced (A) and nonreinforced (B) stimuli are of the same modality. The conditioned responding that develops to A generalizes to B at first, but with further training responding to B declines and a clear discrimination becomes evident.

Jenkins (1962) and Theios (1962)

First trained one group of animals with partial reinforcement and another with continuous reinforcement. Both groups then received a phase of continuous reinforcement before extinction was introduced. Because extinction was introduced immediately after continuous reinforcement for both groups, extinction should have been equally noticeable, or discriminable, for both. Nevertheless, the subjects that initially received partial reinforcement training responded more in extinction. The results of Jenkins and Theios indicate that the response persistence produced by partial reinforcement does not come from greater difficulty in detecting the start of extinction. Rather, individuals learn something long-lasting from partial reinforcement that carries over even if they subsequently receive continuous reinforcement.

When is Positive Punishment effective?

Punishment is effective when the first use of punishment is a very intense aversive stimulus (for example, a very hard spanking that hurts); when the target behavior is always followed by the aversive stimulus (someone would have to be monitoring the child 24/7); when it is given immediately after the target behavior (so someone would have to be there all the time to deliver it); and when no discriminative stimulus for punishment is available. If parents deliver the punishment and there is no punishment when the parents are not around, the parents become a discriminative stimulus for punishment.

Schiff, Smith, and Prochaska 1972

Flooding (response blocking). Rats were trained to avoid shock in response to an auditory CS by going to a safe compartment. After acquisition, the safe compartment was blocked off by a barrier, and the rats received various amounts of exposure to the CS without shock: different groups received 1, 5, or 12 blocked trials, and on each of these trials the CS was presented for 1, 5, 10, 50, or 120 seconds. The barrier blocking the avoidance response was then removed to test for extinction. At the start of each test trial, the animal was placed in the apparatus and the CS was presented until the animal crossed into the safe compartment. Shocks never occurred during test trials, and each animal was tested until it took at least 120 seconds to cross into the safe compartment on three consecutive trials. The strength of the avoidance response was measured by the number of trials required to reach this extinction criterion. Results: Exposure to the CS facilitated extinction of the avoidance response. This effect was determined mainly by the total duration of CS exposure; the number of flooding trials administered (1, 5, or 12) facilitated extinction only because each trial added to the total CS exposure time. Increases in the total duration of blocked exposure to the CS resulted in more extinction.

Azrin, Hutchinson, Hake

Frustration: aggression induced by extinction was demonstrated by an experiment in which two pigeons were placed in the same Skinner box. One of them was initially reinforced for pecking a response key, while the other bird was restrained in a back corner of the experimental chamber. After pecking was well established, the key-pecking bird experienced alternating periods of continuous reinforcement (CRF) and extinction. While reinforcement was available for pecking, the key-pecking bird largely ignored the other bird in the back of the chamber. When extinction was introduced, the previously rewarded pigeon attacked its innocent partner. Aggression was most likely early in each extinction period and subsided thereafter. Aggression also occurred if a stuffed model instead of a real pigeon was placed in the Skinner box. Extinction-induced aggression has been observed with pigeons, rats, and people, and can be a problem when extinction is used in behavioral therapy.

Hanson 1959

Hanson examined the effects of intradimensional discrimination training on the extent to which various colors controlled pecking behavior in pigeons. All the pigeons were reinforced for pecking in the presence of a light whose wavelength was 550 nm. However, independent groups differed in how similar the S- was to the S+ (how expert the pigeons had to become at telling the colors apart). One group received discrimination training in which the S- was a color of 590-nm wavelength. For another group, the wavelength of the S- was 555 nm, only 5 nm away from the S+. A control group received no discrimination training and was simply reinforced at 550 nm. Results: The control group responded most to the color of the S+ stimulus and responded progressively less as the color of the test stimuli became more different from the original: an excitatory generalization gradient centered at the S+. The 590-nm pigeons responded at high rates to the 550-nm color that had served as the S+, but showed more generalization of the pecking response to the 540-nm color, responding slightly more to it than to the S+. The 555-nm pigeons showed much lower rates of responding to the original S+ (550 nm) than either of the other groups; their highest response rates occurred to colors of 540 and 530 nm. In both discrimination groups, peak responding shifted away from the S- (a peak shift).

Monfils et al, 2009 (Schiller et al 2010)

The hypothesis was first tested with rats. Based on the success of those experiments, the strategy was also evaluated in human fear conditioning. Participants first received fear conditioning in which a colored square was paired with mild shock to the wrist. When participants came back the next day, some received a single exposure to the fear-conditioned CS, followed by a series of extinction trials conducted either 10 min later (within the reconsolidation window) or 6 hours later (outside the reconsolidation window). A third group received extinction trials without a prior priming presentation of the CS. All the participants returned to the laboratory on day 3 for a test of spontaneous recovery of fear; conditioned fear was measured in terms of the skin conductance response. Results: All three groups showed a substantial fear response during acquisition and little fear at the end of the extinction phase. When they were tested again the next day, there was substantial spontaneous recovery of fear in the nonprimed control group and in the group that received extinction training outside the reconsolidation window. There was no spontaneous recovery if extinction training was conducted within the reconsolidation window. Subsequent studies showed that these effects are evident as long as a year after original training and are specific to the fear CS that is used to initially prime memory retrieval.

Thomas, McKelvie, and Mah

Illustrates control by contextual cues that are not correlated with the availability of reinforcement. Pigeons were first trained on a line-orientation discrimination in which a vertical line (90°) served as the S+ and a horizontal line (0°) served as the S-. The birds were periodically reinforced with food for pecking on S+ trials and were not reinforced on S- trials in a Skinner box. After the discrimination was well learned, the contextual cues of the experimental chamber were changed, and in the presence of these new contextual cues the discrimination training contingencies were reversed: now the horizontal line served as the S+ and the vertical line was the S-. Generalization tests were then conducted across different line angles in each context. Results: The shape of the generalization gradient in each context was appropriate to the discrimination problem that had been in effect in that context. In Context 1, birds responded most to the 90° stimulus, which had served as the S+ in that context, and least to the 0° stimulus; the opposite pattern of results occurred in Context 2. These findings show that control by contextual cues can develop without one context being more strongly associated with reinforcement than another. A likely possibility is that each context activated a different memory: Context 1 activated the memory of reinforcement with the vertical line and nonreinforcement with the horizontal line, while Context 2 activated the memory of reinforcement with the horizontal line and nonreinforcement with the vertical line. The birds learned a conditional relation: if Context 1, then recall the vertical line as S+; if Context 2, then recall the horizontal line as S+.

O'Donnell et al, 2000

Initial reinforcement training: pressing a lever to earn points that could be exchanged for money. Discriminative stimuli: responding to a specific "target" line length. During the baseline phase, only one of the discriminative-stimulus lines was presented, and responses were reinforced on a VI schedule. After that, the first line was alternated with the other discriminative stimulus. Responding continued to be reinforced according to the VI schedule during the second line, but now a point-loss punishment contingency was also in effect: with each response, points were subtracted from the participant's total. During punishment (note: this is also discriminative punishment), various line lengths were presented; points continued to be given for responding to the target line length, and points were taken away for responding to alternative line lengths. Results: Responses continued at near-baseline levels for the target line length, while responding to alternative line lengths stayed very low. In this study, choosing the incorrect line length was punished, suppressing responses to those line lengths. Taking away points was punishing; there was no need to shock the students when they made the wrong choice. The response suppression produced by punishment depends in part on features of the aversive stimulus.

Rescorla

Laboratory rats first received discrimination training in which a common response was reinforced with food pellets whenever a light or noise stimulus (L or N) was present. This training was conducted so that nonreinforcement in the presence of L or N would elicit frustration when extinction was introduced. The targets of extinction were lever-press and chain-pull responses (designated R1 and R2). R1 and R2 were first reinforced with food pellets; this reinforcement did not occur in the presence of the light and noise stimuli, so the reinforcement training was not expected to establish any S-R associations involving the light and noise. Extinction was conducted in the third phase and consisted of presentations of L and N (to create the expectancy of reward) with either R1 or R2 available but nonreinforced. The extinction phase presumably established inhibitory S-R associations involving N-R1 and L-R2. The presence of these associations was tested by giving subjects the opportunity to perform R1 or R2 in the presence of the L and N stimuli. If an inhibitory N-R1 association was established during extinction, the subjects were predicted to make fewer R1 than R2 responses when tested with N; they were expected to make fewer R2 than R1 responses when tested with L. Such a differential outcome cannot be explained in terms of changes in R-O or S-O associations, because such changes should have influenced R1 and R2 equally. Results: Responding is shown for the intertrial interval (ITI) and in the presence of the stimulus (L or N) with which the response had been extinguished or not. Responding during the stimulus with which the response had been extinguished was significantly less than responding during the alternative stimulus, and the extinction stimulus produced responding not significantly higher than what occurred during the intertrial interval. The results indicate that the extinction procedure produced an inhibitory S-R association that was specific to a particular stimulus and response.

Neuringer, Kornell & Olufs

Measurement of response variability. The experimental chamber had two response levers on one wall and a round response key on the opposite wall. During the reinforcement phase, the rats had to make three responses in a row to obtain a food pellet. One group of rats was reinforced for varying its response sequences: they got food only if the sequence of responses they made on a particular trial was different from what they did on earlier trials. Each rat in the second, yoked, group was also required to make three responses to get reinforced, but for them there was no requirement to vary how they accomplished that. After responding was well established by the reinforcement contingencies in both groups, extinction was introduced and food was no longer provided no matter what the rats did. Results: Reinforcement produced the expected difference between the two groups in the variability of their response sequences. Rats reinforced for varying their responses showed much more variability than those that did not have to vary their behavior (yoked). The yoked group responded somewhat faster, perhaps because they did not have to move as frequently from one manipulandum to another. Extinction produced a decline in the rate of responding in both groups, but the decline occurred in the face of an increase in the variability of the response sequences the rats performed. Both groups showed a significant increase in the variability of their response sequences during the extinction phase; the increase was evident during the first extinction session and grew during subsequent sessions. Extinction produced a decline in the number of response sequences the rats completed, but it increased the variability of those sequences.

Fetsko et al. (2005) goal-tracking procedure: Light then Noise (CS) --- Food (US); Noise without Light (CS) --- No Food

Modulators can also control CS-US relations in classical conditioning. Measured the number of times the head went into the food cup. There was more responding to the light-noise sequence than to the noise alone, and very little responding to the light alone (see figure). The light facilitates responding to the noise CS, and this occurred even though the modulator itself did not elicit responding.

Thomas et al 2009

Multiple-context extinction of fear using a conditioned suppression procedure. Lab rats were first trained to press a response lever on a VI 60-second schedule with a drop of sucrose as the reinforcer, to create a behavioral baseline for the measurement of conditioned suppression. Fear conditioning was then conducted in which an auditory CS (termination of background noise for two minutes) was paired with a brief shock 10 times. Acquisition took place in a distinctive context designated as A. For the next phase, the rats were moved out of the acquisition context and received either 36 or 144 extinction trials. For some animals, only one context (B) was used for all of the extinction trials; for other animals, three different contexts were used (B, C, D). All the rats were then returned to the context of acquisition (A) and tested for renewal of conditioned fear. Results: By the end of the extinction training, all of the groups showed virtually no fear of the CS. However, fear reappeared when the rats were returned to context A. The renewal effect was evident for all of the groups except the one that received a large number of extinction trials (144) spread across three different contexts. Elimination of the renewal effect required extensive extinction training in multiple contexts.

Summary: Factors that Affect Extinction

Number and spacing of extinction trials: more trials lead to greater extinction, and more widely spaced extinction trials lead to better extinction. Repetition of extinction/test cycles: more repetitions lead to greater extinction. Conducting extinction in multiple contexts: more contexts lead to more extinction learning (reducing renewal). Reminder cues: cues present during extinction can be effective even when the CS is tested in a different context from where extinction took place. Compounding extinction stimuli: a stimulus extinguished together with another previously trained stimulus undergoes deepened extinction.

Foree and LoLordo

Overshadowing. Two groups of pigeons were trained to press a foot treadle in the presence of a compound stimulus consisting of a red light and a tone whose pitch was 440 cps. When the light-tone compound was absent, responses were not reinforced. For one group of pigeons, treadle pressing was reinforced with food; for the other group, pressing was reinforced by the avoidance of shock. If the avoidance group pressed in the presence of the light-tone compound, no shock was delivered on that trial; if they failed to respond during the compound, a brief shock was periodically applied until a response occurred. Results: Both groups of pigeons learned to respond during the light-tone compound. The researchers then wanted to see which of the two elements of the compound was primarily responsible for the behavior, so the tone and light were presented one at a time. Pigeons conditioned with food reinforcement responded much more when tested with the light alone than with the tone alone. With shock-avoidance reinforcement, the tone acquired more control over the response than the red light.

Paradoxical reward effects

Overtraining extinction effect: more training with continuous reinforcement produces stronger frustration during extinction, which produces more rapid extinction. Magnitude of reinforcement extinction effect: responding declines more rapidly in extinction following reinforcement with a larger reinforcer. Partial reinforcement extinction effect (PREE): the schedule of reinforcement in effect before the extinction procedure determines the magnitude of the behavioral and emotional effects of extinction; intermittent reinforcement creates persistence in responding (example: slot machines in Las Vegas).
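One classic textbook account of the PREE, the discrimination hypothesis, can be sketched in a few lines of Python. This is an illustration only: the function and all numbers are invented, not drawn from any of the studies in these notes. The idea is that extinction is "detected" when the run of nonreinforced trials exceeds anything experienced during training, and partial reinforcement produces long nonreinforced runs in training.

```python
# Toy sketch of the discrimination hypothesis of the PREE; the function
# and parameter values are invented for illustration.
import random

def trials_to_detect_extinction(reinforced_pattern):
    """Nonreinforced trials in extinction needed before the current run of
    failures exceeds the longest failure run experienced in training."""
    longest_run = run = 0
    for reinforced in reinforced_pattern:
        run = 0 if reinforced else run + 1
        longest_run = max(longest_run, run)
    return longest_run + 1

continuous = [True] * 40                              # CRF: every response reinforced
random.seed(1)
partial = [random.random() < 0.5 for _ in range(40)]  # 50% partial reinforcement

ctrials = trials_to_detect_extinction(continuous)     # 1: a single failure is novel
ptrials = trials_to_detect_extinction(partial)        # larger: persistence (the PREE)
```

On this account, continuous reinforcement makes the very first nonreinforced trial discriminably different from training, while a 50% schedule requires a much longer run of failures before extinction is distinguishable, which is the persistence the PREE describes.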

Krank et al., 2008

Pavlovian instrumental transfer test: effects of a Pavlovian CS for alcohol (ethanol) on instrumental responding reinforced by alcohol. Results: The rats pressed each response lever about twice per minute before the CS was presented. For the unpaired group, lever pressing did not change much when the CS was presented on either the right or the left. In contrast, the paired group showed a significant increase in lever pressing during the CS period if the CS was presented on the same side as the lever the rat was pressing. Shows that a Pavlovian CS for ethanol will increase instrumental responding reinforced by ethanol.

Horsley et al 2012

Persistence in extinction was compared between individuals who rarely gambled and those who gambled frequently (more than three times a week). The instrumental conditioning task was a computer game in which the participant had to produce the correct combination of a lock that would open a treasure chest. The reinforcer was seeing the gold coins in the chest along with a congratulatory laugh and the verbal feedback "correct." Independent groups received either continuous reinforcement or 50% partial reinforcement until they made 20 correct responses. Thereafter, responding produced the message "incorrect" and the sound of a jammed lock for 20 trials of extinction. Results: The various groups made similar numbers of responses during acquisition. Among both the low and the high gamblers, partial reinforcement yielded more responses in extinction than continuous reinforcement. However, the magnitude of this partial reinforcement extinction effect was greater among the high gamblers.

EFF (Escape From Fear) Procedure

Phase 1: Pavlovian fear conditioning. Group 1 receives delayed conditioning, Group 2 receives simultaneous conditioning, and Group 3 gets the CS and US unpaired. Phase 2: shuttlebox avoidance training with no shock presented; the CS is presented, and shuttle avoidance responses turn off the CS. The delayed and simultaneous groups both show decreased shuttle avoidance latencies across trials; the unpaired group does not. This suggests that the response is negatively reinforced simply as a function of turning off the aversive CS, which lends support to two-process theory.

Holland and Rescorla

Pioneers in research on the modulation of conditioned responding in Pavlovian conditioning. Holland wanted to call the Pavlovian modulator an occasion setter because the modulator sets the occasion for reinforcement of the target CS; Rescorla wanted to call the modulator a facilitator because the modulator facilitates responding to the target CS.

Church

Positive punishment is not always effective in suppressing behavior; effectiveness is determined by the type of aversive stimulus. In studies with rats, shock is normally used because it is easy to control its intensity: high-intensity shock is very effective at suppressing behavior, whereas lower-intensity shock results in less suppression. Positive punishment is also more effective when the aversive stimulus is intense and prolonged from the start of punishment. If low-intensity aversive stimuli are used at the beginning of punishment and then gradually increased in intensity, they are less effective at suppressing behavior; using intense aversive stimuli from the beginning is the most effective approach. Another factor in punishment is how the aversive stimulus is introduced: if high-intensity shock is used from the outset of punishment, the instrumental response will be severely suppressed, whereas exposure to low-intensity punishment builds resistance and makes the individual relatively immune to the effects of more severe punishment later. How organisms respond during their initial exposure to punishment determines how they will respond to punishment afterward.

How else can we explain avoidance?

Positive Reinforcement through Conditioned Inhibition of Fear or Conditioned Safety Signals Reinforcement of Avoidance through Reduction of Shock Frequency Avoidance and Species-Specific Defense Reactions Predatory Imminence and Defensive and Recuperative Behaviors Expectancy Theory of Avoidance

Holz & Azrin (1961)

Punishment as a signal for the availability of positive reinforcement. Pigeons were trained to peck for food and then punished for pecking with a mild shock, which reduced the behavior by 50%. Periods of punishment alternated with periods of no punishment, and food was available only during the punishment periods, with no environmental cues such as a light or tone. As a result, punishment became a signal for food reinforcement.

Demonstrations of Sidman Avoidance Learning

Rate of responding is controlled by the length of the S-S and R-S intervals. The more frequently shocks are scheduled in the absence of responding (S-S interval), the more likely the participant is to learn the avoidance response. Increasing the duration of safety produced by the R-S interval also promotes avoidance responding. The safe period produced by each response (R-S interval) has to be longer than the interval between shocks that would occur without responding (S-S interval). Efficient free-operant avoidance responding requires keeping track of time: there are no external cues to signal when the R-S interval is about to end, so internal temporal cues must be used.
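The timing logic of the S-S and R-S intervals can be sketched as a small simulation. This is not any specific experiment: the function and all parameter values are invented to illustrate the contingency.

```python
# Minimal sketch of free-operant (Sidman) avoidance timing; all values
# are invented for illustration. Shocks occur every `ss` seconds unless
# a response postpones the next shock to `rs` seconds after the response.

def simulate_sidman(response_times, ss, rs, session_length):
    """Return the times at which shocks are delivered."""
    shocks = []
    next_shock = ss                  # first shock scheduled one S-S interval in
    responses = sorted(response_times)
    i = 0
    while next_shock <= session_length:
        if i < len(responses) and responses[i] < next_shock:
            next_shock = responses[i] + rs   # R-S interval: response postpones shock
            i += 1
        else:
            shocks.append(next_shock)
            next_shock += ss                 # S-S interval: shocks recur without responding
    return shocks

# A rat that never responds is shocked every S-S = 5 s
all_shocks = simulate_sidman([], ss=5, rs=10, session_length=60)
# A rat that responds every 4 s (faster than the R-S interval) avoids every shock
no_shocks = simulate_sidman([4 * k for k in range(1, 16)], ss=5, rs=10, session_length=60)
```

Note that steady responding pays off here precisely because, as stated above, the R-S interval (10 s) is longer than the S-S interval (5 s).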

Bolles

Recognized this problem and focused on what controls an organism's behavior during the early stages of avoidance training. Bolles assumed that aversive stimuli and situations elicit strong unconditioned, or innate, responses. These innate responses are assumed to have evolved because, during the course of evolution, they were successful in defense against pain and injury: species-specific defense reactions (SSDRs).

LaBar and Phelps 2005

Reinstatement effect. Yale undergraduates served as participants. The CS was a blue square presented on a computer screen for 4 seconds. On each acquisition trial, the CS ended with a 1-second burst of very loud noise (the US). This resulted in the conditioning of fear to the visual CS, measured by the skin conductance response. Acquisition trials were followed by eight extinction trials. Four reinstatement noise bursts were then presented, either in the same test room or in a different room. All the students were then tested for fear of the CS in the original context. Results: Conditioned fear increased during the course of original acquisition and decreased during extinction. Subsequent US presentations in the same room resulted in recovery of the extinguished skin conductance response; US presentations in a different room did not produce this recovery. The reinstatement effect was context specific.

Bouton

Renewal effects. Examined the renewal effect in the extinction of instrumental lever-press responding in lab rats. Standard Skinner boxes were modified to create distinctively different contexts for the different phases of the experiment; contexts A and B were used equally often. Rats received five 30-min sessions of acquisition during which lever pressing was reinforced with a food pellet on a VI 30-second schedule of reinforcement in context A. Results: By the end of the acquisition phase, the rats were responding nearly 30 times per minute. As expected, responding declined during the next four sessions when extinction was introduced and lever presses no longer produced food. The next day, the rats received two 10-min test sessions, one in context A and the other in context B; no reinforcement for lever pressing was available during testing. Responding was near zero when the rats were tested in the extinction context B. In contrast, when the rats were tested in the original context A, responding recovered substantially. This suggests that the renewal effect is not due to context-reinforcer associations. Bouton has suggested that contextual cues serve to disambiguate the significance of a CS.

Johnson and colleagues (2003)

Response deprivation. Classroom setting with students with moderate to severe mental retardation. For each student, teachers identified things the student was not likely to do. The opportunity to engage in a less likely behavior became an effective reinforcer for a more likely behavior if access to the less likely behavior was restricted below its baseline level. This is contrary to the Premack principle and shows that response deprivation is more basic to reinforcement effects than differential response probability.

Premack Principle (Premack, 1965)

Responses ("consummatory behavior") are special because they are more likely to occur than other behaviors: hungry animals are likely to eat, whereas thirsty animals are likely to drink. To predict what will be reinforcing, observe the baseline frequency of different behaviors. This encourages thinking about reinforcers as responses rather than as stimuli and greatly expands the range of activities investigators can use as reinforcers: any behavior can serve as a reinforcer provided that it is more likely than the instrumental response. Highly probable behaviors will reinforce less probable behaviors; equivalently, highly desirable behaviors will reinforce less desirable behaviors. For example, put a hungry rat into a Skinner box. If free food is available, how much bar pressing occurs? But when food access is restricted and made contingent on bar pressing, what will the rat do? Bar pressing is a low-probability behavior while eating food is a high-probability behavior, so high-probability eating is made contingent on low-probability bar pressing.
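The Premack principle reduces to a simple comparison of baseline probabilities. A minimal sketch follows; the behavior names and probability values are invented for illustration, not taken from Premack's data.

```python
# Illustrative sketch of the Premack principle; the behavior names and
# baseline probabilities below are invented, not Premack's data.

def can_reinforce(baseline, reinforcer_behavior, instrumental_behavior):
    """A behavior can reinforce another only if its baseline
    probability exceeds the instrumental behavior's."""
    return baseline[reinforcer_behavior] > baseline[instrumental_behavior]

# Hungry rat: eating is frequent at baseline, bar pressing is rare
baseline = {"eating": 0.60, "wheel_running": 0.20, "bar_pressing": 0.05}

eat_reinforces_press = can_reinforce(baseline, "eating", "bar_pressing")   # True
press_reinforces_eat = can_reinforce(baseline, "bar_pressing", "eating")   # False
```

The asymmetry in the last two lines is the principle itself: the high-probability behavior can reinforce the low-probability one, but not the reverse.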

Winterbauer and Bouton (2011)

Resurgence effects. Rats were trained with food reinforcement in an experimental chamber that had two retractable levers. Only the first lever (L1) was in the chamber initially, and the rats were reinforced for pressing it; training sessions continued until responding exceeded 20 responses per minute. The rats then received extinction with L1 over three sessions, with responding reduced to 2.5 responses per minute. Extinction of L1 continued in the next session; 15 min into that session, the second response lever (L2) was inserted into the chamber. Responding on L2 was reinforced on a continuous reinforcement schedule until the rats obtained 20 reinforcers, after which extinction was introduced for L2. Results: Responding on L1 during the final extinction/test session is shown as a function of time before and after the reinforcement of L2 (indicated by L2+). Some responding on L1 occurred at the start of the session, reflecting spontaneous recovery from the previous day's extinction session. However, L1 responding declined close to zero by the time reinforcement for L2 was introduced. When responding on L2 was put on extinction, L1 responding reappeared, illustrating the phenomenon of resurgence.

John Pearce

Showed that many learning phenomena are consistent with the configural-cue framework. According to the configural-cue approach, overshadowing reflects different degrees of generalization decrement from training to testing. There is no generalization decrement for the control group when it is tested with the weak stimulus, because that is the same stimulus it received during conditioning. Considerable generalization decrement occurs, however, when the overshadowing group is tested with the weak stimulus: for that group, responding became conditioned to the aB compound, which is very different from a presented alone during testing. Responding conditioned to aB therefore suffers considerable generalization decrement, and this greater generalization decrement is responsible for the overshadowing effect.

Sidman

Sidman avoidance procedure, also known as free-operant avoidance. Sidman devised an avoidance conditioning procedure that did not involve a warning stimulus. The interval between shocks in the absence of a response is the S-S (shock-shock) interval; the interval between an avoidance response and the next scheduled shock is the R-S (response-shock) interval.

Kruse et al. (1983)

Some animals had CS1 paired with solid food during Pavlovian conditioning; other animals had CS2 paired with sucrose in water. The animals were then tested on food-rewarded or sucrose-rewarded instrumental responding. Facilitation was greatest when the Pavlovian and instrumental outcomes were the same. These results are inconsistent with two-process theory: if transfer were due only to an emotional state, facilitation would not depend on the specific outcome. There is an emotional state plus a specific reward type.

Forms of recovery from extinction

Spontaneous recovery Renewal Reinstatement Resurgence

Lashley and Wade (1946)

Stimulus generalization reflects the absence of learning rather than the transfer of learning; it occurs if organisms have not learned to distinguish differences among the stimuli. Considered the shape of a stimulus generalization gradient to be determined primarily by the organism's previous learning experiences rather than by the physical properties of the stimuli tested.

Guttman and Kalish (1956)

Stimulus generalization. First reinforced pigeons on a variable-interval schedule for pecking a response key illuminated by a yellow light with a wavelength of 580 nanometers. After training, the birds were tested with a variety of other colors presented in a random order without reinforcement, and the rate of responding in the presence of each color was recorded. Results: The highest rate of pecking occurred in response to the original 580-nm color, but the birds also made substantial numbers of pecks when lights of 570-nm and 590-nm wavelength were tested, indicating that responding generalized to those stimuli. However, as the color of the test stimuli became increasingly different from the original training stimulus, progressively fewer responses occurred. The results showed a gradient of responding as a function of how similar each test stimulus was to the original training stimulus: a stimulus generalization gradient.
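The orderly falloff of responding with distance from the training stimulus can be summarized as a peaked gradient. The sketch below is a hedged illustration: the Gaussian form and all parameter values are assumptions chosen for clarity, not a fit to Guttman and Kalish's data.

```python
# Hedged sketch of a stimulus generalization gradient; the Gaussian form
# and all parameter values are illustrative assumptions.
import math

def generalization(test_nm, trained_nm=580.0, width=15.0, peak_rate=100.0):
    """Predicted response rate to a test wavelength: maximal at the
    training wavelength and falling off with distance from it."""
    return peak_rate * math.exp(-((test_nm - trained_nm) ** 2) / (2 * width ** 2))

# Responding peaks at the 580-nm training stimulus and declines symmetrically
gradient = {nm: round(generalization(nm), 1) for nm in (540, 560, 570, 580, 590, 600, 620)}
```

Any function with this shape captures the two qualitative results above: substantial responding to 570 nm and 590 nm, and progressively less responding as the test color moves away from 580 nm.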

Spence's Theory of Discrimination Learning

The theory assumes that reinforcement of a response in the presence of S+ conditions excitatory response tendencies to S+, while nonreinforcement of responding during S- conditions inhibitory properties to S- that serve to suppress the instrumental behavior. Differential responding to S+ and S- therefore reflects both conditioned excitation to S+ and conditioned inhibition to S-.
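Spence's account can be illustrated numerically: net responding is the excitatory gradient centered on S+ minus the inhibitory gradient centered on S-. In the sketch below, every gradient shape and number is invented for illustration; it also shows the peak-shift prediction that follows from the theory when S- lies near S+.

```python
# Numerical sketch of Spence's theory; all gradient shapes and numbers
# are invented. Net responding = excitation around S+ minus inhibition
# around S-.
import math

def gaussian(x, center, width, height):
    return height * math.exp(-((x - center) ** 2) / (2 * width ** 2))

def net_response(x, s_plus=550.0, s_minus=570.0):
    excitation = gaussian(x, s_plus, width=25.0, height=100.0)   # from reinforcement at S+
    inhibition = gaussian(x, s_minus, width=25.0, height=60.0)   # from nonreinforcement at S-
    return excitation - inhibition

# The maximum of net responding lies below S+ = 550, shifted away from
# S- = 570: the peak-shift prediction of Spence's theory
peak = max(range(450, 650), key=net_response)
```

Because the inhibitory gradient pulls responding down more on the S- side of S+, the peak of the net gradient is displaced away from S-, which is how the theory explains peak shift after intradimensional discrimination training.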

Mowrer

Two-process theory, the first and most influential solution to the problem, later elaborated by Miller. Two-process theory assumes that two mechanisms are involved in avoidance learning. The first is classical conditioning of fear to the CS: a classical conditioning process activated by pairings of the warning stimulus (CS) with the aversive event (US) on trials when the organism fails to make the avoidance response, so that the CS comes to elicit fear. The theory treats conditioned fear as a source of motivation for avoidance learning. The second is instrumental reinforcement of the avoidance response through fear reduction: fear is an emotionally arousing, unpleasant state, and the termination of an unpleasant or aversive event produces negative reinforcement for instrumental behavior. Mowrer proposed that the instrumental avoidance response is learned because the response terminates the CS and thereby reduces the conditioned fear elicited by the CS.

Reynolds (1961)

Were the pigeons pecking because they saw the white triangle or because they saw the red background? Results: One of the pigeons pecked more frequently when the response key was illuminated with the red light than when it was illuminated with the white triangle This shows that its pecking behavior was much more strongly controlled by the red color than by the white triangle By contrast, the other pigeon pecked more frequently when the white triangle was projected on the response key than when the key was illuminated by the red light For the second bird, the pecking behavior was more strongly controlled by the triangle

Pelloux, Everitt, & Dickinson, 2007

Punishment in a drug-addiction model using rats. Rats were trained to earn a cocaine reinforcer by pressing a drug-seeking lever several times and then a drug-delivery lever once, with either 8 or 22 sessions of reinforcement training. They were then punished with shock for pressing the drug-seeking lever on half the trials. Rats with moderate (8-session) cocaine exposure, and most of the extensive (22-session) exposure group, reduced their lever pressing; however, a subgroup of rats in the extensive-exposure group was resistant to punishment. Other rats were trained to earn a sucrose reinforcer using the same procedures, and for them punishment was effective in reducing lever pressing in all of the rats.

Rescorla 1997

Spontaneous recovery. A well-controlled experiment in which original acquisition was conducted with either a drop of sucrose or a solid food pellet delivered into cup recesses in one wall of the experimental chamber. Infrared detectors identified each time the rat poked its head into the food cup; the chamber was normally dark. One of the unconditioned stimuli was signaled by a noise CS and the other by a light CS. As conditioning progressed, each CS quickly came to elicit nosing of the food cup (goal tracking), with the two CSs eliciting similar levels of responding. Two extinction sessions were then conducted with each CS, followed by a series of four test trials. The experimental manipulation of primary interest was the interval between the end of extinction training and the test trials: for one conditioned stimulus (S1), an eight-day period separated extinction and testing; for the other (S2), the test trials started immediately after extinction training. Results: During the course of extinction, responding declined in a similar fashion for S1 and S2. Responding remained suppressed during the test trials conducted immediately afterward with S2. However, responding substantially recovered for S1, which was tested eight days after extinction training: spontaneous recovery.

ch 8

stimulus control

