Psychology 001 - Chapter 7 - Learning

Homework: Instrumental (operant) conditioning

A dog learns that if he sits when he is told, he will get a treat.

Homework: Match each reinforcer to its type. 1. belly rubs for the dog 2. tension relief for sore muscles

A. positive reinforcer B. negative reinforcer

Evaluative conditioning

changes our attitudes even if we aren't aware that we're learning.

instructions

mean that you should tell the person what you want them to do and what they'll get for it.

production phase

we actually perform the model's actions.

Homework: What is the "dead man test"?

A term used to help define behavior: If a dead man can do it, it is not behavior.

Primary negative reinforcers

typically include aversive events such as heat and pain. In these cases, when you do something to successfully remove an aversive event, you will do that thing more often as in escape or avoidance.

innate

which means they are not a result of learning; we're born with them. Once started, these behaviors cannot be stopped.

neutral stimulus

does not naturally elicit a response.

reinforcement

the consequences of a response increase the probability of that behavior.

post-traumatic stress disorder (PTSD)

which involves intense fear. Exploding landmines and gunfire are conditional stimuli, and their effects are unconditional stimuli. When soldiers return home, other loud noises like fireworks explosions and thunder can sound enough like what they heard in war to elicit fear as a conditional response. Participants with PTSD produced more startle responses to the purple triangle and blue square combination than participants without. Jovanovic et al. concluded that PTSD may be a fear conditioning disorder. Specifically, people with this disorder have a difficult time not being afraid once the fear-provoking stimulus has been removed.

Homework: Joseph had difficulty telling the difference between "da" and "ba." His teacher helped him emphasize this difference by pointing when he saw "da" and waving when he saw "ba." In addition, Joseph received an M&M when he pointed to "da" and a Gummi bear when he waved after seeing "ba." What did Joseph learn to associate according to Skinner?

"ba," waving, and Gummi bears. The antecedent, behavior, and consequence are all learned.

Avoidance

(Active) avoidance is a situation in which the aversive stimulus is not currently present but will occur unless you produce a response to cancel (or omit) the scheduled aversive event. We must often experience the escape situation before we will make an avoidance response. Example: I go to my physician when I am sick so the doctor's treatment will remove the illness—escape conditioning. I also go to my physician twice a year for checkups to detect problems early before they become unpleasant—avoidance conditioning. The key word is prevention.

Homework: Sort these classical conditioning events in order from what happens first to what happens last.

1. Conditional stimulus 2. Conditional response 3. Unconditional stimulus 4. Unconditional response. The stimuli elicit responses, and the conditional stimulus is informative when it occurs before the unconditional stimulus. If the response occurs after the unconditional stimulus, then it's the unconditional response; the conditional response never occurs after the unconditional stimulus.

excitatory conditioning

A conditional response should develop in the pairing methods in which a conditional stimulus precedes an unconditional stimulus. A conditioned stimulus acts as a signal that a particular unconditioned stimulus will follow.

Homework: Pavlovian (classical) conditioning

A dog learns that the sound of a garage door opening means her person has come home.

Homework: Social learning

A puppy learns to go to the bathroom outside by watching an older, house-trained dog get praise after going outside.

systematic desensitization

A type of exposure therapy that associates a pleasant relaxed state with gradually increasing anxiety-triggering stimuli. Commonly used to treat phobias.

B. F. Skinner

Behaviorist who developed the theory of operant conditioning by training pigeons and rats.

Homework: Match the word to the term from this demonstration. 1. Conditional stimulus 2. Unconditional stimulus 3. Unconditional response 4. Conditional response 5. Pavlov

C. "Learning" D. Noise A. Responding to Noise E. Responding to the word "Learning" B. Irrelevant stimulant

Homework: Suppose that fear developed to stimuli on the left. To which stimulus category on the right might fear generalize in each case? Match the stimuli. 1. real snakes 2. real spiders 3. clowns 4. needles/shots

C. Rubber snakes D. Pictures of Spiders A. Balloon animals B. Doctors. Each set of stimuli either has similar characteristics (real vs. fake snake) or regularly occurs together (clowns and balloon animals).

Garcia and Koelling

Discovered taste aversion while studying the effects of radiation on rats. Rats became nauseated from the radiation, but because the taste of water from a plastic bottle was accidentally paired with this radiation, the rats developed an aversion to this water.

Escape

Escape is a situation in which the aversive stimulus is already present and a response removes or stops the otherwise ongoing aversive stimulus. Example: Most of us set an alarm, which is an aversive stimulus, to wake up in the morning. In order to stop and escape from that irritating sound, we have to press a button.

Homework: A cat is put in an odd puzzle box¸ but after trial and error is able to escape and then can escape quickly in subsequent trials?

Learning

Biological preparedness

Seligman (1971) proposed that some stimuli are more likely than others to become conditional stimuli, a concept he called biological preparedness. Also called cue-consequence learning and belongingness.

Homework: After developing a fear of the white rat, Little Albert also exhibited fear responses to other white objects that had not been paired with loud noise. This illustrates which phenomenon associated with classical conditioning?

Stimulus generalization

appetitive stimuli

a pleasant or satisfying stimulus that can be used to positively reinforce an instrumental response. Examples: freshly baked cookies and romantic partners.

reinforcer test or contingency analysis

a preference assessment; a strategy that classroom teachers can use to determine the items, activities, and events that a student finds reinforcing. If the frequency of behavior decreases or stays the same, then we are not dealing with a reinforcer.

Pavlovian Taste Aversion Learning

a type of trace conditioning. The taste of food is separated from sickness by several hours, and yet, we will feel nauseated the next time we smell or taste that food. We quickly learn to avoid eating food that smells or tastes (or looks) like the food that made us sick. This type of Pavlovian conditioning can develop with a single pairing.

Latent learning

another learning phenomenon, occurs when we learn something but don't show it until we have a reason to use our new knowledge.

Antecedents

are anything in the physical environment that we can detect and that tells us something about the consequences of our actions. Examples: other people, inanimate objects, and signs.

Primary (or unconditioned) reinforcers

are not learned; they naturally affect responses they follow. Primary positive reinforcers generally are stimuli/events needed to maintain life: food, water, air, warmth when cold, and sleep. Sex and drugs are primary reinforcers as well because they produce physiological responses (e.g., oxytocin and dopamine).

Positive reinforcers

are produced (added) by the response. Trophies, money, praise, and food are often positive reinforcers given when a response occurs.

negative reinforcers

are removed (or omitted) by the response. They're around when the response occurs and are taken away by it; that is, the reinforcing consequence is the removal or absence of something. Putting on sunglasses in the bright sun means that you removed the harsh light (negative reinforcer).

Consequences

are stimuli that can increase or decrease the probability of future behavior. More specifically, they are simply events that happen after and because of a response.

Schedules of reinforcement

are the rules that we use to determine when we get reinforcers for behavior.

target behavior

the specific response we select to measure and change, because we are only interested in the effects of the consequence on that response, not all the different responses that the person could emit.

punishment

decreases the probability of behavior.

elicits

to evoke or draw out (a response, answer, or fact) from someone in reaction to one's own actions or questions.

fixed ratio

In a fixed ratio (FR) schedule, we must produce the target response a specific number of times, and the last response produces a reinforcer. This sequence then "resets" the counter to the same response requirement. A fixed ratio schedule is characterized by a break-and-run pattern of responding.

B. F. Skinner

founded radical behaviorism—the philosophy of science that treats thinking and feeling like any other behavior. We just have to be able to measure behaviors and see their effects on the environment.

conditional stimulus

in classical conditioning, an originally irrelevant stimulus that, after association with an unconditioned stimulus, comes to trigger a conditioned response. Only after learning has occurred.

reflexes

involve the occurrence of an environmental event which triggers corresponding behavior.

motivational phase

our imitated behavior produces the same reward that the model earned. If we earn the same reward that we thought we would, we're more likely to repeat this behavior in the future.

aversive stimuli

physical or psychological discomfort stimuli that an organism seeks to escape or avoid. Examples: spoiled food and extremely hot surfaces.

Homework: Shariq is a young boy who has developed the habit of throwing rocks. His father started counting the number of times Shariq throws rocks (baseline). Shariq threw rocks 4 times on Monday, 4 times on Tuesday, and 5 times on Thursday. Starting Friday, throwing rocks resulted in a reprimand. Shariq threw rocks 10 times on Friday, 12 times on Saturday, and 12 times on Sunday. Although Shariq's father thinks that reprimands could be _______ , it would appear, based on Shariq's behavior, the reprimands are actually acting as _______ for rock throwing.

positive punishment; positive reinforcement

contingencies

possible outcomes; different plans based on varying circumstances. "if-then" relationships between responses and their consequences.

Ratio

schedules deliver reinforcers after a specific number of responses—either the same number of responses per reinforcer (fixed) or a different number of responses per reinforcer (variable).

Interval

schedules deliver reinforcers after at least two responses and a specified amount of time. The first response starts a timer, and the next response after the timer finishes produces a reinforcer.

Positive punishment:

some behavior produces a stimulus which leads to less of that kind of behavior in the future; positive because of the added consequence and punishment because of the effect on behavior.

trace conditioning

the unconditional stimulus occurs minutes or hours after the conditional stimulus has stopped.

simultaneous conditioning

the unconditional stimulus occurs with the start of the conditional stimulus.

short-delayed conditioning

the unconditional stimulus occurs within a few seconds of the start of the conditional stimulus.

The Premack principle

uses access to an activity reinforcer to increase the future probability of behavior. This arrangement creates a reinforcer because the only way we can perform a preferred activity (i.e., reinforcer) is to first perform a less enjoyable activity (i.e., response). It is based on how often two behaviors occur: if behavior A occurs more frequently than behavior B, we can require someone to perform behavior B before they can access behavior A.

variable ratio

In a variable ratio (VR) schedule, we have to respond a different number of times for each reinforcer. The number of responses changes (i.e., varies) around an average for each reinforcer. A variable ratio schedule is characterized by a high (fast) and constant pattern of responding. We don't know when we will get a reinforcer, so we keep responding to maximize the number of reinforcers we earn. It's this unpredictable delivery of reinforcers that makes responding constant; there is no pause.

operant conditioning

when behavior is affected by its consequences.

fixed interval

In a fixed interval (FI) schedule, a response starts a timer, the specific amount of time must elapse, and the next response will trigger the delivery of a reinforcer. How much time we have to wait is the same for each reinforcer in fixed interval schedules. A fixed interval schedule is characterized by a scallop pattern of responding. We respond little at the beginning of the interval and respond increasingly faster toward the end of the interval. Once we get a reinforcer, we take a break (the shallow part of the scallop). This seashell-shaped cumulative record reflects an increasing (but slower than ratio) response rate. Example: Your studying might be on an FI schedule. Classes with scheduled quizzes every Friday induce little studying Monday through Wednesday and an increasing rate of studying on Thursday (and Friday) in anticipation of the quiz.

variable interval

In a variable interval (VI) schedule, we have to wait different amounts of time for each reinforcer. We still have to respond once to start the timer and once again to produce a reinforcer after the time ends. Some intervals are longer, and some are shorter than the previous one. A variable interval schedule is characterized by a slow and constant pattern of responding. Variable interval schedules produce the third-highest rate of responding. Example: You never really know when you'll get a new email, so you keep checking fairly regularly.

Mowrer's two-factor theory

explains avoidance learning through two processes: (a) Pavlovian conditioning to generate fear of the tone and (b) instrumental/operant conditioning to establish escape from fear as the consequence for lever pressing.

Negative punishment

(also response cost, omission, and time-out): some behavior removes a stimulus which leads to less of that kind of behavior in the future; negative because of the removed consequence and punishment because of the effect on behavior.

Homework: As she entered the theatre, Marie saw the sign below. The sign is supposed to function as what? Multiple answers may be correct. (Picture of a red circle crossing out a cell phone)

-Notification to turn cell phones off
-Discriminative stimulus for reinforcement for turning off a cell phone

phobias

-can be learned in a single trial
-can persist even when we know that the feared object is harmless
-are of things that could harm our ancestors (like a nonhuman animal) that we probably won't encounter
-do not extinguish quickly or easily (see also McNally, 1987)

Examples of negative reinforcement include:

-taking a long route home to avoid a scary dog
-quietly leaving a party to avoid a person you dislike
-adding a phone number to your blocked calls list to avoid that caller
-turning off a light when you leave a room to avoid a high electric bill

extinction

a response previously produced a consequence, but that consequence is no longer provided; this is the essence of extinction. There are three behavioral effects of extinction:

-temporary increase in responding—an extinction burst
-emotional and aggressive responding
-responding eventually stops

Examples of positive punishment include:

-trying to eat your brother's leftovers and getting yelled at
-answering a call then speaking with a telemarketer
-checking your email and receiving a complaint
-touching a metal surface in the winter and receiving a static shock

Examples of positive reinforcement include:

-turning the keys in your ignition to start your car's engine
-answering a call to speak with a potential employer
-checking your email to receive a message from a friend
-flipping a switch to turn on a light when you walk into a room

Examples of negative punishment include:

-yelling at your sister then losing your allowance
-driving while intoxicated then losing your license
-talking in class and being sent into the hallway alone
-lying to your friend then your friend temporarily stops talking to you

Homework: Sort the schedules of reinforcement in order from lowest to highest rate of responding.

1. Fixed interval 2. Variable interval 3. Fixed ratio 4. Variable ratio

Homework: Sort these Pavlovian conditioning components in order for backward conditioning.

1. Unconditional stimulus 2. Unconditional response 3. Conditional stimulus

Bandura's (1977) theory specifies that observational learning entails four phases/processes/stages:

1. attentional 2. retention 3. production 4. motivation.

Homework: Tanya has been having difficulty completing her math homework assignments, and her mother decided to assist her. Immediately after dinner, Tanya finished the first problem in 2 minutes and received a potato chip from her mother; Tanya's mom knows she likes chips. Tanya was left to finish more problems on her own, but she still hadn't completed any after 15 minutes. Match each premise with a correct response based on this scenario. 1. What is the reinforcer? 2. Why might this not function as a reinforcer? 3. When would be the best time to use this reinforcer?

A. chips B. Tanya wasn't hungry E. before dinner

Homework: Match the clicker training components to their respective Pavlovian conditioning labels. 1. clicker 2. salivate to clicker 3. treat 4. salivate to treat

A. conditional stimulus B. conditional response D. unconditional stimulus C. unconditional response

Homework: Garcia and Koelling's rats drank sweetened, bright, and noisy water and received either shock or irradiation poisoning. Identify the parts of taste aversion learning for Group 2 (irradiation) rats. 1. sweet taste 2. nausea 3. irradiation (x-ray)

A. conditional stimulus C. conditional and unconditional response B. unconditional stimulus

Homework: Suppose you go to the eye doctor, and she has you sit in front of a machine that puffs air into your eye. You blink to the puff of air, but you also start to blink before the puff of air while sitting in front of the machine. Match each part of visiting the eye doctor to its Pavlovian conditioning label. 1. machine with pressurized air 2. blinking to the machine 3. puff of air 4. blinking to the puff of air

A. conditional stimulus D. conditional response B. unconditional stimulus C. unconditional response

antecedent stimuli

An environmental condition or stimulus change existing or occurring prior to a behavior of interest.

Homework: What is the response that indicates that flavor aversion learning is occurring for cancer patients?

Anticipatory vomiting and nausea. Cancer patients get sick and attribute that sickness to the food they ate (pre-treatment). If they get sick immediately after treatment but before they eat, that's post-treatment vomiting and nausea.

Homework: You're out shopping for toothpaste, and you notice the pictures on several different brands. Brand A has a picture of an extremely happy celebrity, Brand B has a picture of gum disease (as a warning for not brushing), Brand C has a picture of starving children (because they're collecting donations), and Brand D has no picture. Based purely on the pictures, which brand of toothpaste are you most likely to purchase according to evaluative conditioning?

Brand A because the name of the toothpaste is paired with something pleasant; the others are all paired with something unpleasant (Brands B & C) or nothing (Brand D).

Homework: As she walked through her neighborhood, Jodie, a 6-year-old who has a cocker spaniel, saw a large brown dog. She walked up to the dog to pet it, and as her hand approached the dog's head, it bit her on the hand. Jodie immediately began to cry and exhibit other fear responses (e.g., rapid heartbeat, "butterflies" in her stomach, etc.). Now when she sees any large dog, she gets nervous and frightened. Match the correct definition to the specifics of the event. 1. Neutral stimulus 2. Conditional stimulus 3. Unconditional stimulus 4. Conditional response

C. Cocker Spaniel D. Large Dog B. Bite on Hand A. Anxiety and Fear

flavor conditioning/ Conditioned taste aversion

Conditioned taste aversion occurs when an animal associates the taste of a certain food with symptoms caused by a toxic, spoiled, or poisonous substance. ... It is an example of classical or "Pavlovian" conditioning.

discriminative stimuli for punishment

Cues that signal a noxious stimulus.

discriminative stimuli for reinforcement

Cues that signal a reward.

discriminative stimuli for extinction

Cues that signal no upcoming outcome.

Homework: Sherry likes to watch television. She has noticed that when a show is interrupted by a commercial, most of the time the sound level of the commercial is much higher than the sound level during regular programming. Now when the commercial starts, she immediately grabs her TV remote and lowers the sound. Her response of lowering the sound level during commercials is being controlled by which operant process?

Escape

Homework: As a young child, whenever Janie had to go to bed at 9:00 pm she had a tantrum, and her parents tried to quiet her down. When she turned 8, though, her parents stopped attending to her when she began to scream and cry when they put her to bed at 9:00 pm. Six months later, Janie no longer exhibits tantrums. In the example, the elimination of Janie's tantrums was produced by which operant process?

Extinction

Homework: In the movie A Clockwork Orange, Alex receives aversion therapy—pairing a nausea-inducing drug with violent films and Beethoven's music—so that he might not have violent thoughts and commit violent crimes once released from prison. Alex does become ill when he thinks about or is put into situations in which he could be violent, but he eventually stops feeling ill once he is forced to listen to Beethoven for many hours. What Pavlovian phenomenon has occurred to produce Alex's reduction in responding?

Extinction

transferred association

In order to copy the behavior of another, the observer must see the model's behavior and see the model earn a reward for that behavior. Basically, if we see someone do something and they're successful, then we're more likely to imitate that successful behavior.

Homework: Which of the following is true for flavor aversion learning?

It mostly occurs with new foods. It doesn't occur with familiar foods because we've safely eaten them many times before without getting sick. With new flavors, all we know about them is that they made us sick.

Who discovered Pavlovian (classical) conditioning?

Ivan Pavlov, who was a Russian physiologist. His work provided a basis for later behaviorists like John Watson and B.F. Skinner.

Homework: Kasey's boss has been unnecessarily harsh to her ever since she started working at the company. He finally gave her a chance to prove that she isn't incompetent, but she didn't embrace the opportunity like he thought she should. What is a likely explanation for Kasey's behavior?

Learned helplessness. Kasey presumably did try to impress her boss when she first started working at the company, with no success, and she feels that any attempt to impress him now would also be unsuccessful.

Homework: While showering you jump out of the way of the water when you hear a toilet flush?

Learning

Homework: You make sure you do your chores before your mom gets home so you can go to the movies?

Learning

Homework: Your cat comes running whenever you shake a bag of treats?

Learning

Homework: Your dog thinks you are going to take him out when you get up in the morning to use the bathroom?

Learning

Homework: Some parents tell their children that misbehavior will make Santa take away their presents. What operant contingency is this?

Negative punishment

Homework: An infant cries when s/he is hungry?

Not learning

Homework: Children are afraid of loud sounds?

Not learning

Homework: Darnell likes to watch YouTube videos. YouTube used to have no commercials, but now Darnell has to watch at least five commercials every time he wants to watch a 3-minute video. He doesn't like commercials, and he rarely ever watches YouTube videos anymore. Which operant process is responsible for this change in Darnell's behavior?

Positive punishment

Homework: Which of the following best defines a variable interval schedule of reinforcement?

Reinforcing the target behavior after an average amount of time has passed.

Homework: Neil saw his dad shake hands with a friend, and now Neil shakes hands with everyone he sees. How did Neil learn to shake hands with others?

Social learning. Shaking hands is voluntary behavior, and it wasn't specifically shaped or reinforced. Neil saw someone else do it first and imitated that response. (Only later is the response reinforced.)

Homework: Negative reinforcement is negative in the sense that...

The behavior results in the removal of a stimulus as its consequence.

applied behavior analysis

The science in which tactics derived from the principles of behavior are applied to improve socially significant behavior and experimentation is used to identify the variables responsible for the improvement in behavior.

Homework: Psychology has identified three major types of learning. Which of the following is true in reference to the three types of learning?

The three types of learning are classical conditioning, operant conditioning, and social learning.

homework: George wears his lucky socks every time his favorite team plays. They only win half the time he wears the socks. What is the relationship between George's lucky socks and his team winning?

There is no relationship. While wearing the lucky socks and winning could be considered simultaneous conditioning, that arrangement ignores the other "trials" in which lucky socks (~conditional stimulus) are present but winning (unconditional stimulus) is not.

E. L. Thorndike (1911)

Thorndike is credited in paving the way for behaviorism to flourish. Thorndike is best known for his work with cats in puzzle boxes. Specifically, he would put cats into a box that required a certain set of behaviors to escape. He wanted to know how long it would take cats to solve the puzzle. Because cats learned how to manipulate an instrument such as a pedal, Thorndike called this type of learning instrumental. This is why operant conditioning is also called instrumental conditioning.

law of effect

Thorndike's principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely.

Homework: What is the purpose of this warning sign? (Picture of a sign warning that people who cut in line will be kicked out of the park)

To engage sign readers in avoidance (i.e., avoid cutting in line). The sign serves as a warning that people who cut will be kicked out; the hope is that the sign will deter them from cutting in line. Actually kicking them out is negative punishment (loss of fun in an amusement park). Attendees can passively avoid punishment by staying in their proper place in line.

John B. Watson

Watson is most famous for developing a fear response to many objects in a 9-month-old baby—referred to as Little Albert to protect his identity. After several pairings of the noise with the sight of the white rat, Little Albert would start crying at just the sight of the white rat; the rat had become a conditional stimulus for the loud noise as an unconditional stimulus. Little Albert demonstrated stimulus generalization—crying and crawling away from objects similar to the white rat (i.e., white rabbit, fur coat, Watson in a Santa mask).

learned helplessness

When we experience an aversive stimulus, usually involving pain, the aversive event/unconditional stimulus activates the sympathetic division of the autonomic nervous system. The resulting unconditional responses (e.g., increased respiration and heart rate, release of hormones from the adrenal glands, etc.) provide us with energy and motivation to escape or avoid the situation. When our avoidance responses don't work, we stop trying to get away from the aversive stimulus; we've acquired learned helplessness. For example, a student failing a math class may try several different strategies (e.g., longer and more frequent study sessions; using different study techniques, etc.) to escape/avoid failing class. If these attempts do not work, the student may simply stop trying to succeed in math—learned helplessness to a specific situation—while continuing to study for other classes.

spontaneous recovery

a conditional stimulus presented alone after extinction and a rest period will again elicit a conditional response. Spontaneous recovery is partly why it's so difficult to get rid of fear.

Pavlovian (or classical) conditioning

a form of associative learning whereby a neutral stimulus is paired with a salient stimulus so that eventually the neutral stimulus predicts the salient stimulus. Occurs when we associate two events. Pavlovian conditioning involves stimuli and responses: One stimulus signals another, and both stimuli cause responses. Pavlovian conditioning (or learning) has occurred when an involuntary response is elicited by a formerly neutral stimulus.

Operant (or instrumental) conditioning

a form of associative learning whereby behavior is modified depending on its consequences. It is how we learn what happens when we do something.

higher-order conditioning

a neutral stimulus is systematically and repeatedly paired with a conditional stimulus that reliably elicits the conditional response. A person who is afraid of clowns after a bad experience might then become afraid of red balloons if they are repeatedly presented with clowns.

Reinforcers

are events or stimuli that follow behavior and increase the future likelihood of that kind of response. (Reinforcers are stimuli and reinforcement is a process [i.e., how behavior changes] or a procedure [i.e., if this response, then this consequence].)

partial reinforcement extinction effect

behavior exposed to a continuous reinforcement schedule will stop faster without reinforcement than behavior exposed to an intermittent reinforcement schedule.

Secondary (or conditioned) reinforcers

both positive and negative, acquire their capacity to affect responses they follow because they signal or have been associated with primary or already-conditioned secondary reinforcers. Secondary reinforcers are not universal. Examples:
-consumables - food not eaten for nutrients (e.g., junk food)
-tangibles - objects you can touch (e.g., toys)

stimulus

can be anything in the environment that 1) we can detect, 2) is measurable, and 3) can evoke a response or behavior.

Operant (Instrumental) conditioning

describes situations in which we can choose among different options based on our previous experiences. That is, we learn that our behavior has consequences.

There are four main schedules of (intermittent) reinforcement:

fixed ratio, variable ratio, fixed interval, and variable interval.
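
The schedule definitions above differ only in the rule that decides when a response earns a reinforcer: ratio schedules count responses, interval schedules time them, and either requirement can stay fixed or vary around an average. As a purely illustrative aid (not part of the course materials), the short Python sketch below restates those delivery rules; the class names and parameter choices are hypothetical.

import random
import time

class FixedRatio:
    # FR n: every n-th target response produces a reinforcer.
    def __init__(self, n):
        self.n = n
        self.responses = 0

    def respond(self):
        self.responses += 1
        if self.responses >= self.n:
            self.responses = 0            # the counter "resets" to the same requirement
            return True                   # reinforcer delivered
        return False

class VariableRatio:
    # VR n: the response requirement varies unpredictably around an average of n.
    def __init__(self, n):
        self.n = n
        self.requirement = random.randint(1, 2 * n - 1)
        self.responses = 0

    def respond(self):
        self.responses += 1
        if self.responses >= self.requirement:
            self.responses = 0
            self.requirement = random.randint(1, 2 * self.n - 1)  # new requirement each time
            return True
        return False

class FixedInterval:
    # FI t: a response starts a timer; the first response after t seconds is reinforced.
    def __init__(self, t):
        self.t = t
        self.started = None

    def respond(self):
        now = time.monotonic()
        if self.started is None:
            self.started = now            # the first response starts the timer
            return False
        if now - self.started >= self.t:
            self.started = None           # the next cycle needs a new starting response
            return True
        return False

class VariableInterval:
    # VI t: like FI, but the wait varies around an average of t seconds.
    def __init__(self, t):
        self.t = t
        self.wait = random.uniform(0.5 * t, 1.5 * t)
        self.started = None

    def respond(self):
        now = time.monotonic()
        if self.started is None:
            self.started = now
            return False
        if now - self.started >= self.wait:
            self.started = None
            self.wait = random.uniform(0.5 * self.t, 1.5 * self.t)
            return True
        return False

# Example: an FR 5 schedule reinforces the 5th, 10th, 15th, ... response.
fr5 = FixedRatio(5)
print([fr5.respond() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]

Note that the response patterns described in the individual entries (break-and-run for FR, the scallop for FI, steady responding for VR and VI) describe how organisms tend to behave under these rules; the sketch only encodes the delivery rules themselves.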

unconditional stimulus

in classical conditioning, a stimulus that unconditionally—naturally and automatically—triggers a response.

unconditional response

in classical conditioning, an unlearned, naturally occurring response to an unconditioned stimulus. An innate reflex.

conditional response

in classical conditioning, the learned response to a previously neutral (but now conditioned) stimulus.

Satisfaction or stamping in

means that we associate a situation with behavior when that behavior leads to something pleasant.

Tolman and Bandura

in particular reintroduced mental events to the study of behavior, and this tradition of hypothesizing unobservable causes for behavior continues today. Bandura did for research with people what Tolman did for research with nonhuman animals (e.g., rats and pigeons).

inhibitory conditioning

conditioning in which the conditional response is suppressed, or in which no conditioning occurs at all. However, the unconditional response still occurs to the unconditional stimulus.

Stimulus discrimination

involves responding differently to different events. It's the opposite of stimulus generalization. In stimulus discrimination, conditional responses only occur when the original conditional stimulus is introduced. Similar stimuli do not elicit a response.

Stimulus generalization

involves responding similarly to conceptually or physically similar stimuli. In other words, an event that has not been paired with the unconditional stimulus also elicits or causes the conditional response. This generalization can be natural or conditional. Stimulus generalization tends to happen with phobias; a person may be bitten by a brown recluse and develop fear of all spiders. A fear response conditioned to one arbitrary geometric figure would generalize to the others.

Shaping

involves selecting and reinforcing more complex responses that look like the response you want while extinguishing simpler forms of the target response.

cognitive map

is a detailed representation of the physical environment and all possible routes that we can use when deciding where to go.

Extinction

is a procedure in which a consequence previously followed behavior but now no longer does. As a process, responding is less likely to occur in the future without that consequence.

Behaviorism

is an approach to science that focuses on how we learn new behavior and how those behaviors change across different situations.

Behavior

is anything that an organism can do that is affected by the environment, can be repeated and counted, and affects the environment.

Edward C. Tolman

is credited with the establishment of cognitive psychology. His approach to studying behavior is called mediational neobehaviorism or operational behaviorism. Tolman reintroduced mental events in the form of cognitive maps that occurred between, or mediated, environmental stimuli and behavior (Moore, 2010). These unobservable, mental events were characteristics of the organism like expectations, mood, or attitudes that acted on behavior directly.

learning

is defined more broadly as a relatively permanent change in behavior not due to drugs (the first administration), maturation/development, injury, or disease.

Latent learning

is learning that we can't see until we're motivated to show it; that is, there is no change in our performance until we receive a reward.

Social (or vicarious) learning

is when we learn something by watching others.

extinction

as a process, the loss of associative strength that produces a weaker conditional response over time. As a procedure, Pavlovian extinction involves repeatedly presenting a conditional stimulus without an unconditional stimulus.

Power

means that the consequence should be big enough to support the behavior.

Immediacy

means that the consequence should be delivered soon after the response. If there is a delay of more than 30 seconds before the consequence, the event will be less effective than if it were presented immediately.

Contingency

means that there should be an if-then relationship between the response and the consequence; if the response occurs, then and only then does the consequence occur. No freebie consequences should be given.

Discomfort or stamping out

means that we do not associate a situation with behavior when that behavior leads to something unpleasant.

Positive reinforcement:

some behavior produces a stimulus which leads to more of that same kind of behavior in the future; positive because of the added consequence and reinforcement because of the effect on behavior.

Negative reinforcement:

some behavior removes a stimulus which leads to more of that kind of behavior in the future; negative because of the removed consequence and reinforcement because of the effect on behavior.

Litt and Schreibman (1981)

tested whether we learn about the antecedents and consequences of behavior, as Skinner proposed, when they taught children with autism how to label different items with differential versus non-differential consequences.

Extinction burst

the target response often increases in frequency/duration/intensity, before beginning a slow deceleration.

backward conditioning

the unconditional stimulus occurs a few seconds before the start of the conditional stimulus.

long-delayed conditioning

the unconditional stimulus occurs after the conditional stimulus has been there for a while.

observational or social learning

we learn from other people.

attentional phase

we must notice the model's behavior. Additionally, we are more likely to imitate the model when we like and respect that person.

retention phase

we think about performing the model's actions ourselves.

classical conditioning

when environmental events trigger our behavior.

phobias

which are intense, unrealistic fears directed toward people, objects, or situations. The fear we experience with phobias is more intense than it should be to any realistic threat.

imitation

your friend does what you just did. You are the model or demonstrator, and your friend is the observer who imitates you (the model). The observer doesn't have to imitate the behavior immediately; it may be some time later when the observer has an opportunity to use this new knowledge in a similar situation. Bandura concluded that children who had observed adults imitated what they saw the adults do (see Figure 7.45) and emphasized that observational learning does not have to actually change the observer's behavior. Instead, it can influence what we might do—a cognitive orientation.

