PSYCH 320 exam #1


1.) Describe what reflexes are and how researchers have conducted experiments on them.

A reflex is an involuntary, automatic response hardwired into an organism; it is present in all normal members of a given species and does not have to be learned. Like Pavlov's dogs, humans salivate when eating food. Newborns suck when they encounter a nipple (sucking reflex), hold their breath when submerged underwater (diving reflex), and grasp a finger so tightly that they can support their own weight by hanging on to it (palmar grasp reflex). Charles Sherrington (an English physiologist) conducted many studies on dogs whose spinal cords no longer received brain signals. These dogs still showed basic reflexes (such as jerking a leg away from a painful stimulus), and because the brain could not contribute to these reflexes, the spinal cord must have been generating them on its own. We now know that sensory inputs can activate motor fibers traveling out of the spinal cord without waiting for signals from the brain. Sherrington concluded that such simple spinal reflexes could be combined into complex sequences of movements and that these reflexes were the building blocks of all behavior.

1.) Describe how the insular cortex and dACC contribute to punishment.

Insular cortex - a region of cortex lying in the fold between the parietal and temporal lobes that is involved in conscious awareness of bodily and emotional states and may play a role in signaling the aversive value of stimuli. The dorsal anterior cingulate cortex (dACC) is thought to play a complementary role, helping to signal the motivational, "unpleasant" aspect of pain and other punishers.

1.) Explain how sensitization to stress can lead to anxiety and depression.

After an initial stressful event has triggered a disorder such as depression, increasingly minor stressful events can trigger additional bouts of depression later on. This tendency occurs because some individuals become sensitized to stress and its associated physiological states; recent studies show that depressed people respond more strongly to minor stressors than healthy individuals do. Depression also promotes repetitive thoughts about symptoms and negative events, which keeps people reacting to low-level stressors. Low rates of exposure to stressful events may actually lead to greater sensitivity to future stressors, whereas moderate stress over time may increase resilience in the face of a later high-stress event. Anxiety works similarly: just as depression leads individuals to dwell on the factors they perceive as contributing to their depressive state, individuals with anxiety disorders tend to repeatedly focus on certain thoughts, which leads them to repeat certain actions many times. Such repetition may cause further sensitization and worsen the problem.

Explain the role of opioids in reinforcement.

Endogenous opioids - any of a group of naturally occurring neurotransmitter-like substances that have many of the same effects as opiate drugs such as heroin and morphine; they may help signal the hedonic value ("liking") of reinforcers in the brain.

1.) Describe the neural circuits involved in habituation and sensitization in Aplysia.

Aplysia are a group of marine invertebrates (sea slugs or sea hares) with a relatively simple nervous system; neuroscientists have documented each of the neurons involved in Aplysia's gill-withdrawal reflex. When the siphon is touched, sensory neuron S fires, releasing the neurotransmitter glutamate into the synapse. Molecules of glutamate diffuse across the synapse and activate receptors on motor neuron M; if enough receptors are activated, neuron M fires and causes the muscles to retract the gill for a few seconds. Habituation - the amount of habituation is proportional to the intensity of the stimulus and the repetition rate. If a sufficiently light touch is delivered every minute, the withdrawal response habituates after 10 or 12 touches, and the habituation can last 10 to 15 minutes. You can see what causes this habituation: touching the siphon excites sensory neuron S, which releases glutamate, which in turn excites motor neuron M, which drives the withdrawal response. With repeated stimulation, however, neuron S releases less glutamate, decreasing the chance that neuron M will be excited enough to fire. The reduction in glutamate release is evident even after a single touch, lasts for up to 10 minutes, and is associated with a decrease in the number of glutamate-containing vesicles positioned at release sites. Thus, in Aplysia, habituation can be explained as a form of synaptic depression, a reduction in synaptic transmission. This is exactly the sort of weakening of connections proposed by the dual process theory of habituation (see Figure 3.4b) and thought to contribute to long-term depression of neural connections. An important feature of habituation in sea hares is that it is homosynaptic, meaning it involves only those synapses that were activated during the habituating event: changes in neuron S do not affect other sensory neurons. In other words, a light touch to the tail or upper mantle still elicits the defensive gill withdrawal, even though a touch to the siphon is ignored, and the responsiveness of motor neuron M itself is not changed. In the short term, habituation affects only how much neurotransmitter neuron S releases; with long-term habituation, the number of synaptic connections between the sensory and motor neurons actually decreases. Synaptic transmission in Aplysia can thus be depressed not only by decreases in neurotransmitter release but also by the elimination of synapses, which suggests that repeated experiences can lead not only to the weakening of connections, as suggested by the dual process theory of habituation, but also to their elimination. Sensitization - suppose that instead of a light touch to the siphon, the researcher applies a more unpleasant stimulus: a mild electric shock to the tail that causes a large, sustained gill-withdrawal response. The aversive tail shock sensitizes subsequent responding, so that a weak touch to the siphon now produces a strengthened gill withdrawal. The tail shock activates sensory neuron T, which activates motor neuron M, causing the motor response. T also activates an interneuron (IN), which delivers a neuromodulator (such as serotonin) to the axons of sensory neurons S and U. Subsequent activation of neuron S then causes a larger release of glutamate, leading to greater activation of neuron M than S previously evoked.

Describe the cellular-level influences on conditioning in Aplysia (focus on lecture).

Aplysia's siphon-withdrawal reflex can be classically conditioned: pairing a light touch to the siphon (CS) with a strong stimulus such as a tail shock (US) can make the light touch come to produce a strong withdrawal, even though before training it produced little or no withdrawal. At the cellular level, the US pathway releases serotonin onto the CS pathway's sensory neuron, facilitating (strengthening) its subsequent transmitter release.

1.) Distinguish between appetitive and aversive conditioning.

Appetitive conditioning - conditioning in which the US is a positive event (such as food delivery or sex); learning to predict something that satisfies a desire or appetite. Aversive conditioning - conditioning in which the US is a negative event (such as a shock or an air puff to the eye); learning to avoid or minimize the consequences of an expected aversive event (for example, learning that dark clouds predict rain and closing your windows to avoid water damage to your home).

1.) Identify the major parts of neurons and their functions.

Dendrites - receivers of information. Cell body (soma) - contains the nucleus. Axon - the conducting fiber that sends information. Axon terminals - release neurotransmitters. Myelin sheath - insulating fatty layer that speeds transmission. Schwann cells - make the myelin. Nodes of Ranvier - gaps in the myelin along the axon that facilitate the rapid conduction of nerve impulses. The three main parts of the neuron are the dendrites (receive information), the axon (sends information), and the cell body/soma.

1.) What is the difference between "wanting" and "liking" in the brain? Why are both necessary for operant conditioning?

Because VTA stimulation was such a powerful reinforcer, some researchers inferred that the rats "liked" the stimulation, and the VTA and other areas of the brain where electrical stimulation was effective became informally known as "pleasure centers." However, the idea of "pleasure centers" is something of an oversimplification. For one thing, rats lever-pressing for electrical brain stimulation don't tend to act as if they're enjoying it; they tend to become agitated and may bite the lever instead of simply pressing it, or even scratch the walls or show other behaviors such as eating, fighting, or shredding of nesting material. This is more like the behavior of an excited animal than one that is enjoying food. Skinner, of course, would caution that we can't infer what an animal might be feeling just by watching its behaviors. Nevertheless, some researchers have suggested that electrical brain stimulation causes not pleasure but rather excitement or anticipation of reinforcement—much like the anticipation we experience when expecting a good meal or a big present. Currently, many researchers believe that we have separate brain systems for signaling hedonic value—meaning the subjective "goodness" of a reinforcer, or how much we "like" it—that are distinct from those signaling motivational value—meaning how much we "want" a reinforcer and how hard we are willing to work to obtain it. No matter how much we may "like" chocolate cake, most of us will not be very motivated to obtain more if we have just eaten three slices; similarly, Olds's rats doubtless still "liked" food and rest, but they were more motivated to obtain electrical brain stimulation, even when starving and exhausted. In these examples, provision of a "liked" reinforcer isn't enough to evoke responding. Only when "wanting" and "liking" signals are both present will the arrival of the reinforcer evoke responding and strengthen the S→R association.

6) The inner workings of the mind took a backseat when Behaviorism gained popularity. Why? What lessons have cognitive and other areas of psychology taken from Behaviorism?

Behaviorism argued that psychology should restrict itself to the study of observable behaviors and avoid reference to unobservable and often ill-defined internal mental events. Behaviorists wanted to distance themselves from philosophers and psychologists who explored the inner workings of the mind through personal introspection and anecdotal observation; they wanted psychology to be taken seriously as a rigorous branch of natural science, like biology or chemistry. The lessons cognitive and other areas of psychology have taken from behaviorism include its insistence on rigorous experimental methods and on objective, measurable behavior as evidence, even when the processes being studied are mental.

1.) Describe the behavioral responses used to study habituation and sensitization.

The behaviorist approach defines habituation as a decrease in behavior; focusing on behavior allows objective, standardized measurements (for example, the force of a startle response). Example: a student reading this textbook may notice the repetition in the text and gradually stop reacting to it, which is habituation; if the repetition instead becomes increasingly annoying, that growing reaction is sensitization.

1.) Describe the blocking effect and its theoretical significance.

Blocking effect - demonstrates that classical conditioning occurs only when a cue is both a useful and a nonredundant predictor of the future; a two-phase training paradigm in which prior training to one cue (CS1 → US) blocks later learning of a second cue when the two are paired together in the second phase of training (CS1 + CS2 → US). Phase 1 - rats in a chamber are exposed to a light (CS1), which is then followed by a shock (US). Phase 2 - a tone (CS2) is presented together with the light (CS1), followed by the shock (US). Phase 3 (test) - when JUST the tone (CS2) is presented, the rats do NOT give a conditioned response (CR); the earlier learning about the light "blocks" learning about the redundant tone. Its theoretical significance is that merely pairing a CS with a US is not enough for conditioning; learning depends on how surprising (unpredicted) the US is, an idea formalized in the Rescorla-Wagner model.

1.) Describe the difference between the central and peripheral nervous systems.

Central nervous system (CNS) - the brain and spinal cord; where most cognition takes place; integrates information from the PNS. Peripheral nervous system (PNS) - nerve fibers that connect sensory receptors (e.g., visual receptors in the eye or touch receptors in the skin) to the CNS, plus other fibers that carry signals from the CNS back out to the muscles and organs.

1.) Describe the functions of the cerebellum, brainstem, thalamus, basal ganglia, hippocampus, and amygdala.

Cerebellum - responsible for the regulation and coordination of complex voluntary muscular movement, including classical conditioning of motor-reflex responses; not part of the cortex; learned physical actions. Brainstem - group of structures that connect the rest of the brain to the spinal cord and play key roles in regulating automatic functions such as breathing, body temperature, heart rate, and blood pressure; also not part of the cortex. Thalamus - part of the diencephalon; connects separate parts of the brain (a pre-integration station); receives various sensory signals (sight, sound, touch, etc.) and connects to many cortical and subcortical regions; a gateway through which almost all sensory signals can affect brain activity. Basal ganglia - responsible for planning and producing skilled movements like throwing a football or juggling; fine-tuning of muscle movements, exerted mainly by preventing unwanted movement. Hippocampus - part of the limbic system, inside the temporal lobe; important for learning new things and remembering past events; encodes new memories (it is reactivated when retrieving old memories and is especially important for memory binding); memories start here and are then distributed to other brain areas. Amygdala - handles emotion and emotional memories (very connected to the hippocampus); also part of the limbic system; connects emotion to memory (particularly fear conditioning and emotional learning); a feedback loop between the amygdala and the hippocampus influences binding.

1.) Describe the activity of neurotransmitters at the synapses. That is, how do neurons talk to each other at the synapse?

Communicating neurons are separated by a narrow gap (the synapse) across which neurons transmit chemicals. Most synapses are formed between the axon of the presynaptic (sending) neuron and a dendrite of the postsynaptic (receiving) neuron. Neurons communicate using neurotransmitters - chemical substances that can cross a synapse to affect the activity of a postsynaptic neuron (they carry chemical messages to other neurons). Once neurotransmitters have been released into the synapse, the next step is for the postsynaptic neuron to collect them. Receptors are molecules embedded in the surface of the postsynaptic neuron that are specialized to bind with and respond to particular kinds of neurotransmitters. The effect of a particular neurotransmitter depends on what its corresponding postsynaptic receptors do when activated. Some neurotransmitters (like glutamate) are excitatory: they activate receptors that tend to increase the likelihood of the postsynaptic neuron firing. Others (like GABA) are inhibitory: they activate receptors that tend to decrease the likelihood of the postsynaptic neuron firing.

1.) Describe the goal of comparative anatomy.

Comparative anatomy - the study of similarities and differences between organisms' brains; this provides a foundation for understanding how brain structure and function relate to learning and memory abilities. All vertebrate brains have two hemispheres and a recognizable cortex, cerebellum, and brainstem, but species differ in the relative volumes of these areas. In mammals (such as humans) and birds, the cortex is much larger than the cerebellum; in fish and amphibians (such as frogs), the cortex and cerebellum are closer in size.

1.) Describe compound conditioning and the factors that influence it. Provide an example.

Compound conditioning - the conditioning of two cues, usually presented at the same time (like two children shouting for their mother's attention: with two competing cues, it is hard for the mom to attend to both kids). A key factor that influences it is overshadowing, in which the more salient (louder or more noticeable) cue within a compound acquires more associative strength and is more strongly conditioned than the less salient cue. For example, the louder child gets more attention from the mother than the quieter one, even though they are talking at the same time.

1.) Describe how conditioned compensatory response develops (particularly its application to drug tolerance).

Conditioned compensatory responses were demonstrated by two of Pavlov's colleagues: they injected dogs on several occasions with adrenaline (epinephrine), a chemical normally produced by the adrenal glands in response to stress or anxiety. The usual effect is an increase in heart rate. However, the dogs' heart rate increased less and less with each injection. Such a decrease in reaction to a drug (so that larger doses are required to achieve the original effect) is called tolerance. The various cues present when the drug was given (the stand, the injection, etc.) triggered a conditioned compensatory response that lowered heart rate in anticipation of the adrenaline-induced increase. Such automatic compensatory responses occur primarily in body systems that have a mechanism for homeostasis, the tendency of the body (including the brain) to gravitate toward a state of equilibrium or balance.

1.) Identify the components of the learned association (stimulus, response, outcome).

Discriminative stimulus - a stimulus that signals whether a particular response will lead to a particular outcome; it helps the learner discriminate (distinguish) the conditions in which a response will be followed by a particular outcome, and it is the first part of the chain that triggers the response and leads to the outcome. Response - the organism learns to make a specific response R that produces a particular outcome; a response is defined not by a particular pattern of motor actions but by the outcome it produces (related term: chaining, an operant conditioning technique in which organisms are gradually trained to execute complicated sequences of discrete responses). Outcome - the consequence that follows the response; related terms include reinforcers, primary reinforcers, drive reduction theory, secondary reinforcers, and punishers.

1.) Describe how drugs change neuronal functioning.

Drugs are chemical substances that alter the biochemical functioning of the body. Drugs that work on the brain generally do so by altering synaptic transmission. Drugs can increase or decrease the ability of the presynaptic neuron to produce or release neurotransmitter. Drugs can increase or decrease the ability of postsynaptic receptors to receive the chemical message. Drugs can alter the mechanisms for clearing neurotransmitter molecules out of the synapse. The effects of drugs on behavior depend on which neurotransmitters are involved and whether their ability to carry messages across the synapse is enhanced or impaired.

1.) Describe the concept of the engram and the theory of equipotentiality, and the experimental results addressing each of them.

Engram - a physical change in the brain that forms the basis of a memory (also referred to as a memory trace). In looking for the location of the engram, Karl Lashley would train a group of rats to navigate a maze and then systematically remove a different small area of the cortex in each rat. He reasoned that once he found the lesion that erased the animals' memories of how to run the maze, he would have located the site of the engram. The results were not so straightforward: no matter what small part of the cortex he lesioned, the rats continued to perform the task. Bigger lesions caused increasingly large disruptions in performance, but no one cortical area seemed more important than another, and he could not find the engrams for memories formed during maze learning. He then settled on the theory of equipotentiality - the theory that memories are stored globally, by the brain as a whole, rather than in any one particular brain area. If memories are spread over many cortical areas, damage to one or two of these areas won't completely destroy the memory, and over time the surviving cortical areas may be able to compensate for what was lost. The truth lies somewhere in the middle: engrams are not confined to single, highly specific cortical locations, but the cortex is also not as undifferentiated as Lashley came to believe.

1.) Describe the processes of acquisition and extinction.

Extinction - the process of reducing a learned response to a stimulus by ceasing to pair that stimulus with a reward or punishment; when the CS occurs repeatedly without the unconditioned stimulus, the CR weakens. Acquisition - the initial stage in which one links a neutral stimulus and a US so that the neutral stimulus begins triggering the CR. (For example, imagine that you're teaching a pigeon to peck a key whenever you ring a bell. At first, you place some food on the key and sound a tone right before it pecks the key. After many pairings, it begins to peck the key when it hears the tone, which means it has acquired the behavior. If you then stop reinforcing the behavior, the bird will quickly stop pecking, leading to extinction.)

1.) What does it mean when the brain is described as "plastic"?

"Plastic" means flexible and changeable; plasticity accounts for learning and for the brain's ability to compensate after trauma.

1.) Explain the difference between free-operant and discrete trial procedures.

Free-operant procedure - the animal can operate the experimental apparatus "freely," responding to obtain reinforcement (or avoid punishment) whenever it chooses. Discrete-trials procedure - the experimenter controls the trials, defining when each trial begins and ends (for example, placing a rat at the start of a maze for each trial).

1.) Identify the four major lobes of the cerebral cortex and describe the major functions of each.

Frontal lobe - enables a person to plan and perform actions; higher emotions, decision making, metacognition, memory. Parietal lobe - processes somatosensory (touch) information and attention; related to working and prospective memory. Temporal lobe - language and auditory processing, learning new facts, and forming new memories of events; the medial temporal lobe is key for memory (encoding memory, not storing it) and for interpreting and labeling visual images. Occipital lobe - visual processing, visual imagery, basic visual memory, and visual working memory.

Explain why generalization occurs by describing the generalization gradient and the concept of a consequential region.

Generalization gradient - a graph showing how physical changes in stimuli (plotted on the horizontal axis) correspond to changes in behavioral responses (plotted on the vertical axis). Consequential region - a set of stimuli in the world that share the same consequence as a stimulus whose consequence is already known. Generalization occurs because a new stimulus may belong to the same consequential region as the trained stimulus, so responding to it in the same way is a reasonable bet.

Define and describe the processes of generalization and discrimination.

Generalization is the transfer of past learning to new situations and problems; an organism responds to new stimuli that are similar to the original conditioned stimulus. Discrimination occurs when an organism learns a response to a specific stimulus, but does not respond the same way to new stimuli that are similar.

Define and describe the processes of habituation and sensitization.

Habituation - a decrease in the strength or occurrence of a behavior after repeated exposure to the stimulus that produces that behavior; it is stimulus-specific; a lack of response to originally noticeable stimuli; a widespread, basic form of learning; automatic and reflexive. Sensitization - a phenomenon in which a salient stimulus (like an electric shock) temporarily increases the strength of responses to other stimuli; experience with an arousing stimulus leads to stronger responses to a later stimulus. It is in a sense the opposite of habituation: whereas in habituation repeated experiences can reduce a rat's acoustic startle reflex (a defensive response, like jumping or freezing, to a startling stimulus such as a loud noise), in sensitization an arousing experience can heighten it. Sensitization is not stimulus-specific: even a new stimulus that would otherwise evoke a weak response now evokes a stronger one. Humans can show sensitization of their startle reflexes. Skin conductance response (SCR) - a change in the skin's electrical conductivity, used as a measure of emotional responses.

1.) Describe Ebbinghaus's early experiments on forgetting. Notably, what did he find?

He measured forgetting by examining how long it took him to relearn a previously learned list. If it initially took him 10 minutes to learn the list and later took only 6 minutes to relearn the same list, Ebbinghaus recorded a "time savings" of 4 minutes, or 40% of the original learning time. By testing himself at various intervals after learning, Ebbinghaus was able to plot a retention curve, which measures how much information is retained at each point in time following learning. The retention curve illustrates that most forgetting occurs early on: if a memory survives the first few hours after learning, there is little additional forgetting. He also found that shorter lists are easier to remember than longer lists and that increasing the amount of initial practice improves later recall. Despite the imperfections in his experimental designs (e.g., he was the only participant and he knew which variables were being manipulated), he led the way in the use of scientific experimentation to study learning and memory. Few studies of human memory conducted today don't owe their methodology to the early and influential work of Hermann Ebbinghaus.
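To make the savings measure concrete, here is a minimal sketch in Python (the 10-minute and 6-minute figures are just the example values from the passage above, not Ebbinghaus's actual data):

```python
def percent_savings(original_minutes: float, relearning_minutes: float) -> float:
    """Ebbinghaus's savings measure: the proportion of the original learning
    time that is saved when relearning, expressed as a percentage."""
    return 100.0 * (original_minutes - relearning_minutes) / original_minutes

# Example from the passage: 10 minutes to learn, 6 minutes to relearn.
print(percent_savings(10, 6))  # -> 40.0, i.e., a 40% time savings
```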

1.) Explain how Thorndike "discovered" operant conditioning.

He was studying how animals learn the connection between a stimulus, a behavioral response, and the outcome that follows, when that response leads to a desirable outcome. He studied how cats learn to escape from puzzle boxes/cages secured with complex locking/unlocking devices: across repeated trials the cats escaped faster and faster, which Thorndike summarized in his law of effect (responses followed by satisfying outcomes become more likely to be repeated).

1.) Identify factors that influence the rate and duration of habituation.

How startling the stimulus is, the number of times it is experienced, and the length of time between repeated exposures. The less arousing an event is the more rapidly a response to that event will habituate. More rapid repetition of a stimulus generally leads to more rapid habituation.

1.) Describe the role of timing in acquiring operant responses.

An immediate outcome produces fast learning; with long delays, other behaviors may have taken place during the interval between the response R and the outcome O, so those behaviors might become associated with the outcome instead. Temporal distance (delay) is also an important factor in self-control; self-control can be improved with a precommitment (like signing up ahead of time for yoga).

1.) Describe the role of inhibitory feedback in conditioning.

In a well-trained animal, the production of a CR, through activation of the interpositus nucleus, will in turn inhibit the inferior olive from sending US information to the Purkinje cells in the cerebellar cortex. This means that activity in the inferior olive reflects the actual US minus (due to the inhibition) the expected US, where the expected US is measured by the interpositus activity that drives the CR. In other words, the inferior olive carries a prediction-error signal: when the US is fully expected, its signal is canceled out and no new learning occurs.

1.) Describe unconscious bias. What type of exposure-based learning is this?

In general, you will be better able to distinguish between individuals belonging to racial groups you have encountered a lot in your life than between members of groups you don't interact with much. This type of exposure-based learning is perceptual learning.

1.) Describe the role of the hippocampus in spatial learning.

In humans and other primates, the hippocampus is a relatively small structure lying just beneath the temporal lobe. In rodents, the hippocampus makes up a much larger proportion of the brain; a bird's hippocampus is smaller than a rodent's, but it is important for spatial memory. Specifically, bird species that store their food in many different locations for use in the winter have a bigger hippocampus than related bird species that do not need to keep track of hidden food.

6) What was so revolutionary about the Cognitive Revolution?

Unlike behaviorism, it focused on things we cannot directly observe, such as ideas, concepts, meanings, and intentions. The cognitive revolution bridged the gap between the physical world and the world of ideas, unifying the two with the theory that mental life can be explained in terms of information, computation, and feedback.

1.) Explain William James's concept of associationism.

James believed that most abilities and habits were formed by our experiences, especially early in life. In his concept of associationism, the act of remembering an event, such as a dinner party, involves multiple connections between the components of the evening: memories for the taste of the food, the feel of a stiff dinner jacket, the smell of perfume from the lady seated nearby. Activation of the memory for the dinner party, with all its components, could activate the memory for a second event that shared some related elements (say, a later visit to a dance hall where the same perfume was encountered). The second event would be composed of its own parts, but the two events (dinner party and dancing) would be associated by a linkage between their common or related components (the sight of the lady and the smell of her perfume). James took this model literally and believed that the associations it described would eventually be mapped directly onto physical connections in the brain. Today most modern theorists draw on his idea of learning as a process of forming associations between the elements of an experience.

1.) Describe the processes of LTP and LTD and how they relate to learning.

LTP (long-term potentiation) - a process in which synaptic transmission becomes more effective as a result of recent activity; believed to represent a form of synaptic plasticity that could be a neural mechanism for learning. One possibility is that postsynaptic neurons change to become more responsive to subsequent inputs: when presynaptic neurons release neurotransmitter after strong stimulation, the postsynaptic neurons have a heightened sensitivity to that neurotransmitter, enhancing the response. LTD (long-term depression) - a process by which synaptic transmission becomes less effective as a result of recent activity; also believed to be a form of synaptic plasticity that could serve as a neural mechanism for learning; essentially the opposite of LTP. One situation in which it occurs is when presynaptic neurons are repeatedly active but the postsynaptic neurons do not respond. Neurons that fire together wire together, but connections between neurons that don't fire together weaken, a change believed to reflect a weakening of synapses.

1.) Describe how dopamine circuits in the brain are involved in reinforcement.

Loss of dopamine doesn't mean individuals can't experience liking or pleasure in response to something; what seems to change is their willingness to work for it (which is what connects dopamine to reinforcement). The incentive salience hypothesis of dopamine function states that the role of dopamine in operant conditioning is to signal how much the animal "wants" a particular outcome, that is, how motivated it is to work for it. According to this hypothesis, the incentive salience of food and other reinforcers (their ability to attract attention and motivate responding) is reduced in dopamine-depleted animals.

1.) Explain the Matching Law and how it applies to choice behavior.

Matching law of choice behavior - the principle that an organism, given a choice between multiple responses, will make a particular response at a rate proportional to how often that response is reinforced relative to the other choices.
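As a rough illustration, here is a minimal numerical sketch of the matching law (the reinforcement and response counts below are made-up values for illustration, not data from the text):

```python
# Matching law: relative response rate matches relative reinforcement rate,
# i.e. B_A / (B_A + B_B) is approximately R_A / (R_A + R_B).

# Hypothetical example: key A pays off 30 times per hour, key B 10 times per hour.
R_A, R_B = 30, 10
share_A = R_A / (R_A + R_B)              # 0.75 of responses expected on key A

total_responses = 200                    # suppose the pigeon makes 200 pecks in the session
print(share_A * total_responses)         # -> 150.0 pecks predicted on key A
print((1 - share_A) * total_responses)   # -> 50.0 pecks predicted on key B
```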

1.) Describe the structures involved in processing sensory information and driving motor responses.

Most sensory inputs enter the brain through the thalamus. The thalamus then distributes these inputs to cortical regions specialized for processing particular sensory stimuli: the primary auditory cortex (A1) for sound, the primary somatosensory cortex (S1) for sensations from the skin and internal organs, and the primary visual cortex (V1) for sight. A1 is located in the temporal lobe, S1 in the parietal lobe, and V1 in the occipital lobe. Chief among the brain regions specialized for controlling movement is the primary motor cortex (M1), located in the frontal lobe, which generates coordinated movements; it sends output to the brainstem, which in turn sends instructions down the spinal cord to activate the motor fibers that control the muscles.

1.) What is the difference between pathological and behavioral addictions?

Pathological addiction - a strong habit that is maintained despite harmful consequences Behavioral addiction - addiction to a behavior that produces reinforcement, as well as cravings and withdrawal symptoms when the behavior is prevented

1.) Define and differentiate the processes of positive reinforcement, positive punishment, negative punishment, and negative reinforcement. Provide an example of each.

Positive reinforcement - a type of OC in which the response causes a reinforcer to be "added" to the environment; over time the response becomes more frequent (IE: giving a child a treat when they are good at school). Positive punishment - a type of OC in which the response causes a punisher to be "added" to the environment; over time the response becomes less frequent (IE: a child is chewing gum in class, which is against the rules, so the teacher disciplines them in front of the class and the child stops chewing gum in class). Negative punishment - a type of OC in which the response causes a reinforcer to be taken away (subtracted) from the environment; over time the response becomes less frequent (IE: losing access to a toy, being grounded). Negative reinforcement - a type of OC in which the response causes a punisher or aversive state to be taken away (subtracted) from the environment; over time the response becomes more frequent (IE: you clean up the mess in your kitchen to avoid getting into a fight with your partner).

1.) What are the differences between primary and secondary reinforcers?

Primary - a stimulus like food, water, sex, sleep that has innate biological value to the organism and can function as a reinforcer Secondary - a stimulus like money that has no intrinsic bio value but that has been paired with primary reinforcers or that provides access to primary reinforcers

1.) Identify factors that influence the effectiveness of punishment.

Punishment leads to more variable behavior. ~ According to the law of effect, reinforcement of a particular response R increases the probability that the same response R will occur in the future. In contrast, punishment of R decreases the probability that R will occur in the future, but it does not tell us what response will occur instead of R. In fact, punishment tends to produce variation in behavior, as the organism explores other possible responses. That's okay if the primary goal is simply to eliminate an undesired response (such as training a child not to go near a hot stove), but it's not a particularly good way to train desired behaviors. If the goal of conditioning is to shape behavior in a predetermined way, then reinforcing the desired response generally produces much faster learning than simply punishing alternate, undesired responses. Discriminative stimuli for punishment can encourage cheating. ~ Remember how discriminative stimuli can signal to an organism whether an operant response will be reinforced? Discriminative stimuli can also signal whether a response will be punished. For a speeding driver, the sight of a police car is a discriminative stimulus for punishment: speeding in the presence of this stimulus will probably be punished, but speeding in the absence of a police car will probably not be. In this case, punishment doesn't train the driver not to speed—it only teaches him to suppress speeding in the presence of police cars. When no police car is visible, speeding may resume. Similarly, the dominant male in a group of chimpanzees may punish females for mating with any other males—but when his back is turned, the females often sneak off into the bushes with lower-ranking males. And rats that have been trained to eat no more than four pellets at a time will happily eat all the food in sight if no human is watching. Concurrent reinforcement can undermine the punishment. ~ The effects of punishment can be counteracted if reinforcement occurs along with the punishment. Suppose a rat first learns to press a lever for food but later learns that lever presses are punished by shock. Unless the rat has another way to obtain food, it is likely to keep pressing the lever to obtain food reinforcement, in spite of the punishing effects of shock. Similarly, a child who is reprimanded for talking in class will suppress this behavior much less if the behavior is simultaneously reinforced by approval from classmates. And although a speeding driver risks a hefty ticket, the effects of this punisher may be counteracted by the reinforcing fun of driving fast. Initial intensity matters. ~ Punishment is most effective if a strong punisher is used from the outset. In one study, rats received a shock as they ran through a maze to the goal box (Brown, 1969). The shock was initially delivered at the lowest intensities (1 or 2 volts) and had little effect on behavior. Gradually, across several days, shock intensity increased to 40 volts, and behavior was essentially unaffected, even though naive rats given a 40-volt shock would stop running immediately. Apparently, early weak shocks made rats insensitive to later, stronger ones; the effectiveness of the strong shock was completely undermined by starting weak and working up.

1.) What is latent inhibition? How do attentional approaches to stimulus selection explain latent inhibition?

Latent inhibition refers to the observation that a familiar stimulus (one previously experienced without consequence) takes longer to become associated with a US than a novel stimulus does. Attentional approaches to stimulus selection explain this by proposing that repeated preexposure to a cue with no consequence reduces the attention paid to that cue, so when the cue is later paired with a US, learning about it proceeds more slowly.

6) Describe Mary Calkins's contributions to early research on memory.

She researched associative learning with paired-associate learning (like the first steps in learning a foreign language): participants studied cue-target pairs, where the cue is the item that will later be used to jog memory and the target is the item that should be remembered. She found that greater overlap (relatedness) within cue-target pairs results in better memory (ocean-lake is easier to remember than toaster-desk). Prior familiarity with the materials also improves memory (e.g., learning English-French word pairs is easier than English-Croatian pairs). She also discovered the recency effect in working memory (the observation that memory is usually superior for items at the end of a list).

1.) Describe the relationship between synaptic plasticity and learning.

Synaptic plasticity - the ability of synapses to change as a result of experience; the capacity to reorganize and adapt in response to changing environmental demands. Connections between neurons change during learning.

1.) Describe the organization of the visual, auditory, and somatosensory cortices.

Sensory cortices are areas of the cerebral cortex that process visual stimuli, auditory stimuli, somatosensory (touch) stimuli, and so on. Within each of these brain regions, individual neurons respond to different stimulus features; the range of stimuli that cause a particular cortical neuron to fire is called the neuron's receptive field. The spatial organization (body map) of the somatosensory cortex reflects the fact that neurons with similar receptive fields are often clustered together in sensory cortices.

1.) Describe the procedures of shaping and chaining. Provide an example of each.

Shaping - successive approximations to the desired response are reinforced. (IE: Annie's parents use a shaping procedure when first introducing the potty seat. When they think Annie might be ready to use the potty seat, they put her on it; if she uses it successfully, they reinforce this behavior with praise. Gradually, through a series of progressive approximations, Annie learns to approach the potty and perform the response on her own.) Chaining - organisms are gradually trained to execute complicated sequences of discrete responses. (IE: Skinner once trained a rat to pull a string that released a marble, then to pick up the marble with its forepaws, carry it over to a tube, and drop it inside the tube. Skinner couldn't have trained such a complex sequence of responses all at once; instead, he added "links" to the chain of learned responses one at a time: he first trained the rat to pull the string, then trained it to pull the string and pick up the marble, and so on.)

1.) What are the similarities and differences between classical and instrumental conditioning? Provide a unique example of each.

Similarities - both are processes that lead to learning, and both produce a characteristic learning curve. Differences - in classical conditioning the outcome (US) occurs regardless of whether the response is performed (e.g., a dog salivates to a bell that has been paired with food no matter what the dog does); in operant conditioning the outcome occurs only if the response is performed (e.g., a rat receives food only if it presses the lever).

1.) Explain negative contrast.

A situation in which an organism responds less strongly to a less preferred reinforcer that is provided in place of an expected, preferred reinforcer than it would have if the less preferred reinforcer had been provided all along.

1.) Describe the goal of structural neuroimaging techniques.

Techniques (such as MRI) for creating images of anatomical structures within the living brain; also called "brain imaging" or "brain scanning." They show the size and shape of brain areas as well as brain lesions (areas of damage caused by injury or illness). Structural neuroimaging provides a way not only to directly observe physical properties of a live person's brain but also to track changes in those properties. MRI - magnetic resonance imaging, a method of structural neuroimaging based on recording changes in magnetic fields. DTI - diffusion tensor imaging, a type of MRI that measures the diffusion of water in brain tissue, permitting bundles of axons throughout the brain to be imaged. CT - computed tomography, which produces scans created from multiple x-ray images; by looking at multiple slices, doctors can pinpoint the exact location of internal anatomical structures in three-dimensional space, and a CT can show the location of an abnormality, such as a tumor, with better accuracy than a single x-ray.

1.) Describe how brain regions function and contribute to learning and memory.

The amygdala is involved in fear and fear memories. The hippocampus is associated with declarative, episodic, and recognition memory. The cerebellum plays a role in procedural memories (like how to play the piano). The prefrontal cortex is involved in semantic memory tasks. Declarative memory - long-term memory that stores facts and events (also called conscious or explicit memory). Encoding - the process of converting information into a construct that can be stored in the brain. Consolidation - the act or process of turning short-term memories into more permanent long-term memories.

1.) Describe the role of the cerebellum in conditioned responding.

The cerebellum and its associated circuitry constitute the entire essential neuronal system for classical conditioning of the eyeblink and other discrete responses (such as limb flexion) learned with an aversive unconditioned stimulus (US).

1.) Identify key components of the Rescorla-Wagner model and apply the model to classical conditioning phenomena (walk through an example).

The key idea is that changes in CS-US associations on a trial are driven by the discrepancy (or error) between the animal's expectation/prediction of the US and whether or not the US actually occurred; this is referred to as the prediction error. Rescorla and Wagner proposed that there are three key situations to consider in interpreting a prediction error, as summarized in Table 4.6, and they are very similar to the three ways Herman learned from past errors to improve his tennis serve. The first is a situation in which either no CS or a novel CS is presented followed by a US, so that the US is unexpected; this is a positive prediction error because there is more US than expected. The Rescorla-Wagner model expects the CS-US association to increase in proportion to the degree that the US is surprising; the larger the error, the greater the learning. This makes sense because if you failed to predict the US, you want to increase (move in a positive direction) your likelihood of predicting it in the future (given the same CS). It is similar to what Herman does when his serve is too short and he increases the strength of his serve to send it farther next time. If, however, a well-trained CS is followed by the expected US, there is no error in prediction (the US was fully predicted by prior presentation of the CS), and thus no new learning is expected; this is like Herman making a perfect serve, where he doesn't want to change a thing from what he did last time. Finally, if the CS predicts a US and the US does not occur, the prediction error is negative, and Rescorla and Wagner expect it to be followed by a decrease in the CS-US association; this is like Herman hitting the ball too far and having to reduce the strength of his serve next time. Three assumptions: 1. Each CS has an associative weight, a value representing the strength of association between that cue and the US. 2. The expectation of the US is based on the sum of ALL the weights of ALL the CSs present. 3. Learning is proportional to the prediction error.
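As a worked illustration, here is a minimal simulation sketch of the Rescorla-Wagner update rule, applied to the blocking procedure described earlier (the learning rate, trial counts, and lambda = 1.0 are arbitrary illustrative choices, not values from the text):

```python
def rw_trial(weights, present_cues, us_present, learning_rate=0.2, lam=1.0):
    """One Rescorla-Wagner trial: adjust each presented cue's associative
    weight in proportion to the prediction error (lambda minus the summed
    expectation from all cues present on the trial)."""
    expectation = sum(weights[cue] for cue in present_cues)   # assumption 2
    error = (lam if us_present else 0.0) - expectation        # prediction error
    for cue in present_cues:
        weights[cue] += learning_rate * error                 # assumption 3
    return weights

weights = {"light": 0.0, "tone": 0.0}                         # assumption 1

# Phase 1: light -> shock. The light's weight climbs toward lambda (full prediction).
for _ in range(30):
    rw_trial(weights, ["light"], us_present=True)

# Phase 2: light + tone -> shock. The US is already predicted, so the error is
# near zero and the tone gains almost no associative strength (blocking).
for _ in range(30):
    rw_trial(weights, ["light", "tone"], us_present=True)

print(weights)  # light ends close to 1.0, tone stays near 0.0
```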

1.) What is the Premack Principle?

The theory that the opportunity to perform a highly frequent behavior can reinforce a less frequent behavior; later refined as the response deprivation hypothesis.

6) What are the key ideas from the Cognitive Revolution?

Use of the scientific method in cognitive science research, the necessity of mental systems to process sensory input, the innateness of these systems, and the modularity of the mind

1.) What is the difference between reinforcement and punishment?

With reinforcement, you are increasing the behavior; with punishment, you are decreasing a behavior. All reinforcers (negative or positive) increase the likelihood of a behavioral response. All punishers (positive or negative) decrease the likelihood of a behavioral response.

1.) Identify and define the basic terminology of classical conditioning. Provide a unique example.

A dog will naturally salivate when it sees or smells food; no learning is needed for it to make this response. For this reason, psychologists call the food an unconditioned stimulus, or US, meaning a stimulus that naturally—that is, without conditioning—evokes some response. An unconditioned stimulus, such as food, evokes a natural response, such as salivation, which psychologists call the unconditioned response, or UR; their relationship does not depend on learning. Similarly, Moira's craving for ice cream and Dan's dismay at wet carpets in his house are natural—that is, unconditioned—responses to good and bad things in their lives; they both occur unconditionally, without prior training. In contrast, a neutral stimulus, such as a bell that the dog has not heard before, evokes no such salivation by the dog. After Pavlov put his dogs into the apparatus shown in Figure 4.1, he repeatedly paired the bell with food: each time the bell was rung, an assistant promptly delivered food to the dog. This resulted in the formerly neutral stimulus, the bell, becoming a conditioned stimulus, or CS, as illustrated in Figure 4.2b. After repeated presentations of the bell CS and the food US, the two became linked in the dog's mind. This training—or conditioning, as Pavlov called it—resulted in the dog learning something new: the bell predicts the food. We can evaluate the degree to which the dog learned this prediction—that is, how strongly it expects the food when it hears the bell—by measuring how much the dog salivates to the bell alone. This is shown in Figure 4.2c, in which the bell alone, the conditioned stimulus (CS), now evokes an anticipatory response, called the conditioned response, or CR, even in the absence of the food.

1.) Describe the role played by operant learning in drug addiction.

Addiction may involve not only seeking the "high" but also avoiding the adverse effects of withdrawal from the drug. In a sense, the high provides a positive reinforcement, and the avoidance of withdrawal symptoms provides a negative reinforcement—and both processes reinforce the drug-taking responses. Long-term drug use can also cause physiological changes in the synapse, so that ever-larger doses of the drug are needed to get the same effect.

1.) What is the role of discriminative stimuli in operant learning?

Discriminative stimuli are stimuli that signal whether a particular response will lead to a particular outcome. In other words, they help the learner discriminate or distinguish the conditions where a response will be followed by a particular outcome. This helps in OC because it indicates which behaviors will be reinforced or punished.

1.) Describe the dual-process theory of habituation and sensitization.

Dual process theory - the theory that habituation and sensitization are independent processes that operate in parallel with opposing effects. It suggests that repeated events engage both the process underlying sensitization and the process underlying habituation: both occur in response to every stimulus presentation, and it is the summed combination of these two independent processes that determines the strength of responding. The actual outcome (the strength of the response to a stimulus S) depends on factors like how often S has been repeated and the intensity and recency of any highly arousing event, and also on whether other stimuli have activated the state system.
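To make the "summed combination" idea concrete, here is a toy sketch assuming a simple additive model (the decay and increment values are arbitrary illustrative choices, not parameters from the text):

```python
# Toy dual-process model: every stimulus presentation strengthens a habituation
# process (in the S-R pathway) and, if the stimulus is arousing, a sensitization
# process (in the state system); the observed response is their summed combination.
def simulate(n_trials, arousing=False, habituation_step=0.15,
             sens_boost=0.5, sens_decay=0.8):
    habituation, sensitization = 0.0, 0.0
    responses = []
    for _ in range(n_trials):
        if arousing:                      # arousing stimuli engage the state system
            sensitization += sens_boost
        habituation += habituation_step   # repetition builds habituation
        sensitization *= sens_decay       # sensitization fades with time
        strength = max(0.0, 1.0 - habituation) + sensitization
        responses.append(round(strength, 2))
    return responses

print(simulate(6))                  # weak stimulus: responding declines (habituation dominates)
print(simulate(6, arousing=True))   # arousing stimulus: responding stays elevated (sensitization dominates)
```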

1.) Describe the cortical changes that occur as a result of mere exposure and training.

During development, the arrival of inputs from different sensory receptors determines how cortical neurons become tuned, as well as the proportion of available neurons that respond to a particular class of input. Someone born without vision is likely to have proportionately more neurons available to respond to tactile stimuli, and someone born deaf typically will come to have larger cortical regions sensitive to visual stimuli. Neuroimaging studies suggest that it is relatively easy to retune neurons within the sensory cortices of adults and that it can be done in less than a day. For example, simply touching a person's fingertip repeatedly with tiny pins was shown to improve the person's ability to distinguish subtle differences in the pins' positions. Initially, people were able to discriminate two simultaneous touches on the tip of their index finger as long as the touches were spaced at least 1.1 mm apart (Figure 3.15a). After receiving 2 hours of exposure consisting of repeated simultaneous stimulation of two closely spaced points (0.25-3 mm apart) on the tip of their right index finger, participants' ability to discriminate touches improved. This study shows that humans can learn to make fine distinctions through mere repeated exposures. What's going on in the brain when this happens? Before repeated exposures, fMRI difference images showed that touching the right index finger resulted in localized activation within the somatosensory cortex (Figure 3.15b). After this finger was stimulated repeatedly for 2 hours, subsequent instances of stimulation activated a larger region of the somatosensory cortex than was observed before exposure (Figure 3.15c; Hodzic et al., 2004). Thus, repeated touching of the fingertip led to both perceptual learning and cortical reorganization. The increase in the size of the region of somatosensory cortex that was selectively activated during stimulation of the tip of the right index finger was likely associated with an increase in the number of cortical neurons tuned to touches of the fingertip. Many neuroscientists now believe that all forms of perceptual learning in mammals depend on cortical plasticity. *A larger region of the cortex was being activated after mere exposure.*

1.) Describe the different schedules of reinforcement and their effects on responding. Provide an example of each.

Fixed-ratio (FR) schedule - a reinforcement schedule in which a specific number of responses is required before a reinforcer is delivered; for example, FR 5 means that reinforcement arrives after every 5th response (IE: delivering food pellets to a rat after it presses a bar five times). Variable-ratio (VR) schedule - a reinforcement schedule in which a certain average number of responses is required before a reinforcer is delivered; VR 5 means that, on average, every 5th response is reinforced (IE: gambling/lottery games, where a response is reinforced after an unpredictable number of responses). Fixed-interval (FI) schedule - a reinforcement schedule in which the first response after a fixed amount of time is reinforced; FI 1-min means that reinforcement arrives for the first response made after a one-minute interval since the last reinforcement (IE: a weekly paycheck, dental exams). Variable-interval (VI) schedule - a reinforcement schedule in which the first response after a variable amount of time (with a fixed average) is reinforced; VI 1-min means that the first response after one minute, on average, is reinforced (IE: checking your email, a health inspector stopping by at work, your boss contacting you - not knowing when it's coming). In general, ratio schedules produce higher response rates than interval schedules, and variable schedules produce steadier responding than fixed schedules, which tend to produce pauses after each reinforcement. Concurrent reinforcement schedule - a reinforcement schedule in which the organism can make any of several possible responses, each of which may lead to a different outcome reinforced according to a different reinforcement schedule.
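As a minimal sketch of how ratio schedules decide when to deliver reinforcement (the helper functions below are hypothetical, purely for illustration):

```python
import random

def fixed_ratio(n):
    """FR-n: deliver a reinforcer on exactly every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        return count % n == 0
    return respond

def variable_ratio(n):
    """VR-n: deliver a reinforcer on, on average, every n-th response (unpredictably)."""
    def respond():
        return random.random() < 1.0 / n
    return respond

fr5, vr5 = fixed_ratio(5), variable_ratio(5)
print([fr5() for _ in range(10)])  # reinforced exactly on responses 5 and 10
print([vr5() for _ in range(10)])  # roughly 2 reinforcers, but at unpredictable positions
```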

1.) Describe the role of the basal ganglia (i.e., dorsal striatum) in operant learning

Information from the sensory cortex to the motor cortex can also travel via an indirect route, through the basal ganglia (colored purple in Figure 5.8). The basal ganglia are a collection of ganglia (clusters of neurons) that lie at the base of the forebrain. One part of the basal ganglia is the dorsal striatum (Figure 5.8), which can be further subdivided into the caudate nucleus and the putamen. The dorsal striatum receives highly processed stimulus information from sensory cortical areas and projects to the motor cortex, which produces a behavioral response. The dorsal striatum plays a critical role in operant conditioning, particularly if discriminative stimuli are involved. Rats with lesions of the dorsal striatum can learn operant responses (e.g., when placed in a Skinner box, lever-press R to obtain food O). But if discriminative stimuli are added (e.g., lever-press R is reinforced only in the presence of a light SD), then the lesioned rats are markedly impaired. In humans, too, individuals with damage or disruption to the striatum due to Parkinson's disease or Huntington's disease show deficits in the ability to associate a discriminative stimulus with a correct response. In short, the dorsal striatum appears necessary for S→R learning. S→R associations that depend on the dorsal striatum tend to be relatively automatic or habitual. Remember the well-trained rats, discussed earlier in this chapter, that would run right through a pile of food on their way to a goal box in the maze? That behavior probably reflects S→R learning in the striatum, making the maze-running automatic even when other behaviors (such as pausing to eat) would have resulted in reward. In this case, running is based on a history of learning in which that response resulted in desirable outcomes, but after a long period of training, the response is performed even though the outcome is no longer contingent on that action.

Explain how the VTA is involved in reinforcement.

Later studies identified that rats would work for electrical stimulation in several brain areas, including the ventral tegmental area (VTA), a small region in the midbrain of rats, humans, and other mammals. The electrodes in Olds's original studies were probably stimulating hypothalamic neurons that project to the VTA, so the electrical current was indirectly activating this area. Because VTA stimulation was such a powerful reinforcer, some researchers inferred that the rats "liked" the stimulation, and the VTA and other areas of the brain where electrical stimulation was effective became informally known as "pleasure centers." The VTA (part of the midbrain) contains dopamine-producing neurons that project to the frontal cortex and other brain areas.

1.) Describe the different types of exposure-based learning.

Much of what is known about the learning and memory processes that occur when organisms inspect their surroundings comes from studies of object recognition and spatial navigation. Exposure to objects and places often does not initially lead to obvious changes in behavior; latent learning is the norm. Appropriately designed tests can, however, reveal the short- and long-term effects of repeated exposures. ~ Novel object recognition - an organism's detection of and response to unfamiliar objects during exploratory behavior. ~ Priming - prior exposure to a stimulus can lead to a sense of familiarity the next time that stimulus is observed; even when it does not lead to a sense of familiarity, it can affect the individual's response to a repeated stimulus (or related stimuli), improving the ability to recognize or process the stimulus later. ~ Perceptual learning - learning in which experience with a set of stimuli makes it easier to distinguish those stimuli (IE: sorting chicks by sex; expert chicken sexers can't always verbalize or explain how they know - they just "know"). ~ Spatial learning - the learning of information about one's surroundings; learning where things are in relation to other things.

1.) Describe the role of the orbitofrontal cortex in operant learning.

Several brain areas appear to be involved in learning to predict the outcomes of behavior. Among these are parts of the prefrontal cortex, including the orbitofrontal cortex, which lies at the underside of the front of the brain in primates (Figure 5.8) and which appears to contribute to goal-directed behavior by representing predicted outcomes. The orbitofrontal cortex receives inputs conveying the full range of sensory modalities (sight, touch, sound, etc.) and also visceral sensations (including hunger and thirst), allowing this brain area to integrate many types of information; outputs from the orbitofrontal cortex travel to the striatum, where they can help determine which motor responses are executed. Evidence that the orbitofrontal cortex plays a role in predicting the outcome of responses comes from neuronal recordings.

1.) Explain how the hippocampus can contribute to conditioning.

The hippocampus is not necessary for learning new conditioned responses; for example, animals or humans with hippocampal damage can learn a basic conditioned eyeblink response quite normally. Nevertheless, electrophysiological recordings in animals show that the hippocampus is very active during conditioning, especially early in training. One clue to its role comes from more complex conditioning paradigms such as latent inhibition. If the hippocampus is needed for CS modulation effects in classical conditioning, then an animal without a hippocampus should not exhibit CS modulation effects such as latent inhibition. In fact, this is exactly what researchers have found: removing the hippocampus (and associated cortical input regions) eliminates the latent inhibition effect in classical conditioning of the rabbit eyeblink reflex. If the hippocampus is necessary for latent inhibition and other forms of CS modulation, we might infer that the hippocampus plays a role in determining how sensory cues are processed before they are used by the cerebellum to form long-term memory traces.

1.) Compare and contrast the empiricist and nativist schools of thought.

· Empiricist - believes that we are all born equal, as blank slates, to be shaped by our experiences (John Locke, Aristotle, James); emphasizes nurture.
· Nativist - believes we are shaped primarily by our inherited nature; viewed the body as a machine that works through mechanical/hydraulic principles, with the mind as a separate entity from the body (René Descartes); emphasizes nature.
*Modern researchers are less likely to be strict nativists or empiricists and more likely to accept that both nature (genes) and nurture (experience) play a role in human learning and memory.

1.) Ten tips on improving one's memory.

· Pay attention. If you pay full attention to what you are trying to learn, you'll be more likely to remember it later.
· Create associations. Associate what you're trying to learn with other information you already know.
· A picture is worth a thousand words. Names, dates, etc. are more memorable if you can link them to an image.
· Practice makes perfect. Memories for facts are strengthened by repetition and improved by practice.
· Use multiple senses. Instead of just reading information silently, read it out loud. You can also write it out - writing activates sensory systems and forces you to think about the word you're copying.
· Reduce overload. Use memory aids (post-it notes, calendars, planners) to remember appointments, due dates, etc., freeing you to focus on remembering things that must be called to mind without those aids, like material for an exam.
· Time travel. Remembering facts doesn't depend on remembering the exact time and place where you first acquired them, but if you can't remember a fact, try remembering where you first heard it.
· Get enough sleep. Sleep is important for helping the brain organize and store memories.
· Try a rhyme. Create a poem or song that includes the information.
· Relax. Sometimes trying hard to remember is less effective than turning your attention to something else; often the missing information will pop into your awareness later.

