Classical Conditioning - Chapter 4 - When does learning occur?
Informational Account of Learning (Rescorla)
-"|" = given; p = probability
-p(US|CS) = p(US|no CS): no information = no learning
-p(US|CS) > p(US|no CS): the CS signals the US = excitatory learning
-p(US|CS) < p(US|no CS): the CS signals safety = inhibitory learning
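A minimal sketch of how these two conditional probabilities could be estimated from trial records; the session data and the `contingency` helper are made up for illustration:

```python
# Estimate p(US|CS) and p(US|no CS) from (cs_present, us_present) trial records.
def contingency(trials):
    cs_trials = [us for cs, us in trials if cs]
    no_cs_trials = [us for cs, us in trials if not cs]
    return sum(cs_trials) / len(cs_trials), sum(no_cs_trials) / len(no_cs_trials)

# Hypothetical session: the US usually follows the CS and rarely occurs alone.
trials = ([(True, True)] * 8 + [(True, False)] * 2 +
          [(False, True)] * 1 + [(False, False)] * 9)
p_cs, p_no_cs = contingency(trials)
print(p_cs, p_no_cs)  # 0.8 > 0.1: the CS signals the US -> excitatory learning
```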
Acquisition
-CS_A - US pairings; α_A = .45; β = .45; λ = 1; V_T = V_A (only one CS present)
-learning stops when V_A = λ (= 1)
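A small simulation of acquisition under the Rescorla-Wagner update, using the parameter values from this card (the 10-trial count is arbitrary):

```python
# Rescorla-Wagner acquisition: V_A climbs toward lambda, and each trial's
# change shrinks as the surprise term (lambda - V_T) shrinks.
alpha_A, beta, lam = 0.45, 0.45, 1.0

V_A = 0.0
for trial in range(1, 11):
    delta = alpha_A * beta * (lam - V_A)  # with a single CS, V_T = V_A
    V_A += delta
    print(f"trial {trial}: dV_A = {delta:.3f}, V_A = {V_A:.3f}")
# dV_A shrinks every trial; learning effectively stops as V_A approaches lambda.
```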
Rescorla's Truly Random Control procedure
-If contiguity were sufficient for learning, the random group (US uncorrelated with the CS) should form a CS-US association just as well as the informative group (CS predicts the US)
-in fact, no learning occurs in the random group
Leon Kamin's element of Surprise
-Kamin argued that you learn only when you are surprised, i.e., when something unexpected happens
-example: on the trip home, you only remember what was unexpected
Attentional Account of Learning
-Mackintosh and Sutherland
-Basic assumptions:
  1. Organisms have a limited attentional capacity (we can fully attend to only one thing at a time)
  2. To learn about a stimulus, you must attend to it
-Why does blocking occur?
  -in phase 1, the animal is trained to attend to CS_A
  -in phase 2, it therefore fails to attend to the added CS_B and does not learn about CS_B
Assumption of the Rescorla-Wagner model: the degree to which you expect a US depends on the conditioned value of the CSs that occur before it
-V = associative strength
-ex: V_A = associative strength of CS_A
-V_A = how much we expect the US when CS_A is present
Assumption of the Rescorla-Wagner model: there may be (and usually is) more than one CS present at any given time; the degree to which you expect the US depends on the net expectation from the entire set of CSs
-V_T = V_A + V_B + V_C + ... + V_N
-V_T = net expectation (see the sketch below)
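A sketch of a single compound-CS trial under this assumption; the current strengths (.3 and .2) and the saliences are made-up values:

```python
# On a compound trial, every CS present is updated using the SAME surprise
# term, computed from the net expectation V_T.
alpha = {"A": 0.45, "B": 0.45}   # salience of each CS (assumed)
beta, lam = 0.45, 1.0
V = {"A": 0.3, "B": 0.2}         # current associative strengths (assumed)

V_T = sum(V.values())            # net expectation: V_A + V_B = 0.5
for cs in V:
    V[cs] += alpha[cs] * beta * (lam - V_T)
print(V)  # each CS gains 0.45 * 0.45 * (1 - 0.5) ≈ 0.101
```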
Rescorla-Wagner model can explain...
-acquisition
-blocking
-overshadowing
-extinction
-conditioned inhibition
-US pre-exposure effect
-also makes novel predictions:
  -protection from extinction
  -overexpectation
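As one example, a sketch showing the model producing blocking; the phase lengths and parameter values are assumed, not taken from a specific experiment:

```python
# Blocking under Rescorla-Wagner: after A alone predicts the US, adding B
# to the compound leaves almost no surprise for B to absorb.
def rw_phase(V, stimuli, alpha, beta, lam, n_trials):
    """Run n_trials in which all CSs in `stimuli` are presented with the US."""
    for _ in range(n_trials):
        V_T = sum(V[s] for s in stimuli)           # net expectation
        for s in stimuli:
            V[s] += alpha[s] * beta * (lam - V_T)  # shared surprise term
    return V

alpha = {"A": 0.45, "B": 0.45}
V = {"A": 0.0, "B": 0.0}
rw_phase(V, ["A"], alpha, 0.45, 1.0, 20)       # phase 1: A - US
rw_phase(V, ["A", "B"], alpha, 0.45, 1.0, 20)  # phase 2: AB - US
print(V)  # V_A is near 1, V_B stays near 0: B is blocked
```

Overshadowing falls out of the same update: run only the compound phase starting from zero and, with equal saliences, A and B each reach only about half of λ.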
Rescorla-Wagner model
-mathematically describes Kamin's notion of surprise
-assumption: the US can support only a fixed amount of expectation (λ)
Equation of the Rescorla-Wagner model
-ΔV_A = α_A β (λ - V_T)
-the change in associative strength of CS_A depends on stimulus salience (α_A), the learning rate parameter (β), and the degree to which you are surprised (λ - V_T)
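The equation as a single function, with two example calls (parameter values chosen to illustrate the surprise term):

```python
def delta_V_A(alpha_A, beta, lam, V_T):
    """One-trial change in CS_A's associative strength."""
    return alpha_A * beta * (lam - V_T)

print(delta_V_A(0.45, 0.45, 1.0, 0.0))  # nothing expected: big change (0.2025)
print(delta_V_A(0.45, 0.45, 1.0, 1.0))  # US fully expected: no change (0.0)
```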
Other components of the Rescorla-Wagner model
-α = salience; ex: α_A = salience of CS_A
-β = learning rate parameter (some stimuli are more readily associated than others)
-Δ = change
Two violations of "Contiguity is Sufficient"
1. Overshadowing
2. Blocking
λ - V_T...
= degree of surprise
-λ represents what you actually get
-V_T represents what you expected to get
-examples:
  -suppose V_T = 0 (you expect nothing) but the US occurs: λ - V_T = λ, a big surprise
  -suppose V_T = λ (the US is fully expected): λ - V_T = 0, no surprise and no further learning
λ (lambda)
the asymptote of conditioning that the US is capable of supporting
Informational account
can explain contingency effects and conditioned inhibition
Attentional account
can explain overshadowing and blocking
In overshadowing and blocking...
contiguity is present between the CS and the US, yet little or no learning accrues to that CS
Need a unified theory to integrate...
the attentional and informational accounts