Appendix B4 - Joint and conditional distributions
Var(a(X)Y + b(X)|X) =
Given X, a(X) and b(X) are essentially constants, so we are really looking at Var(aY + b) = a²Var(Y). Hence Var(a(X)Y + b(X)|X) = a(X)²Var(Y|X)
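A minimal Python sketch (hypothetical choices, not from the source: x fixed at 2, a(x) = 3x, b(x) = x², Y|x standard normal) checking that, once X is fixed, the constant-factor rule gives Var(a(x)Y + b(x)|X = x) = a(x)²Var(Y|X = x):

```python
import numpy as np

# Hypothetical example: condition on X = x, so a(x) and b(x) act as constants.
rng = np.random.default_rng(0)
x = 2.0
a_x, b_x = 3 * x, x**2            # a(x) and b(x) evaluated at the fixed x
y = rng.normal(0, 1, 1_000_000)   # draws of Y given X = x (conditional variance 1)

lhs = np.var(a_x * y + b_x)       # Var(a(x)Y + b(x) | X = x)
rhs = a_x**2 * np.var(y)          # a(x)² Var(Y | X = x)
print(lhs, rhs)                   # the two agree up to simulation noise
```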
Var(X + Y) when X and Y are independent
Var(X) + Var(Y)
Var(X - Y) when X and Y are independent
Var(X) + Var(Y)
If X and Y are independent then Var(Y|x) =
Var(Y)
How does one capture this conditional expectation relationship in econometrics?
We say E(Y|X = x) is a function of x, so m(x) = E(Y|X = x). Usually we assume m(x) is a linear function, so E(Y|x) = a + bx. Similarly, we can insert the random variable X into the function: E(Y|X) = m(X)
CE.2 (property 2): E[a(X)Y + b(X)|X] =
a(X)E(Y|X) + b(X). Given X, a(X) and b(X) are treated like constants
State the Cauchy-Schwarz inequality
|Cov(X,Y)| ≤ sd(X)sd(Y)
Corr(a₁X + b₁, a₂Y + b₂) where a₁a₂ < 0
-Corr(X,Y)
What values can Corr(X,Y) take?
-1 ≤ Corr(X,Y) ≤ 1
Var(Y|X = x) =
E{[Y - E(Y|x)]² | X = x}. Alternatively: E(Y²|x) - [E(Y|x)]². We are only looking at the variance of the Y outcomes when X = x
CE.1 (property 1): E[c(X)|X] =
c(X). This essentially means the function of X, c(X), behaves as a constant when we compute expectations conditional on X
Corr(a₁X + b₁, a₂Y + b₂) where a₁a₂ > 0
Corr(X,Y)
If Cov(X,Y) = 0 then what can you deduce about Corr(X,Y)?
Corr(X,Y) = 0. In fact, Cov(X,Y) = 0 ↔ Corr(X,Y) = 0
If X and Y are independent what can you deduce about Corr(X,Y)?
Corr(X,Y) = 0. Note: X, Y independent → Corr(X,Y) = 0, but the converse is NOT true: Corr(X,Y) = 0 does not imply X, Y independent. Zero correlation means there is no linear relationship between X and Y, and they are said to be uncorrelated random variables
What is the correlation between X and Y?
Corr(X,Y) = Cov(X,Y)/[sd(X)sd(Y)]
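A minimal Python sketch (hypothetical data) checking that Cov(X,Y)/[sd(X)sd(Y)] matches the correlation reported by np.corrcoef and stays within [-1, 1]:

```python
import numpy as np

# Illustrative data with a negative linear relationship (hypothetical setup).
rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = -1.5 * x + rng.normal(size=200_000)

corr_manual = np.cov(x, y, bias=True)[0, 1] / (x.std() * y.std())  # Cov/(sd·sd)
print(corr_manual, np.corrcoef(x, y)[0, 1])  # both negative and inside [-1, 1]
```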
If Cov(X,Y) > 0 then what can you deduce about Corr(X,Y)?
Corr(X,Y) > 0
What can you deduce if X and Y are independent?
Cov(X,Y) = 0. Note: X, Y independent → Cov(X,Y) = 0, but the converse is NOT true: Cov(X,Y) = 0 does not imply X, Y independent
What is the covariance between two random variables X and Y?
Cov(X,Y) = E{[X - E(X)][Y - E(Y)]} = E(XY) - E(X)E(Y). It measures the amount of linear dependence between two random variables. If positive, the variables move in the same direction; if negative, they move in opposite directions
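A minimal Python sketch (hypothetical data) checking that the two covariance formulas agree and that positively related variables give Cov(X,Y) > 0:

```python
import numpy as np

# X and Y move in the same direction here, so Cov(X,Y) should be positive.
rng = np.random.default_rng(2)
x = rng.normal(size=500_000)
y = 0.5 * x + rng.normal(size=500_000)

cov_centered = np.mean((x - x.mean()) * (y - y.mean()))   # E[(X - E(X))(Y - E(Y))]
cov_shortcut = np.mean(x * y) - x.mean() * y.mean()       # E(XY) - E(X)E(Y)
print(cov_centered, cov_shortcut)                          # both near 0.5
```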
CE.3 (property 3): If X and Y are independent E(Y|X) =
E(Y). When X and Y are independent, the expected value of Y given X does not depend on X
CE.4 (property 4): E[E(Y|X)] =
E(Y) - this is known as the law of iterated expectations. For example, with a binary X, E[E(Y|X)] means: find the average of Y within each group, E(Y|X = 1) and E(Y|X = 0), then average those values over the distribution of X, so E[E(Y|X)] = E(Y|X = 1)P(X = 1) + E(Y|X = 0)P(X = 0), and this equals E(Y)
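A minimal Python sketch of the law of iterated expectations with a binary X (hypothetical numbers: P(X = 1) = 0.4, E(Y|X = 1) = 10, E(Y|X = 0) = 5, so E(Y) should be 10(0.4) + 5(0.6) = 7):

```python
import numpy as np

# Simulate a binary X and a Y whose conditional mean depends on X.
rng = np.random.default_rng(3)
n = 1_000_000
x = rng.binomial(1, 0.4, n)                   # P(X = 1) = 0.4
y = np.where(x == 1, rng.normal(10, 1, n),    # E(Y|X = 1) = 10
                     rng.normal(5, 1, n))     # E(Y|X = 0) = 5

# E(Y|X = 1)P(X = 1) + E(Y|X = 0)P(X = 0) versus the overall mean E(Y)
iterated = y[x == 1].mean() * 0.4 + y[x == 0].mean() * 0.6
print(iterated, y.mean())                     # both near 7
```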
What is the conditional expectation?
E(Y|X = x), or E(Y|x) for shorthand, meaning the average of Y given X = x
CE.5 (property 5): If E(Y|X) = E(Y) what can you deduce about Cov(X,Y) and hence Corr(X,Y)?
If E(Y|X) = E(Y), then Y is mean independent of X (not necessarily fully independent), and Cov(X,Y) = Corr(X,Y) = 0. By the contrapositive: Cov(X,Y) ≠ 0 → E(Y|X) ≠ E(Y). But the converse is NOT true: Cov(X,Y) = 0 does not imply E(Y|X) = E(Y)
Consider variables X, Y, Z How can one find E(Y|X) in two steps?
E[E(Y|X,Z)|X]. First find E(Y|X,Z), then find the expected value of E(Y|X,Z) conditional on X
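A minimal Python sketch of the two-step calculation with binary X and Z (hypothetical setup; the inner means E(Y|X,Z) are averaged using P(Z|X)):

```python
import numpy as np

# Simulate Y depending on both X and Z, with Z itself depending on X.
rng = np.random.default_rng(4)
n = 1_000_000
x = rng.binomial(1, 0.5, n)
z = rng.binomial(1, 0.3 + 0.4 * x, n)
y = 1 + 2 * x + 3 * z + rng.normal(size=n)

# Step 1: E(Y|X = 1, Z = z) for z = 0, 1.  Step 2: average over P(Z = z|X = 1).
inner = [y[(x == 1) & (z == k)].mean() for k in (0, 1)]
p_z = [np.mean(z[x == 1] == k) for k in (0, 1)]
two_step = inner[0] * p_z[0] + inner[1] * p_z[1]
print(two_step, y[x == 1].mean())   # both estimate E(Y|X = 1)
```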
If Corr(X,Y) = -1 what can you deduce?
Implies a perfect negative linear relationship between X and Y. Can write Y = aX + b where a < 0
If Corr(X,Y) = 1 what can you deduce?
Implies a perfect positive linear relationship between X and Y. Can write Y = aX + b where a > 0
How does one define conditional expectation?
E(Y|X = x) = ∫ y f(y|x) dy, i.e. the integral of y weighted by the conditional density of Y given X = x (a sum over the conditional probabilities in the discrete case)
Var(aX + bY)
a²Var(X) + b²Var(Y) + 2abCov(X,Y). If X and Y are independent, then Cov(X,Y) = 0 and Var(aX + bY) = a²Var(X) + b²Var(Y)
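A minimal Python sketch (hypothetical a, b and correlated X, Y) checking Var(aX + bY) = a²Var(X) + b²Var(Y) + 2abCov(X,Y):

```python
import numpy as np

# Correlated X and Y, so the 2abCov(X,Y) term matters.
rng = np.random.default_rng(5)
n = 1_000_000
x = rng.normal(size=n)
y = 0.7 * x + rng.normal(size=n)
a, b = 2.0, -3.0

lhs = np.var(a * x + b * y)
rhs = a**2 * np.var(x) + b**2 * np.var(y) + 2 * a * b * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)   # agree up to simulation noise
```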
Cov(a₁X + b₁, a₂Y + b₂)
a₁a₂Cov(X,Y)
If {X₁, ..., Xₙ} are pairwise uncorrelated then Var(a₁X₁ + ... + aₙXₙ) =
a₁²Var(X₁) + ... + aₙ²Var(Xₙ). In summation notation: Var(∑ᵢ aᵢXᵢ) = ∑ᵢ aᵢ²Var(Xᵢ), where the sums run from i = 1 to n
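A minimal Python sketch with three independent (hence pairwise uncorrelated) variables and hypothetical weights, checking that the variance of the weighted sum equals the weighted sum of variances:

```python
import numpy as np

# Independent X₁, X₂, X₃ with sd's 1, 2, 3; the weights a are arbitrary choices.
rng = np.random.default_rng(6)
xs = rng.normal(0, [1.0, 2.0, 3.0], size=(1_000_000, 3))
a = np.array([1.0, -2.0, 0.5])

lhs = np.var(xs @ a)                      # Var(a₁X₁ + a₂X₂ + a₃X₃)
rhs = np.sum(a**2 * np.var(xs, axis=0))   # Σ aᵢ² Var(Xᵢ)
print(lhs, rhs)
```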