Lecture 6 - Multiple Linear Regression


5) Including Control variables: Are more controls better than fewer? All estimates are biased if one explanatory variable is ..........

endogenous (correlated with the error term)

6) Multicollinearity: #creating additional control variables: when we add more variables, the variance ......, R^2 ......

the variance rises, and R^2 rises

3) Omitted variable bias: give the sign of the bias in each case: 1) β2 > 0, corr(x1,x2) < 0 = ? 2) β2 > 0, corr(x1,x2) > 0 = ? 3) β2 < 0, corr(x1,x2) > 0 = ? 4) β2 < 0, corr(x1,x2) < 0 = ?

1) β2 > 0, corr(x1,x2) < 0 = Negative Bias 2) β2 > 0, corr(x1,x2) > 0 = Positive Bias 3) β2 < 0, corr(x1,x2) > 0 = Negative Bias 4) β2 < 0, corr(x1,x2) < 0 = Positive Bias
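A quick way to internalize the sign rules is to simulate one case. The numpy sketch below is my own illustration (all parameter values are made up): it sets up β2 > 0 with corr(x1, x2) < 0, omits x2 from the regression, and shows the short-regression slope landing below the true β1 = 2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Case 1: beta2 > 0 and corr(x1, x2) < 0, so the bias should be negative.
x1 = rng.normal(size=n)
x2 = -0.5 * x1 + rng.normal(size=n)   # negatively correlated with x1
u = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + u     # true beta1 = 2, beta2 = 3 > 0

# Short regression of y on x1 alone, omitting x2.
X = np.column_stack([np.ones(n), x1])
beta_tilde = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_tilde[1])  # about 0.5 = 2 + 3 * (-0.5): negative bias, as predicted
```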

7) Properties of OLS residuals: what are the 2 main properties?

1. Sum of residuals = 0: ∑u^i = 0 2. Sum of residuals × each explanatory variable = 0: ∑xij u^i = 0 for every j
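Both properties are easy to verify numerically. A minimal sketch (my own toy data, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # intercept + 3 regressors
y = X @ np.array([1.0, 0.5, -2.0, 3.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u_hat = y - X @ beta_hat   # OLS residuals

print(u_hat.sum())         # ~0: residuals sum to zero (requires an intercept)
print(X.T @ u_hat)         # all ~0: residuals are orthogonal to every regressor
```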

1) Zero conditional mean assumption: What is this?

E(u|x1, x2, ..., xk) = 0. The expected value of the error term conditional on the explanatory variables is equal to zero.

3) Omitted variable bias: we can also model the relationship between the explanatory variables as x2 = δ~0 + δ~1x1, so that β~1 = β^1 + β^2δ~1. Show that β~1 is a biased estimator.

E[β~1] = E[β^1 + β^2δ~1] = E[β^1] + E[β^2]δ~1 = β1 + β2δ~1 (conditioning on the sample, so δ~1 is fixed), therefore bias(β~1) = E[β~1] - β1 = β2δ~1
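The same chain in display math (conditioning on the sample x's, so δ~1 is a fixed number):

```latex
\begin{aligned}
E[\tilde\beta_1] &= E[\hat\beta_1 + \hat\beta_2\tilde\delta_1]
                  = E[\hat\beta_1] + E[\hat\beta_2]\,\tilde\delta_1
                  = \beta_1 + \beta_2\tilde\delta_1 ,\\
\operatorname{Bias}(\tilde\beta_1) &= E[\tilde\beta_1] - \beta_1
                  = \beta_2\tilde\delta_1 .
\end{aligned}
```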

6) Multicollinearity: sampling variance of OLS slope estimates: var(β^j) = σ^2 / [SSTj(1 - R^2j)] what is R^2j?

R^2j = the R^2 from regressing explanatory variable xj on all the other explanatory variables
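R^2j can be computed directly by running that auxiliary regression. A small numpy sketch (my own example data; the helper r_squared_j is a name I made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)   # x2 correlated with x1
x3 = rng.normal(size=n)                    # independent of both

def r_squared_j(xj, others):
    """R^2 from regressing xj on an intercept plus the other regressors."""
    X = np.column_stack([np.ones(len(xj))] + others)
    fitted = X @ np.linalg.lstsq(X, xj, rcond=None)[0]
    return 1 - ((xj - fitted) ** 2).sum() / ((xj - xj.mean()) ** 2).sum()

r2_1 = r_squared_j(x1, [x2, x3])
print(r2_1)            # about 0.64 = corr(x1, x2)^2, since x3 is unrelated
print(1 / (1 - r2_1))  # the variance inflation factor (VIF) for x1
```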

8) goodness of fit: R^2 = ...........

R^2 = SSE/SST = 1 - SSR/SST. The fraction of the sample variation in y that is explained by all the explanatory variables (R^2) is given by the explained sum of squares (SSE) divided by the total sum of squares (SST), which equals 1 minus the residual sum of squares (SSR) divided by the total sum of squares.
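Both expressions give the same number because SST = SSE + SSR for OLS with an intercept. A quick numerical check (toy data of my own):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ beta_hat

sst = ((y - y.mean()) ** 2).sum()      # total sum of squares
sse = ((y_hat - y.mean()) ** 2).sum()  # explained sum of squares
ssr = ((y - y_hat) ** 2).sum()         # residual sum of squares

print(sse / sst, 1 - ssr / sst)        # identical, as the formula claims
```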

6) Multicollinearity: sampling variance of OLS slope estimates: if xj is not explained at all by the other explanatory variables, then R^2j = ...

R^2j = 0

6) Multicollinearity: sampling variance of OLS slope estimates: if xj is perfectly explained by the other explanatory variables, then R^2j = ...

R^2j = 1 (this is perfect collinearity, and the sampling variance blows up)

8) goodness of fit: when we add an extra explanatory variable, what happens to R^2?

R^2 increases (it can never decrease)

8) goodness of fit: define R^2

R^2 is the fraction of the sample variation in y that is explained by all the explanatory variables

8) goodness of fit: what is the equation for the adjusted R^2?

R̄^2 = 1 - (SSR/(n-k-1)) / (SST/(n-1))
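A sketch of the formula in code, with made-up numbers chosen to show the penalty at work: adding a regressor that barely reduces SSR raises plain R^2 but lowers adjusted R^2.

```python
def adjusted_r2(ssr, sst, n, k):
    """Adjusted R^2 from the residual and total sums of squares,
    with n observations and k slope coefficients."""
    return 1 - (ssr / (n - k - 1)) / (sst / (n - 1))

print(adjusted_r2(ssr=40.0, sst=100.0, n=50, k=2))  # ~0.583
print(adjusted_r2(ssr=39.9, sst=100.0, n=50, k=3))  # ~0.575, lower despite smaller SSR
```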

6) Multicollinearity: sampling variance of OLS slope estimates: var(β^j) = σ^2 / [SSTj(1 - R^2j)] what is SSTj?

SSTj = the total sum of squares of explanatory variable xj: SSTj = ∑(xij - x̄j)^2

3) Omitted variable bias - example: we are studying the effect of household disposable income on the amount a household donates to charitable organizations. Assume the Gauss-Markov assumptions hold for donations = γ0 + γ1hh_inc + γ2TV + u, where hh_inc = household income and TV = hours of TV watched per week. We instead estimate donations = α0 + α1hh_inc + u. Under which condition will the OLS estimator α^1 be biased? a) Watching more TV is associated with lower incomes but increases donations; b) The amount of TV is unrelated to income but increases donations; c) Household income and household donations are not related to the amount of TV watched; d) Income has a positive impact on how much a household donates but has no correlation with TV. Explain which one.

a) Watching more TV is associated with lower incomes but increases donations: corr(hh_inc, TV) < 0 and γ2 > 0, so α^1 has a negative bias
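Mapping the example onto the general bias formula, where δ1 (a label introduced just for this recap) is the slope from regressing TV on hh_inc:

```latex
\operatorname{Bias}(\hat\alpha_1) = \gamma_2\,\delta_1,
\qquad \gamma_2 > 0 \ \text{(TV raises donations)},
\quad \delta_1 < 0 \ \text{(more TV goes with lower income)}
\;\Rightarrow\; \operatorname{Bias}(\hat\alpha_1) < 0 .
```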

5) Including Control variables: Why does including irrelevant variables leave our estimates unbiased?

because the population coefficient on an irrelevant variable is 0, so no bias arises; the variable only throws noise into the estimation

3) Omitted variable bias: If experience does have an effect on wages and is correlated with education, then the estimator will be ............

biased

1) Zero conditional mean assumption: the ZCMA is the key assumption for causal inference and is easily violated if ..................

if there are omitted variables

6) Multicollinearity: #creating additional control variables: when we add more variables, what happens to the standard error?

increases

6) Multicollinearity: #creating additional control variables: when we have a large standard error we get estimates further away from the ......

mean

7) Properties of OLS residuals: 1. Sum of residuals = 0: ∑u^i = 0 2. Sum of residuals × each explanatory variable = 0: ∑xij u^i = 0. We arrive at these with what proof?

a method of moments argument (the properties are the OLS first-order conditions)
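In brief: OLS imposes the sample analogues of the population moment conditions E[u] = 0 and E[xj u] = 0, and those sample analogues are exactly the two residual properties:

```latex
E[u] = 0 \;\Rightarrow\; \frac{1}{n}\sum_{i=1}^{n}\hat u_i = 0,
\qquad
E[x_j u] = 0 \;\Rightarrow\; \frac{1}{n}\sum_{i=1}^{n} x_{ij}\,\hat u_i = 0
\quad (j = 1,\dots,k).
```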

6) Multicollinearity: #creating additional control variables: when we have a large standard error we get estimates further away from the mean. What causes this problem?

multicollinearity

8) goodness of fit: what is Adjusted R^2?

it penalizes extra explanatory variables, which allows us to make fair comparisons between models

6) Multicollinearity: this is not to be confused with perfect collinearity. define multicollinearity

when some explanatory variables are highly, but not perfectly, correlated with each other

1) Zero conditional mean assumption: we need this assumption for what?

the unbiasedness of our OLS estimator

1) Zero conditional mean assumption: if E[u|xj] ≠ 0 then ............

then we say xj is an endogenous explanatory variable: it is correlated with the error term

6) Multicollinearity: #creating additional control variables: what happens to the value for the OLS estimators when we add many variables?

the estimated coefficient values fall

1) Zero conditional mean assumption: when this assumption holds we have exogenous explanatory variables - explain

they are uncorrelated with the error term

6) Multicollinearity: sampling variance of OLS slope estimates: var(β^j) = ...................

var(β^j) = σ^2 / [SSTj(1 - R^2j)]
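The formula can be checked against the matrix expression σ^2(X'X)^(-1) for the same coefficient. A numpy sketch with my own simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(size=n)      # correlated regressors
X = np.column_stack([np.ones(n), x1, x2])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u_hat = y - X @ beta_hat
sigma2_hat = (u_hat @ u_hat) / (n - 3)   # k = 2 slopes, so df = n - k - 1

# Textbook formula: sigma^2 / (SST_1 * (1 - R^2_1))
sst1 = ((x1 - x1.mean()) ** 2).sum()
Z = np.column_stack([np.ones(n), x2])    # auxiliary regression of x1 on x2
resid1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r2_1 = 1 - (resid1 ** 2).sum() / sst1
print(sigma2_hat / (sst1 * (1 - r2_1)))

# Same number from the matrix formula sigma^2 (X'X)^{-1}, entry for x1
print((sigma2_hat * np.linalg.inv(X.T @ X))[1, 1])
```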

6) Multicollinearity: If education and experience (x1, x2) are correlated, we have issues. High variance means what?

we do not have enough independent information in the sample to estimate the parameter precisely

5) Including Control variables: why do we add more variables to a regression?

we want to eliminate violations of the ZCM assumption: we want all our variables to be exogenous. By taking factors out of the error term and putting them into the model, we decrease the chance of endogeneity.

3) Omitted variable bias: we can also model the relationship between explanatory variables as what?

x2 = δ~0 + δ~1x1

6) Multicollinearity: how do you solve multicollinearity?

you take the highly correlated control variables out of the regression, keeping the variable whose effect you want to look at

3) Omitted variable bias: take y^ = β^0 + β^1x1 + β^2x2, with y = wage, x1 = education, x2 = experience. We don't observe x2, so we instead estimate what?

y~ = β~0 + β~1x1

2) OLS estimator for R: If we have a semi-elasticity model, how do we show the impact of an additional unit of an explanatory variable on the dependent variable?

β^1 × 100 (the approximate percentage change in the dependent variable from a one-unit increase in the explanatory variable)
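Written out for a log-level model, with illustrative numbers of my own: if log(wage) = β0 + β1educ + u and β^1 = 0.08, one extra year of education raises the wage by roughly 100 × 0.08 = 8%.

```latex
\log(y) = \beta_0 + \beta_1 x + u
\;\Rightarrow\;
\%\Delta y \approx (100\,\beta_1)\,\Delta x .
```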

3) Omitted variable bias: we can also model the relationship between explanatory variables as x2 = δ~0 + δ~1x1; then β~1 = ..................

β~1 = β^1 + β^2δ~1

3) Omitted variable bias: what will show us how much our estimator for education is biased?

β^2 × δ~1 (the coefficient on the omitted variable multiplied by the slope from regressing x2 on x1)

6) Multicollinearity: sampling variance of OLS slope estimates: var(β^j) = σ^2 / [SSTj(1 - R^2j)] what is σ^2?

σ^2 = variance of error term

