Econometrics Final


What is the meaning of Z in combating the effect of Heteroskedasticity?

The variable Z is a proportionality factor because the variance of the error term changes proportionally to Z.

What is the equation of the Binomial Logit Model?

Pi = 1 / (1 + e^-(B0 + B1X1 + B2X2))
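
A minimal sketch (plain Python, hypothetical coefficient and data values) showing how this formula maps any linear combination of the Xs into a probability strictly between 0 and 1:

    import math

    b0, b1, b2 = -4.0, 0.05, 1.2   # hypothetical logit estimates
    x1, x2 = 60, 1                 # hypothetical observation

    z = b0 + b1 * x1 + b2 * x2
    pi = 1 / (1 + math.exp(-z))    # Pi = 1 / (1 + e^-(B0 + B1X1 + B2X2))
    print(round(pi, 3))            # always strictly between 0 and 1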

What are the Cochrane-Orcutt and Prais-Winsten methods? How are they used and what for?

Both are techniques used to estimate GLS equations: two-step (or iterative) procedures that rid an equation of first-order serial correlation by first producing an estimate of p from the residuals and then using that p to estimate the GLS (quasi-differenced) equation. Prais-Winsten transforms and keeps the first observation, while Cochrane-Orcutt drops it.

What is a dummy dependent variable? What are 2 examples?

Also known as qualitative dependent variables, these include things like why some states have female governors and others don't, or what determines Pepsi vs. Coke drinkers.

What are the 3 properties of 2SLS?

2SLS estimates are still biased, but this bias is generally smaller than the bias from OLS. If the fit of the reduced-form equation is poor, then 2SLS will not rid the equation of bias. 2SLS estimates have increased variances and SE(B)s.

What is Multicollinearity?

A linear correlation between two or more independent variables

What is the linear probability model and how do we interpret its coefficients?

A linear-in-the-coefficients equation used to explain a dummy dependent variable. A predicted Di of 0.10 can best be thought of as a state in which there is a 10% chance that the governor will be female, based on the state's values for the independent variables. A coefficient in a linear probability model is an estimate of the change in the probability that Di = 1 caused by a one-unit increase in the independent variable in question, holding the other independent variables constant.
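
A minimal, self-contained sketch (synthetic data and hypothetical variable names, with the statsmodels library assumed) showing that the linear probability model is simply OLS run on a 0/1 dependent variable:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    X1 = rng.normal(50, 10, n)                 # hypothetical continuous regressor
    X2 = rng.binomial(1, 0.4, n)               # hypothetical dummy regressor
    D = (0.01 * X1 + 0.3 * X2 + rng.normal(0, 0.4, n) > 0.6).astype(int)  # dummy Di

    X = sm.add_constant(np.column_stack([X1, X2]))
    lpm = sm.OLS(D, X).fit()

    # Each slope is the estimated change in the probability that Di = 1 from a
    # one-unit increase in that X, holding the other X constant.
    print(lpm.params)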

What is 2SLS and how does it work?

A method of avoiding simultaneity bias by systematically creating variables to replace the endogenous variables where they appear as explanatory variables in simultaneous equations systems. It runs an OLS regression on the reduced form of every right-side endogenous variable and then uses the fitted Y from the estimated reduced-form equation in place of the endogenous variable where it appears on the right side of a structural equation.

What is the order condition? How do we use it to determine if an equation is identified?

A systematic method of determining whether a particular equation in a simultaneous system has the potential to be identified. A necessary condition for an equation to be identified is that the number of predetermined variables in the system be greater than or equal to the number of slope coefficients in the equation of interest. For each equation in the system, we must therefore determine the number of predetermined variables in the entire simultaneous system and the number of slope coefficients estimated in the equation in question.

What is a Simultaneous equation? Give an example of one.

A two-way relationship between variables (supply and demand, population size and food supply, wages and prices): Y is jointly determined with at least one of the Xs, so the Xs affect Y and Y in turn affects at least one of the Xs.

What is an instrumental variable, how is it used, and what conditions must it satisfy?

An instrumental variable is a variable that is highly correlated with the endogenous variable and uncorrelated with the error term. It is a method of avoiding the violation of classical assumption III by producing predicted values of the endogenous variables that can be substituted for the endogenous variables where they appear on the right hand side of structural equations.

What is 'Maximum Likelihood' (ML) and how is it used in the Binomial Logit model?

An iterative estimation technique that is especially useful for equations that are nonlinear in the coefficients, such as the logit. ML chooses the coefficient estimates that maximize the likelihood of observing the sample's values of the dependent variable.

What are the two tests for Heteroskedasticity?

Breusch-Pagan and White Test

What type of bias in the standard errors does Heteroskedasticity cause?

It causes OLS estimates of the SE(B)s to be biased downward (too small). This results in t-scores that are too high, which increases the likelihood that we reject Ho: we are more likely to make a Type I error, more likely to keep an irrelevant variable in the equation, and more likely to think an estimate is more precise than it really is (the confidence interval shrinks).

What does centering explanatory variables do and not do in alleviating multicollinearity?

Centering explanatory variables (subtracting the mean from each variable) does not change the regression output: it leaves the same R^2 and exactly the same predictions. What it does do is lower the correlations between the independent variables, which decreases the measured multicollinearity and lowers the VIFs.

What are the 3 ways we can use the coefficients to measure the impact of an independent variable on the probability that Di=1?

1. Change an average observation. 2. Use a partial derivative. 3. Use a rough estimate of 0.25 (multiply the coefficient by 0.25).
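
A hedged sketch of all three approaches with hypothetical logit estimates (b0, b1) and a hypothetical mean value of the X in question:

    import math

    b0, b1 = -4.0, 0.08        # hypothetical logit coefficients
    x_bar = 50                 # hypothetical mean of X

    p_bar = 1 / (1 + math.exp(-(b0 + b1 * x_bar)))        # P at the average observation

    # 1. Change an average observation: recompute P after a one-unit increase in X.
    p_new = 1 / (1 + math.exp(-(b0 + b1 * (x_bar + 1))))
    print(p_new - p_bar)

    # 2. Partial derivative: dP/dX = B1 * P * (1 - P), evaluated at the average observation.
    print(b1 * p_bar * (1 - p_bar))

    # 3. Rough estimate: multiply the coefficient by 0.25.
    print(b1 * 0.25)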

What is the idea of a structural equation?

Structural equations characterize the underlying economic theory behind each endogenous variable by expressing it in terms of both endogenous and exogenous variables; together they form the feedback loops of the system.

What is the Lagrange multiplier (Breusch Godfrey) and what does it do?

Checks for serial correlation by analyzing how well the lagged residuals explain the residual of the original equation in an equation that also includes all of the explanatory variables of the original model.

In what type of model are we more likely to see the effects of heteroskedasticity?

Cross-sectional, but can be time-series

What do the numerical outputs of d mean?

d = 0: extreme positive serial correlation. d = 4: extreme negative serial correlation. d = 2: no serial correlation.

What effect does a dummy dependent variable have on the R^2 of the regression? How can we obtain a more accurate measure of fit?

It decreases R^2 significantly. A more accurate measure is R^2p: the average of the percentage of 1s predicted correctly and the percentage of 0s predicted correctly.

What does the Durbin Watson test determine and how does it do it?

Determines if there is first-order serial correlation in the error term of an equation by examining the residuals of a particular estimation of that equation.

What are the 3 reasons for using a reduced form equation?

Reduced-form equations do not violate Assumption III and can be estimated using OLS. Their coefficients can be interpreted as impact multipliers, so they have economic meaning and useful applications. They play a crucial role in the estimation technique 2SLS.

What are remedies of multicollinearity?

1. Do nothing. 2. Drop a redundant variable (based on theory, not numbers). 3. Increase the sample size.

What is a reduced form equation?

Equations that express a particular endogenous variable solely in terms of an error term and all the predetermined variables in the simultaneous system.

What are the two steps to using the Prais-Winsten method?

1. Estimate p by running a regression based on the residuals of the equation suspected of having serial correlation: et = p*et-1 + ut. 2. Use this p to estimate the GLS equation by substituting p into Equation 9.21 and using OLS to estimate 9.21 with the adjusted data. Equation 9.21: Yt - p*Yt-1 = B0(1 - p) + B1(X1t - p*X1t-1) + ut.
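
A minimal sketch of the two-step idea with synthetic data (statsmodels assumed; this version drops the first observation, Cochrane-Orcutt style, whereas Prais-Winsten transforms and keeps it):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    T = 100
    x = rng.normal(0, 1, T)
    e = np.zeros(T)
    for t in range(1, T):                       # AR(1) errors: e_t = 0.7*e_(t-1) + u_t
        e[t] = 0.7 * e[t - 1] + rng.normal(0, 1)
    y = 2.0 + 1.5 * x + e

    # Step 1: estimate p from the OLS residuals (et = p*et-1 + ut).
    ols = sm.OLS(y, sm.add_constant(x)).fit()
    res = ols.resid
    rho = sm.OLS(res[1:], res[:-1]).fit().params[0]

    # Step 2: quasi-difference the data and re-estimate by OLS (Equation 9.21).
    y_star = y[1:] - rho * y[:-1]
    x_star = x[1:] - rho * x[:-1]
    gls = sm.OLS(y_star, sm.add_constant(x_star)).fit()
    print(rho, gls.params)   # the intercept estimates B0*(1 - p); the slope estimates B1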

What are the 5 consequences of multicollinearity? What is the principal consequence?

1. Estimates will remain unbiased. 2. The variances and standard errors of the estimates will increase (the principal consequence). 3. The computed t-scores will fall. 4. Estimates will become very sensitive to changes in the specification. 5. The overall fit of the equation and the estimation of the coefficients of nonmulticollinear variables will be largely unaffected.

What is the Binomial Probit model and how does it work?

An estimation technique for equations with dummy dependent variables that avoids the unboundedness problem by using a variant of the cumulative normal distribution.

What does it mean for an equation to be exactly identified, underidentified, and overidentified?

Exactly identified: # of predetermined variables = # of slope coefficients. Overidentified: # of predetermined variables > # of slope coefficients. Underidentified: # of predetermined variables < # of slope coefficients. 2SLS cannot be applied to an underidentified equation.

What is simultaneity bias? What is a real world example of it?

Simultaneity bias means the expected value of the OLS-estimated Bs is not equal to the true Bs, because a right-side endogenous variable is correlated with the error term. A real-world example is any supply and demand market, where price and quantity are determined simultaneously; the bias can be demonstrated with a Monte Carlo experiment.

What is first-order serial correlation and second-order?

First-order serial correlation means the current value of the error term is a function of the error term from the previous time period. Second-order means it is a function of the error terms from the previous two time periods.

What is Heteroskedasticity and what classical assumption does it violate?

Heteroskedasticity violates classical assumption V, that observations of the error term are drawn from a distribution that has constant variance.

When do we use the White test over the Breusch-Pagan test? How does it work?

If we are not certain which variables are the proportionality factors, use the White test. It investigates the possibility of heteroskedasticity by seeing whether the squared residuals can be explained by the equation's independent variables, their squares, and their cross-products.

What is a predetermined variable?

Includes all exogenous variables and any lagged endogenous variables

What is the identification problem?

It is a precondition for the application of 2SLS to equations in simultaneous systems; a structural equation is identified only when enough of the system's predetermined variables are omitted from the equation in question to allow that equation to be distinguished from all the others in the system.

What is the Binomial Logit Model and what is it used for?

It is an estimation technique for equations with dummy dependent variables that avoids the unboundedness problem of the linear probability model by using a variant of the cumulative logistic function. It is nonlinear in the coefficients.
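
A minimal sketch (synthetic data, statsmodels assumed) of estimating a binomial logit by maximum likelihood; the fitted probabilities are bounded between 0 and 1:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 300
    X1 = rng.normal(0, 1, n)
    p_true = 1 / (1 + np.exp(-(-0.5 + 1.2 * X1)))
    D = rng.binomial(1, p_true)

    logit = sm.Logit(D, sm.add_constant(X1)).fit(disp=0)   # iterative ML estimation
    print(logit.params)
    print(logit.predict().min(), logit.predict().max())    # all predictions lie in (0, 1)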

What effect does impure serial correlation have on the bias of SE(B) and t-scores? How does it impact the likelihood to reject or fail to reject Ho, and what does rejecting the Ho mean?

It biases the SE(B)s downward and the t-scores upward, making it more likely that we reject Ho (e.g., Ho: B <= 0). Rejecting Ho means concluding the coefficient is statistically significant, so we are more likely to keep an irrelevant variable in the equation.

What classical assumption does multicollinearity violate?

It violates classical assumption VI that no explanatory variable is a perfect linear function of other explanatory variables

What is an example in which multicollinearity will occur?

Kilometers and Miles, percent of voters who did vote vs percent who didn't

How do we determine if we reject or fail to reject Ho based on the Lagrange multiplier?

LM = N*R^2, where R^2 comes from the auxiliary regression. If LM > the critical chi-square value from the table, reject Ho and conclude there is serial correlation.
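
A hedged sketch (synthetic data) using statsmodels' Breusch-Godfrey routine, whose first return value is the LM = N*R^2 statistic:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import acorr_breusch_godfrey

    rng = np.random.default_rng(3)
    x = rng.normal(0, 1, 120)
    y = 1.0 + 0.5 * x + rng.normal(0, 1, 120)

    results = sm.OLS(y, sm.add_constant(x)).fit()
    lm_stat, lm_pvalue, f_stat, f_pvalue = acorr_breusch_godfrey(results, nlags=1)
    print(lm_stat, lm_pvalue)   # small p-value: reject Ho of no serial correlation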

What are the possible remedies of serial correlation?

First look at the specification (impure serial correlation is best cured by fixing the specification). If the serial correlation is pure, use GLS (or Newey-West standard errors).

What is the Breusch-Pagan test and how does it work?

Method for testing for heteroskedasticity in the error term by investigating whether the squared residuals can be explained by possible proportionality factors.

What is Generalized Least Squares (GLS) and what is it used for?

A method of ridding an equation of pure first-order serial correlation and, in the process, restoring the minimum-variance property to its estimation. Result: the error term is no longer serially correlated, so OLS on the transformed equation is minimum variance. The slope coefficient B1 is the same as the slope coefficient of the original serially correlated equation. The dependent variable has changed relative to the original equation, so the GLS R^2 is not comparable to the OLS R^2.

What are the 3 steps to determining positive serial correlation? What do the outcomes of the d-stat relative to its lower and upper bounds indicate in terms of rejecting or failing to reject the Ho?

1. Obtain the OLS residuals from the equation and calculate the d-statistic. 2. Determine the sample size and the number of explanatory variables and consult the table to find the upper and lower critical d values, dU and dL. 3. Set up the one-sided test Ho: p <= 0 (no positive serial correlation) vs. Ha: p > 0 (positive serial correlation). If d < dL, reject Ho; if d > dU, fail to reject Ho; if dL <= d <= dU, the test is inconclusive.
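
A minimal sketch (synthetic data, statsmodels assumed): compute the d-statistic from the OLS residuals, then compare it by hand to the dL/dU critical values from the table:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson

    rng = np.random.default_rng(4)
    x = rng.normal(0, 1, 50)
    y = 1.0 + 0.5 * x + rng.normal(0, 1, 50)

    resid = sm.OLS(y, sm.add_constant(x)).fit().resid
    d = durbin_watson(resid)
    print(d)   # near 2: no serial correlation; near 0: positive; near 4: negative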

What are the 3 steps to performing a Breusch-Pagan test?

1. Obtain the residuals from the estimated regression equation. 2. Use the squared residuals as the dependent variable in an auxiliary equation. 3. Test the overall significance of the auxiliary equation with a chi-square test: Ho: B1 = B2 = 0 (homoskedasticity); Ha: Ho is false.
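
A hedged sketch (synthetic heteroskedastic data, statsmodels assumed); het_breuschpagan takes the residuals and the candidate proportionality factors, here the regression's own regressors:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan

    rng = np.random.default_rng(5)
    z = rng.uniform(1, 10, 200)
    y = 3.0 + 0.8 * z + rng.normal(0, 1, 200) * z    # error variance grows with z

    X = sm.add_constant(z)
    results = sm.OLS(y, X).fit()
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(results.resid, X)
    print(lm_stat, lm_pvalue)   # small p-value: reject Ho of homoskedasticity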

What are the 3 steps to running the White test?

1. Obtain the residuals of the estimated regression equation. 2. Estimate an auxiliary regression equation, using the squared residuals as the dependent variable, with each X from the original equation, the square of each X, and the product of each X with every other X as the explanatory variables. 3. Test the overall significance of the auxiliary equation with a chi-square test.
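
A hedged sketch (synthetic data, statsmodels assumed); het_white builds the auxiliary regression with the Xs, their squares, and their cross-products automatically:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_white

    rng = np.random.default_rng(6)
    x1 = rng.uniform(1, 10, 200)
    x2 = rng.uniform(1, 10, 200)
    y = 2.0 + 0.5 * x1 + 0.3 * x2 + rng.normal(0, 1, 200) * x1   # variance depends on x1

    X = sm.add_constant(np.column_stack([x1, x2]))
    results = sm.OLS(y, X).fit()
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(results.resid, X)
    print(lm_stat, lm_pvalue)   # compare N*R^2 to the chi-square critical value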

What is the special case of perfect multicollinearity and the effect of a 'dominant' variable?

Perfect multicollinearity occurs when a variable that is definitionally related to the dependent variable is included as an independent variable in a regression equation. A dominant variable is so highly correlated with the dependent variable that it completely masks the effects of all the other independent variables in the equation.

What is the meaning of p and what do each of its values indicate?

p is the first-order autocorrelation coefficient: it measures the functional relationship between the value of an observation of the error term and the value of the previous observation of the error term, with -1 <= p <= 1. If p = 0, there is no serial correlation. As p approaches 1 in absolute value, the previous observation of the error term becomes more important in determining the current value, and a high degree of serial correlation exists. A negative p means the error term tends to switch signs in consecutive time periods.

What's the difference between perfect and imperfect multicollinearity?

Perfect multicollinearity occurs when an independent variable can be completely explained by the movement of one or more other independent variables. Imperfect multicollinearity is a linear functional relationship between two or more independent variables that is so strong that it can significantly affect the estimation of the coefficients of the variables. Perfect multicollinearity is rare; imperfect is much more common.

What are the 3 consequences of Heteroskedasticity?

Pure heteroskedasticity does not cause bias in the coefficient estimates. Heteroskedasticity typically causes OLS to no longer be the minimum-variance estimator. Heteroskedasticity causes the OLS estimates of the SE(B)s to be biased, leading to unreliable hypothesis testing and confidence intervals.

What is the difference between pure and impure Heteroskedasticity?

Pure heteroskedasticity is heteroskedasticity in the error term of a correctly specified regression equation. Impure heteroskedasticity is caused by a specification error (e.g., an omitted variable: the portion of the omitted effect not represented by one of the included explanatory variables must be absorbed by the error term).

What is the difference between pure and impure serial correlation?

Pure serial correlation occurs when Assumption IV is violated in a correctly specified equation. Impure serial correlation is caused by a specification error (e.g., an omitted variable).

What are the consequences of impure serial correlation?

Because impure serial correlation stems from a specification error such as an omitted variable, it can cause bias in the coefficient estimates (pure serial correlation does not). Serial correlation also causes OLS to no longer be the minimum-variance estimator, and it causes the OLS estimates of the SE(B)s to be biased, leading to unreliable hypothesis testing.

What are the 3 problems associated with the Linear Probability model?

R^2 is not an accurate measure of overall fit. The predicted Di is not bounded by 0 and 1. The error term is neither homoskedastic nor normally distributed.

What are the two stages to using 2SLS?

1. Run OLS on the reduced-form equations for each of the endogenous variables that appear as explanatory variables in the structural equations of the system. 2. Substitute the fitted Ys from the reduced forms for the Ys that appear on the right side of the structural equations, and then estimate these revised structural equations with OLS.
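
A minimal sketch of the two stages done by hand with synthetic data (statsmodels assumed). Dedicated 2SLS routines also correct the second-stage standard errors, which this by-hand version does not:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 500
    z = rng.normal(0, 1, n)               # predetermined (exogenous) variable
    e = rng.normal(0, 1, n)
    y2 = 1.0 + 2.0 * z + e                # endogenous regressor, correlated with e
    y1 = 0.5 + 1.5 * y2 + e               # structural equation of interest

    # Stage 1: OLS on the reduced form of the right-side endogenous variable.
    stage1 = sm.OLS(y2, sm.add_constant(z)).fit()
    y2_hat = stage1.fittedvalues

    # Stage 2: substitute the fitted values into the structural equation.
    stage2 = sm.OLS(y1, sm.add_constant(y2_hat)).fit()
    print(stage2.params)   # close to (0.5, 1.5); plain OLS of y1 on y2 is biased upward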

What are HC corrected SE's?

SE(B)s that have been calculated specifically to avoid the consequences of heteroskedasticity (they are almost identical to Newey-West SEs). They are typically larger than OLS standard errors, which lowers the t-scores and decreases the probability that an estimated coefficient will be significantly different from 0. The method works best in large samples and is best avoided in small ones.
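
A hedged sketch (synthetic heteroskedastic data): statsmodels reports HC-corrected standard errors when cov_type is set to an HC variant (HC1 here); the coefficient estimates themselves do not change:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)
    x = rng.uniform(1, 10, 200)
    y = 3.0 + 0.8 * x + rng.normal(0, 1, 200) * x    # error variance grows with x

    X = sm.add_constant(x)
    ols = sm.OLS(y, X).fit()
    robust = sm.OLS(y, X).fit(cov_type="HC1")
    print(ols.bse)      # conventional OLS standard errors
    print(robust.bse)   # HC-corrected standard errors (typically larger here)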

What are Newey West Standard Errors and how do they provide remedy to Serial Correlation consequences?

SE(B)s that take account of serial correlation without changing the Bs themselves in any way. The SE estimates are still biased, but they are more accurate than the uncorrected OLS SEs. Newey-West SE(B)s are typically larger than OLS-estimated SE(B)s, thus producing lower t-scores and decreasing the probability that a given estimated coefficient will be significantly different from 0.
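
A hedged sketch (synthetic serially correlated data): Newey-West (HAC) standard errors in statsmodels; the coefficient estimates are identical to OLS and only the SE(B)s change:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    T = 100
    x = rng.normal(0, 1, T)
    e = np.zeros(T)
    for t in range(1, T):                      # serially correlated errors
        e[t] = 0.7 * e[t - 1] + rng.normal(0, 1)
    y = 2.0 + 1.5 * x + e

    X = sm.add_constant(x)
    ols = sm.OLS(y, X).fit()
    nw = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
    print(ols.params, nw.params)   # identical coefficient estimates
    print(ols.bse, nw.bse)         # Newey-West SEs are typically larger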

How do we determine if we reject or fail to reject the Ho based on the Breusch-Pagan output?

The test statistic is N*R^2, which follows a chi-square distribution with degrees of freedom equal to the number of slope coefficients in the auxiliary regression. If N*R^2 >= the critical chi-square value, reject the Ho of homoskedasticity.

What does a higher or lower Z indicate?

The higher the value of Z, the higher the variance of the distribution of the ith observation of the error term

What characteristics of time-series data make them more difficult than cross-sectional?

The order of observations in time-series data is fixed (chronological). Time-series samples tend to be much smaller than cross-sectional samples. The theory underlying time-series analysis is more complex. The stochastic error term in time-series is often affected by events that took place in a previous time period.

What are impact multipliers and what do they measure?

The reduced form coefficients that measure the impact on the endogenous variable due to a one unit increase in the value of the predetermined variable.

What is the inconclusive region with the Durbin Watson test?

The range of d values, dL <= d <= dU, for which the test gives no verdict: we can neither reject nor fail to reject Ho.

What are the assumptions that must be met to use the Durbin-Watson test?

1. The regression model includes an intercept term. 2. The serial correlation is first-order in nature. 3. The regression model does not include a lagged dependent variable as an independent variable.

What does the VIF test and how?

The variance inflation factor (VIF) is a method of detecting the severity of multicollinearity by looking at the extent to which a given explanatory variable can be explained by all the other explanatory variables in the equation. It is an index of how much multicollinearity has increased the variance of an estimated coefficient. 1. Run an OLS regression that has Xi as a function of all the other explanatory variables in the equation. 2. Calculate the VIF for Bi: VIF(Bi) = 1/(1 - R^2i), where R^2i is the R^2 of that auxiliary regression. The rule of thumb is that if VIF(Bi) > 5, multicollinearity is severe.
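
A minimal sketch (synthetic, nearly collinear data, statsmodels assumed): variance_inflation_factor runs the auxiliary regression of one X on all the others internally:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(10)
    x1 = rng.normal(0, 1, 200)
    x2 = x1 + rng.normal(0, 0.1, 200)          # nearly collinear with x1
    x3 = rng.normal(0, 1, 200)

    X = sm.add_constant(np.column_stack([x1, x2, x3]))
    for i in range(1, X.shape[1]):             # skip the constant column
        print(i, variance_inflation_factor(X, i))   # VIF > 5 suggests severe multicollinearity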

What are potential remedies to Heteroskedasticity?

Think through the specification: are there any omitted variables? Use heteroskedasticity-corrected (HC) standard errors. Redefine the variables (rethink the underlying theory of the regression).

What is the concept of time-series?

Time-series involves a single entity over multiple points in time

What assumption does Serial Correlation violate?

Violates classical assumption IV that different observations of the error term are uncorrelated with each other

What classical assumption do Simultaneous Equations violate and how?

They violate Assumption III, which requires the error term and each explanatory variable to be uncorrelated. If there is such a correlation, the OLS estimation procedure is likely to attribute to the explanatory variable variations in the dependent variable that are actually caused by variations in the error term. This causes biased estimates.

When do we reject Ho given White test output and what does rejecting Ho imply?

When N*R^2 > the chi-square critical value, we reject the Ho of homoskedasticity; rejecting Ho implies that heteroskedasticity is present in the error term.

What is Serial Correlation?

When different observations of the error term are correlated with each other.

What are multinomial models?

When there are more than 2 qualitative choices available.

What are endogenous and exogenous variables?

The Ys are endogenous: they are jointly determined within the simultaneous system. The Xs are exogenous: their values are determined outside the system.

What is an example of Z?

Z could be the population for large or small states (California vs Rhode Island, A 10% change in spending for California results in a lot more money than a 10% change for Rhode Island)

