Econometrics Quiz 2

Heteroskedasticity means that A) homogeneity cannot be assumed automatically for the model. B) the variance of the error term is not constant. C) the observed units have different preferences. D) agents are not all rational.

b

In the model ln(Yi) = β0 + β1Xi + ui, the elasticity of E(Y|X) with respect to X is A) β1X B) β1 C) D) Cannot be calculated because the function is non-linear

a
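A quick numerical check of this answer (a sketch with illustrative values for β0, β1, and X — none of them come from the question): in the log-linear model, E(Y|X) is proportional to exp(β1X), so the elasticity of E(Y|X) with respect to X works out to β1X.

```python
import math

# Log-linear model ln(Y) = b0 + b1*X, so E(Y|X) is proportional to exp(b0 + b1*X).
# Elasticity = (dE(Y)/dX) * X / E(Y); analytically this equals b1*X.
b0, b1, X = 1.0, 0.05, 40.0   # illustrative values only
h = 1e-6                      # step for a finite-difference derivative

EY = lambda x: math.exp(b0 + b1 * x)
elasticity = (EY(X + h) - EY(X)) / h * X / EY(X)

print(round(elasticity, 4))   # numerically matches b1*X = 2.0
```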

All of the following are examples of joint hypotheses on multiple regression coefficients, with the exception of A) H0 : β1 + β2 = 1 B) H0 : = β1 and β4 = 0 C) H0 : β2 = 0 and β3 = 0 D) H0 : β1 = -β2 and β1 + β2 = 1

a

At a mathematical level, if the two conditions for omitted variable bias are satisfied, then A) E(ui|X1i, X2i,..., Xki) ≠ 0. B) there is perfect multicollinearity. C) large outliers are likely: X1i, X2i,..., Xki and Yi have infinite fourth moments. D) (X1i, X2i,..., Xki, Yi), i = 1,..., n are not i.i.d. draws from their joint distribution.

a

Consider a regression with two variables, in which X1i is the variable of interest and X2i is the control variable. Conditional mean independence requires A) E(ui|X1i, X2i) = E(ui|X2i) B) E(ui|X1i, X2i) = E(ui|X1i) C) E(ui|X1i) = E(ui|X2i) D) E(ui) = E(ui|X2i)

a

Consider the estimated equation from your textbook, TestScore = 698.9 - 2.28STR, R2 = 0.051, SER = 18.6, with standard errors (10.4) and (0.52). The t-statistic for the slope is approximately A) 4.38 B) 67.20 C) 0.52 D) 1.76

a
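The arithmetic behind this answer, using the numbers quoted in the question (slope -2.28, standard error 0.52):

```python
# t-statistic for H0: slope = 0 is the estimate divided by its standard error
slope, se = -2.28, 0.52
t = slope / se
print(round(abs(t), 2))  # → 4.38
```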

Consider the following least squares specification between test scores and income: TestScore = 557.8 + 36.42 ln(Income). According to this equation, a 1% increase in income is associated with an increase in test scores of A) 0.36 points B) 36.42 points C) 557.8 points D) cannot be determined from the information given here

a
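A one-line check of the linear-log interpretation, using the slope from the question:

```python
# In Y = b0 + b1*ln(X), a 1% increase in X raises Y by roughly 0.01*b1
b1 = 36.42
effect = 0.01 * b1
print(round(effect, 2))  # → 0.36 points
```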

Consider the multiple regression model with two regressors X1 and X2, where both variables are determinants of the dependent variable. When omitting X2 from the regression, there will be omitted variable bias A) if X1 and X2 are correlated B) always C) if X2 is measured in percentages D) if X2 is a dummy variable

a

Consider the population regression of log earnings [Yi, where Yi = ln(Earningsi)] against two binary variables: whether a worker is married (D1i, where D1i=1 if the ith person is married) and the worker's gender (D2i, where D2i=1 if the ith person is female), and the product of the two binary variables Yi = β0 + β1D1i + β2D2i + β3(D1i×D2i) + ui. The interaction term A) allows the population effect on log earnings of being married to depend on gender B) does not make sense since it could be zero for married males C) indicates the effect of being married on log earnings D) cannot be estimated without the presence of a continuous variable

a

If the absolute value of your calculated t-statistic exceeds the critical value from the standard normal distribution, you can A) reject the null hypothesis. B) safely assume that your regression results are significant. C) reject the assumption that the error terms are homoskedastic. D) conclude that most of the actual values are very close to the regression line.

a

Imperfect multicollinearity A) implies that it will be difficult to estimate precisely one or more of the partial effects using the data at hand B) violates one of the four Least Squares assumptions in the multiple regression model C) means that you cannot estimate the effect of at least one of the Xs on Y D) suggests that a standard spreadsheet program does not have enough power to estimate the multiple regression model

a

In multiple regression, the R2 increases whenever a regressor is A) added unless the coefficient on the added regressor is exactly zero. B) added. C) added unless there is heteroskedasticity. D) greater than 1.96 in absolute value.

a

In the model Yi = β0 + β1X1 + β2X2 + β3(X1 × X2) + ui, the expected effect on Y of a change in X1, ΔY/ΔX1, is A) β1 + β3X2. B) β1. C) β1 + β3. D) β1 + β3X1.

a

The confidence interval for the sample regression function slope A) can be used to conduct a test about a hypothesized population regression function slope. B) can be used to compare the value of the slope relative to that of the intercept. C) adds and subtracts 1.96 from the slope. D) allows you to make statements about the economic importance of your estimate.

a

The exponential function A) is the inverse of the natural logarithm function. B) does not play an important role in modeling nonlinear regression functions in econometrics. C) can be written as exp(e^x). D) is e^x, where e is 3.1415....

a

The following linear hypothesis can be tested using the F-test with the exception of A) β2 = 1 and β3= β4/β5. B) β2 =0. C) β1 + β2 = 1 and β3 = -2β4. D) β0 = β1 and β1 = 0.

a

The interpretation of the slope coefficient in the model ln(Yi) = β0 + β1 ln(Xi)+ ui is as follows: A) a 1% change in X is associated with a β1 % change in Y. B) a change in X by one unit is associated with a β1 change in Y. C) a change in X by one unit is associated with a 100 β1 % change in Y. D) a 1% change in X is associated with a change in Y of 0.01 β1.

a

The overall regression F-statistic tests the null hypothesis that A) all slope coefficients are zero. B) all slope coefficients and the intercept are zero. C) the intercept in the regression and at least one, but not all, of the slope coefficients is zero. D) the slope coefficient of the variable of interest is zero, but that the other slope coefficients are not.

a

You extract approximately 5,000 observations from the Current Population Survey (CPS) and estimate the following regression function: ahe = 3.32 + 0.45Age, R2 = 0.02, SER = 8.66, with standard errors (1.00) and (0.04), where ahe is average hourly earnings and Age is the individual's age. Given the specification, your 95% confidence interval for the effect of changing age by 5 years is approximately A) [$1.96, $2.54] B) [$2.32, $4.32] C) [$1.35, $5.30] D) cannot be determined given the information provided

a

In the log-log model, the slope coefficient indicates A) the effect that a unit change in X has on Y. B) the elasticity of Y with respect to X. C) ΔY / ΔX. D) × .

b

If the errors are heteroskedastic, then A) OLS is BLUE. B) WLS is BLUE if the conditional variance of the errors is known up to a constant factor of proportionality. C) LAD is BLUE if the conditional variance of the errors is known up to a constant factor of proportionality. D) OLS is efficient.

b

In the multiple regression model, the adjusted R2 A) cannot be negative. B) will never be greater than the regression R2. C) equals the square of the correlation coefficient r. D) cannot decrease when an additional explanatory variable is added.

b

In the regression model Yi = β0 + β1Xi + β2Di + β3(Xi × Di) + ui, where X is a continuous variable and D is a binary variable, β2 A) is the difference in means in Y between the two categories. B) indicates the difference in the intercepts of the two regressions. C) is usually positive. D) indicates the difference in the slopes of the two regressions.

b

The critical value of F4,∞ at the 5% significance level is A) 3.84 B) 2.37 C) 1.94 D) Cannot be calculated because in practice you will not have an infinite number of observations

b
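This critical value can be verified without statistical tables. As the denominator degrees of freedom go to infinity, F(q, ∞) equals χ²_q / q, and for the even case df = 4 the chi-square survival function has a closed form, so plain bisection suffices (a sketch using only the standard library):

```python
import math

# For chi-square with 4 degrees of freedom, P(X > x) = exp(-x/2) * (1 + x/2).
def chi2_sf_df4(x):
    return math.exp(-x / 2) * (1 + x / 2)

# Bisect for the 5% critical value of the chi-square, then divide by q = 4.
lo, hi = 0.0, 50.0
for _ in range(100):
    mid = (lo + hi) / 2
    if chi2_sf_df4(mid) > 0.05:
        lo = mid        # survival probability still too large: move right
    else:
        hi = mid

crit = lo / 4
print(round(crit, 2))  # → 2.37
```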

The general answer to the question of choosing the scale of the variables is A) dependent on your whim. B) to make the regression results easy to read and to interpret. C) to ensure that the regression coefficients always lie between -1 and 1. D) irrelevant because regardless of the scale of the variable, the regression coefficient is unaffected.

b

The homoskedasticity-only F-statistic and the heteroskedasticity-robust F-statistic typically are A) the same B) different C) related by a linear function D) a multiple of each other (the heteroskedasticity-robust F-statistic is 1.96 times the homoskedasticity-only F-statistic)

b

The interpretation of the slope coefficient in the model Yi = β0 + β1 ln(Xi) + ui is as follows: A) a 1% change in X is associated with a β1 % change in Y. B) a 1% change in X is associated with a change in Y of 0.01 β1. C) a change in X by one unit is associated with a β1 100% change in Y. D) a change in X by one unit is associated with a β1 change in Y.

b

The interpretation of the slope coefficient in the model ln(Yi) = β0 + β1Xi + ui is as follows: A) a 1% change in X is associated with a β1 % change in Y. B) a change in X by one unit is associated with a 100 β1 % change in Y. C) a 1% change in X is associated with a change in Y of 0.01 β1. D) a change in X by one unit is associated with a β1 change in Y.

b

The proof that OLS is BLUE requires all of the following assumptions with the exception of: A) the errors are homoskedastic. B) the errors are normally distributed. C) E(ui|Xi) = 0. D) large outliers are unlikely.

b

Under imperfect multicollinearity A) the OLS estimator cannot be computed. B) two or more of the regressors are highly correlated. C) the OLS estimator is biased even in samples of n > 100. D) the error terms are highly, but not perfectly, correlated.

b

Using 143 observations, assume that you had estimated a simple regression function and that your estimate for the slope was 0.04, with a standard error of 0.01. You want to test whether or not the estimate is statistically significant. Which of the following possible decisions is the only correct one: A) you decide that the coefficient is small and hence most likely is zero in the population B) the slope is statistically significant since it is four standard errors away from zero C) the response of Y given a change in X must be economically important since it is statistically significant D) since the slope is very small, so must be the regression R2.

b
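The check behind this answer:

```python
# Four standard errors from zero: t = estimate / SE
slope, se = 0.04, 0.01
t = slope / se
print(round(t, 2))  # → 4.0, well beyond the 1.96 two-sided critical value
```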

When estimating a demand function for a good where quantity demanded is a linear function of the price, you should A) not include an intercept because the price of the good is never zero. B) use a one-sided alternative hypothesis to check the influence of price on quantity. C) use a two-sided alternative hypothesis to check the influence of price on quantity. D) reject the idea that price determines demand unless the coefficient is at least 1.96.

b

With heteroskedastic errors, the weighted least squares estimator is BLUE. You should use OLS with heteroskedasticity-robust standard errors because A) this method is simpler. B) the exact form of the conditional variance is rarely known. C) the Gauss-Markov theorem holds. D) your spreadsheet program does not have a command for weighted least squares.

b
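A small Monte Carlo sketch of the efficiency claim (the data-generating process here, with error standard deviation proportional to X, is an illustrative assumption, not part of the question): when the conditional variance is known up to proportionality, weighting each observation by the inverse of the error standard deviation gives WLS a smaller sampling variance than OLS.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, n, reps = 1.0, 2.0, 100, 2000
ols_slopes, wls_slopes = [], []

for _ in range(reps):
    X = rng.uniform(0.5, 3.0, n)
    u = rng.normal(0.0, 1.0, n) * X        # heteroskedastic: sd(u|X) proportional to X
    Y = beta0 + beta1 * X + u

    A = np.column_stack([np.ones(n), X])   # OLS design matrix [1, X]
    ols_slopes.append(np.linalg.lstsq(A, Y, rcond=None)[0][1])

    # WLS: divide every column (constant included) and Y by X
    wls_slopes.append(np.linalg.lstsq(A / X[:, None], Y / X, rcond=None)[0][1])

# Both estimators are unbiased, but the WLS slope varies less across samples
print(np.var(ols_slopes) > np.var(wls_slopes))
```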

All of the following are true, with the exception of one condition: A) a high R2 or adjusted R2 does not mean that the regressors are a true cause of the dependent variable. B) a high R2 or adjusted R2 does not mean that there is no omitted variable bias. C) a high R2 or adjusted R2 always means that an added variable is statistically significant. D) a high R2 or adjusted R2 does not necessarily mean that you have the most appropriate set of regressors.

c

Consider the following multiple regression models (a) to (d) below. DFemme = 1 if the individual is a female, and is zero otherwise; DMale is a binary variable which takes on the value one if the individual is male, and is zero otherwise; DMarried is a binary variable which is unity for married individuals and is zero otherwise, and DSingle is (1-DMarried). Regressing weekly earnings (Earn) on a set of explanatory variables, you will experience perfect multicollinearity in the following cases unless: A) Earni = β0 + β1DFemme + β2DMale + β3X3i B) Earni = β0 + β1DMarried + β2DSingle + β3X3i C) Earni = β0 + β1DFemme + β2X3i D) Earni = β1DFemme + β2DMale + β3DMarried + β4DSingle + β5X3i

c

Consider the multiple regression model with two regressors X1 and X2, where both variables are determinants of the dependent variable. You first regress Y on X1 only and find no relationship. However when regressing Y on X1 and X2, the slope coefficient changes by a large amount. This suggests that your first regression suffers from A) heteroskedasticity B) perfect multicollinearity C) omitted variable bias D) dummy variable trap

c
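The mechanics can be seen in a short simulation (all numbers here are illustrative assumptions): X2 drives Y and is correlated with X1, so the short regression of Y on X1 alone inherits part of X2's effect — the bias equals β2 × cov(X1, X2)/var(X1) = 3.0 × 0.8 = 2.4.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
X1 = rng.normal(size=n)
X2 = 0.8 * X1 + rng.normal(size=n)                    # correlated with X1
Y = 1.0 + 0.0 * X1 + 3.0 * X2 + rng.normal(size=n)    # true effect of X1 is zero

# Short regression (X2 omitted): slope is biased toward 3.0 * 0.8 = 2.4
short_slope = np.cov(X1, Y)[0, 1] / np.var(X1, ddof=1)
print(round(short_slope, 1))

# Long regression including X2 recovers the true coefficient on X1 (about 0)
long_slope = np.linalg.lstsq(np.column_stack([np.ones(n), X1, X2]), Y, rcond=None)[0][1]
print(round(long_slope, 2))
```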

If the estimates of the coefficients of interest change substantially across specifications, A) then this can be expected from sample variation. B) then you should change the scale of the variables to make the changes appear to be smaller. C) then this often provides evidence that the original specification had omitted variable bias. D) then choose the specification for which your coefficient of interest is most significant.

c

If you had a two regressor regression model, then omitting one variable which is relevant A) will have no effect on the coefficient of the included variable if the correlation between the excluded and the included variable is negative. B) will always bias the coefficient of the included variable upwards. C) can result in a negative value for the coefficient of the included variable, even though the coefficient will have a significant positive effect on Y if the omitted variable were included. D) makes the sum of the product between the included variable and the residuals different from 0.

c

Imagine you regressed earnings of individuals on a constant, a binary variable ("Male") which takes on the value 1 for males and is 0 otherwise, and another binary variable ("Female") which takes on the value 1 for females and is 0 otherwise. Because females typically earn less than males, you would expect A) the coefficient for Male to have a positive sign, and for Female a negative sign. B) both coefficients to be the same distance from the constant, one above and the other below. C) none of the OLS estimators to exist because there is perfect multicollinearity. D) this to yield a difference in means statistic.

c

In a two regressor regression model, if you exclude one of the relevant variables then A) it is no longer reasonable to assume that the errors are homoskedastic. B) OLS is no longer unbiased, but still consistent. C) you are no longer controlling for the influence of the other variable. D) the OLS estimator no longer exists.

c

In nonlinear models, the expected change in the dependent variable for a change in one of the explanatory variables is given by A) ΔY = f(X1 + X1, X2,..., Xk). B) ΔY = f(X1 + ΔX1, X2 + ΔX2,..., Xk + ΔXk) - f(X1, X2,..., Xk). C) ΔY = f(X1 + ΔX1, X2,..., Xk) - f(X1, X2,..., Xk). D) ΔY = f(X1 + X1, X2,..., Xk) - f(X1, X2,..., Xk).

c

In the presence of heteroskedasticity, and assuming that the usual least squares assumptions hold, the OLS estimator is A) efficient. B) BLUE. C) unbiased and consistent. D) unbiased but not consistent.

c

Including an interaction term between two independent variables, X1 and X2, allows for the following except: A) the interaction term lets the effect on Y of a change in X1 depend on the value of X2. B) the interaction term coefficient is the effect of a unit increase in X1 and X2 above and beyond the sum of the individual effects of a unit increase in the two variables alone. C) the interaction term coefficient is the effect of a unit increase in . D) the interaction term lets the effect on Y of a change in X2 depend on the value of X1.

c

The confidence interval for a single coefficient in a multiple regression A) makes little sense because the population parameter is unknown. B) should not be computed because there are other coefficients present in the model. C) contains information from a large number of hypothesis tests. D) should only be calculated if the regression R2 is identical to the adjusted R2.

c

The dummy variable trap is an example of A) imperfect multicollinearity B) something that is of theoretical interest only C) perfect multicollinearity D) something that does not happen to university or college students

c

The following interactions between binary and continuous variables are possible, with the exception of A) Yi = β0 + β1Xi + β2Di + β3(Xi × Di) + ui. B) Yi = β0 + β1Xi + β2(Xi × Di) + ui. C) Yi = (β0 + Di) + β1Xi + ui. D) Yi = β0 + β1Xi + β2Di + ui.

c

The formula for the standard error of the regression coefficient, when moving from one explanatory variable to two explanatory variables, A) stays the same. B) changes, unless the second explanatory variable is a binary variable. C) changes. D) changes, unless you test for a null hypothesis that the additional regression coefficient is zero.

c

The t-statistic is calculated by dividing A) the OLS estimator by its standard error. B) the slope by the standard deviation of the explanatory variable. C) the estimator minus its hypothesized value by the standard error of the estimator. D) the slope by 1.96.

c

Using the textbook example of 420 California school districts and the regression of testscores on the student-teacher ratio, you find that the standard error on the slope coefficient is 0.51 when using the heteroskedasticity robust formula, while it is 0.48 when employing the homoskedasticity only formula. When calculating the t-statistic, the recommended procedure is to A) use the homoskedasticity only formula because the t-statistic becomes larger B) first test for homoskedasticity of the errors and then make a decision C) use the heteroskedasticity robust formula D) make a decision depending on how much different the estimate of the slope is under the two procedures

c

You have collected data for the 50 U.S. states and estimated the following relationship between the change in the unemployment rate from the previous year and the growth rate of the respective state real GDP (gy). The results are as follows: Δur = 2.81 - 0.23gy, R2 = 0.36, SER = 0.78, with standard errors (0.12) and (0.04). Assuming that the estimator has a normal distribution, the 95% confidence interval for the slope is approximately the interval A) [2.57, 3.05] B) [-0.31, 0.15] C) [-0.31, -0.15] D) [-0.33, -0.13]

c
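The interval, computed from the numbers in the question:

```python
# 95% confidence interval: estimate ± 1.96 standard errors
slope, se = -0.23, 0.04
ci = (round(slope - 1.96 * se, 2), round(slope + 1.96 * se, 2))
print(ci)  # → (-0.31, -0.15)
```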

You have estimated the relationship between testscores and the student-teacher ratio under the assumption of homoskedasticity of the error terms. The regression output is as follows: TestScore = 698.9 - 2.28 × STR, and the standard error on the slope is 0.48. The homoskedasticity-only "overall" regression F-statistic for the hypothesis that the regression R2 is zero is approximately A) 0.96 B) 1.96 C) 22.56 D) 4.75

c
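With a single regressor, the overall F-statistic is just the squared t-statistic, so the value follows directly from the numbers quoted:

```python
# Overall F-statistic with one slope coefficient: F = t^2 = (slope / SE)^2
slope, se = -2.28, 0.48
F = (slope / se) ** 2
print(round(F, 2))  # → 22.56
```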

You have to worry about perfect multicollinearity in the multiple regression model because A) many economic variables are perfectly correlated. B) the OLS estimator is no longer BLUE. C) the OLS estimator cannot be computed in this situation. D) in real life, economic variables change together all the time.

c

(Requires Calculus) In the multiple regression model you estimate the effect on Yi of a unit change in one of the Xi while holding all other regressors constant. This A) makes little sense, because in the real world all other variables change. B) corresponds to the economic principle of mutatis mutandis. C) leaves the formula for the coefficient in the single explanatory variable case unaffected. D) corresponds to taking a partial derivative in mathematics.

d

An example of the interaction term between two independent, continuous variables is A) Yi = β0 + β1Xi + β2Di + β3(Xi × Di) + ui. B) Yi = β0 + β1X1i + β2X2i + ui. C) Yi = β0 + β1D1i + β2D2i + β3 (D1i × D2i) + ui. D) Yi = β0 + β1X1i + β2X2i + β3(X1i × X2i) + ui.

d

Assume that you had estimated the following quadratic regression model: TestScore = 607.3 + 3.85 Income - 0.0423 Income^2. If income increased from 10 to 11 ($10,000 to $11,000), then the predicted effect on testscores would be A) 3.85 B) 3.85-0.0423 C) Cannot be calculated because the function is non-linear D) 2.96

d

Consider the following regression output where the dependent variable is testscores and the two explanatory variables are the student-teacher ratio and the percent of English learners: TestScore = 698.9 - 1.10×STR - 0.650×PctEL. You are told that the t-statistic on the student-teacher ratio coefficient is 2.56. The standard error therefore is approximately A) 0.25 B) 1.96 C) 0.650 D) 0.43

d
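Rearranging t = coefficient / SE gives the standard error directly:

```python
# Back out the standard error from the reported t-statistic
coef, t = 1.10, 2.56     # absolute values of the STR coefficient and its t-statistic
se = coef / t
print(round(se, 2))  # → 0.43
```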

Consider the polynomial regression model of degree r, Yi = β0 + β1Xi + β2Xi^2 + ... + βrXi^r + ui. The null hypothesis that the regression is linear, against the alternative that it is a polynomial of degree r, corresponds to A) H0: βr = 0 vs. βr ≠ 0 B) H0: βr = 0 vs. β1 ≠ 0 C) H0: β3 = 0, ..., βr = 0, vs. H1: all βj ≠ 0, j = 3, ..., r D) H0: β2 = 0, β3 = 0, ..., βr = 0, vs. H1: at least one βj ≠ 0, j = 2, ..., r

d

For a single restriction (q = 1), the F-statistic A) is the square root of the t-statistic. B) has a critical value of 1.96. C) will be negative. D) is the square of the t-statistic.

d

Imperfect multicollinearity A) is not relevant to the field of economics and business administration B) only occurs in the study of finance C) means that the least squares estimator of the slope is biased D) means that two or more of the regressors are highly correlated

d

In the case of regression with interactions, the coefficient of a binary variable should be interpreted as follows: A) there are really problems in interpreting these, since the ln(0) is not defined. B) for the case of interacted regressors, the binary variable coefficient represents the various intercepts for the case when the binary variable equals one. C) first set all explanatory variables to one, with the exception of the binary variables. Then allow for each of the binary variables to take on the value of one sequentially. The resulting predicted value indicates the effect of the binary variable. D) first compute the expected values of Y for each possible case described by the set of binary variables. Next compare these expected values. Each coefficient can then be expressed either as an expected value or as the difference between two or more expected values.

d

In the regression model Yi = β0 + β1Xi + β2Di + β3(Xi × Di) + ui, where X is a continuous variable and D is a binary variable, to test that the two regressions are identical, you must use the A) t-statistic separately for β2 = 0 and β3 = 0. B) F-statistic for the joint hypothesis that β0 = 0, β1 = 0. C) t-statistic separately for β3 = 0. D) F-statistic for the joint hypothesis that β2 = 0, β3 = 0.

d

The homoskedastic normal regression assumptions are all of the following with the exception of: A) the errors are homoskedastic. B) the errors are normally distributed. C) there are no outliers. D) there are at least 10 observations.

d

Under the least squares assumptions (zero conditional mean for the error term, Xi and Yi being i.i.d., and Xi and ui having finite fourth moments), the OLS estimator for the slope and intercept A) has an exact normal distribution for n > 15. B) is BLUE. C) has a normal distribution even in small samples. D) is unbiased.

d

When testing a joint hypothesis, you should A) use t-statistics for each hypothesis and reject the null hypothesis if all of the restrictions fail. B) use the F-statistic and reject all the hypotheses if the statistic exceeds the critical value. C) use t-statistics for each hypothesis and reject the null hypothesis once the statistic exceeds the critical value for a single hypothesis. D) use the F-statistic and reject at least one of the hypotheses if the statistic exceeds the critical value.

d

When there are omitted variables in the regression, which are determinants of the dependent variable, then A) you cannot measure the effect of the omitted variable, but the estimator of your included variable(s) is (are) unaffected. B) this has no effect on the estimator of your included variable because the other variable is not included. C) this will always bias the OLS estimator of the included variable. D) the OLS estimator is biased if the omitted variable is correlated with the included variable.

d

When you have an omitted variable problem, the assumption that E(ui|Xi) = 0 is violated. This implies that A) the sum of the residuals is no longer zero. B) there is another estimator called weighted least squares, which is BLUE. C) the sum of the residuals times any of the explanatory variables is no longer zero. D) the OLS estimator is no longer consistent.

d

