Chapter 6 - Section 4 - Smartbook - Assumptions of Linear Regression Assignment


Standard t or F tests are not valid as they are based on these estimated standard errors.

In the presence of changing variability, the estimated standard errors of the OLS estimators are inappropriate. What does this imply about using standard testing?
Multiple choice question.
- Use standard t or F tests
- Standard t or F tests are not valid as they are based on these estimated standard errors.
- We should use standard t tests only
- We should use F tests only

All of the answers are correct

In the presence of correlated observations, the OLS estimators are unbiased, but their estimated standard errors are inappropriate. Which of the following could happen as a result?
Multiple choice question.
- All of the answers are correct
- The t test may suggest that the predictor variables are individually and jointly significant when this is not true
- The model looks better than it really is with a spuriously high R2
- The F test may suggest that the predictor variables are individually and jointly significant when this is not true

- Drop one of the collinear variables
- Obtain more data because the sample correlation may get weaker

Select all that apply: What is a good solution when confronted with multicollinearity?
Multiple select question.
- Obtain more data because a bigger sample is always better
- Add another variable
- Drop one of the collinear variables
- Obtain more data because the sample correlation may get weaker

True

True or false: Linearity is justified if the residuals are randomly dispersed across the values of a predictor variable.

When positive and negative residuals alternate in runs, staying positive for a few periods and then negative for a few periods.

We can plot the residuals sequentially over time to look for correlated observations. How are violations indicated?
Multiple choice question.
- There is no detection method
- When all the residuals are negative
- When positive residuals are shown consistently over time and negative residuals are shown consistently over time
- When positive and negative residuals alternate in runs, staying positive for a few periods and then negative for a few periods.

The residuals should show no pattern around the horizontal axis.

We can plot the residuals sequentially over time to look for correlated observations. If there is no violation, then what would you see?
Multiple choice question.
- The residuals should show no pattern around the horizontal axis.
- The residuals should show a normal pattern around the horizontal axis.
- The residuals should show no pattern around the vertical axis.
- The residuals should show a normal pattern around the vertical axis.
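Here is a minimal sketch of this kind of sequential residual plot, assuming Python with numpy, statsmodels, and matplotlib; the data are simulated and every variable name is illustrative rather than taken from the text.

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

# Simulated, time-ordered data purely for illustration.
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(x)).fit()

# Plot the residuals in observation (time) order; runs of positive residuals
# followed by runs of negative residuals would suggest correlated observations,
# while a patternless scatter around zero suggests no violation.
plt.plot(res.resid, marker="o", linestyle="-")
plt.axhline(0, color="gray")
plt.xlabel("Observation (time order)")
plt.ylabel("Residual")
plt.title("Residuals plotted sequentially over time")
plt.show()
```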

Decreases

We can use residual plots to gauge changing variability. The residuals are generally plotted against each predictor variable xj. There is a violation if the variability increases or '' over the values of xj.

The residuals are randomly dispersed across the values of xj

We can use residual plots to gauge changing variability. The residuals are generally plotted against each predictor variable xj. Which of the following indicates there is no violation?
Multiple choice question.
- The residuals are randomly dispersed across the values of xj
- The residuals are NOT randomly dispersed across the values of xj
- There is no way to indicate no violation
- The predictor variable is randomly dispersed across the residuals
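A minimal sketch of the residual-versus-predictor plot, again assuming Python with numpy, statsmodels, and matplotlib; the simulated errors are deliberately made to spread out with x so the plot shows what a changing-variability violation looks like.

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

# Simulated data in which the error spread grows with x on purpose.
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3 * x)

res = sm.OLS(y, sm.add_constant(x)).fit()

# Residuals against the predictor: random dispersion around zero indicates no
# violation, while a fanning-out (or narrowing) pattern indicates a violation.
plt.scatter(x, res.resid)
plt.axhline(0, color="gray")
plt.xlabel("xj")
plt.ylabel("Residual")
plt.title("Residuals against a predictor variable")
plt.show()
```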

Perfect multicollinearity

What is the condition called when two or more predictor variables have an exact linear relationship?
Multiple choice question.
- Nonzero slope coefficient
- Nonlinear violation
- Perfect multicollinearity
- Model inadequacies
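The consequence of an exact linear relationship can be seen directly in the design matrix. A minimal sketch, assuming Python with numpy; the predictors are simulated for illustration.

```python
import numpy as np

# Two predictors with an exact linear relationship (x2 = 2 * x1): the design
# matrix is rank-deficient, so OLS cannot separate their individual effects.
rng = np.random.default_rng(2)
x1 = rng.normal(size=50)
x2 = 2.0 * x1
X = np.column_stack([np.ones(50), x1, x2])

print(np.linalg.matrix_rank(X))  # prints 2 rather than 3: perfect multicollinearity
```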

Nothing

When confronted with multicollinearity, the best approach may be to do '' if the estimated model yields a high R2.

Residual

'' plots are used to detect some of the common violations of the regression model assumptions. These graphical plots are easy to use and provide informal analysis of the estimated regression models.

When important predictor variables are excluded.

A crucial assumption in a linear regression model is that the error term is not correlated with the predictor variables. In general, when does this assumption break down?
Multiple choice question.
- When important predictor variables are excluded.
- The estimated standard errors of the OLS estimators are inappropriate
- When there are too many variables in the model
- When the standard errors are distorted downward

Use the adjusted R2 criterion to reduce the list

An important first step before running a regression model is to compile a comprehensive list of potential predictor variables. How can we reduce the list to a smaller list of predictor variables?
Multiple choice question.
- We use R to make the necessary correction
- We must include all relevant variables
- Use the adjusted R2 criterion to reduce the list
- The best approach may be to do nothing
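A minimal sketch of the adjusted R2 comparison, assuming Python with pandas and statsmodels; the data frame and the candidate predictors x1, x2, x3 are simulated stand-ins for your own variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data purely for illustration.
rng = np.random.default_rng(3)
n = 100
df = pd.DataFrame({"x1": rng.normal(size=n),
                   "x2": rng.normal(size=n),
                   "x3": rng.normal(size=n)})
df["y"] = 1 + 2 * df["x1"] + 0.5 * df["x2"] + rng.normal(size=n)

# Fit a few candidate specifications and compare adjusted R2; the criterion
# favors the specification with the highest adjusted R2.
for formula in ["y ~ x1", "y ~ x1 + x2", "y ~ x1 + x2 + x3"]:
    fit = smf.ols(formula, data=df).fit()
    print(formula, round(fit.rsquared_adj, 3))
```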

Correlation

If one or more of the relevant predictor variables are excluded, then the resulting OLS estimators are biased. The extent of the bias depends on the degree of the '' between the included and the excluded predictor variables.
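A minimal sketch of this omitted-variable bias, assuming Python with numpy and statsmodels; the data, coefficients, and the 0.8 correlation between the included and excluded predictors are all made up for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Simulated example: x2 is correlated with x1, and the true model includes both.
rng = np.random.default_rng(7)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)   # true coefficient on x1 is 2

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
short = sm.OLS(y, sm.add_constant(x1)).fit()    # x2 excluded

print(full.params[1])   # close to the true value 2
print(short.params[1])  # biased toward 2 + 3 * 0.8 = 4.4 because x2 is omitted
```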

Response

If residual plots exhibit strong nonlinear patterns, the inferences made by a linear regression model can be quite misleading. In such instances, we should employ nonlinear regression methods based on simple transformations of the '' and the predictor variables.
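A minimal sketch of one such transformation (a log-log fit), assuming Python with numpy and statsmodels; the power relationship in the simulated data is invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with a clearly nonlinear (power) relationship.
rng = np.random.default_rng(6)
n = 100
x = rng.uniform(1, 10, size=n)
y = 3.0 * x ** 1.5 * np.exp(0.1 * rng.normal(size=n))

# Regress the log of the response on the log of the predictor instead of the
# raw variables; the model is then linear in the transformed variables.
res = sm.OLS(np.log(y), sm.add_constant(np.log(x))).fit()
print(res.params)  # intercept near log(3) ~ 1.1, slope near 1.5
```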

unbiased

In the presence of changing variability, the OLS estimators are '', but their estimated standard errors are inappropriate.

- Conditional on x1, x2, ..., xk, the error term ɛ is uncorrelated across observations; or, in statistical terminology, there is no serial correlation.
- The regression model given by y = β0 + β1x1 + β2x2 + ... + βkxk + ɛ is linear in the parameters β0, β1, ..., βk.

Select all that apply: Which of the following are the assumptions that underlie the classical linear regression model?
Multiple select question.
- The error term ɛ is correlated with any of the predictor variables x1, x2, ..., xk
- Conditional on x1, x2, ..., xk, the error term ɛ is uncorrelated across observations; or, in statistical terminology, there is no serial correlation.
- There is an exact linear relationship among the predictor variables; or, in statistical terminology, there is no perfect multicollinearity.
- The regression model given by y = β0 + β1x1 + β2x2 + ... + βkxk + ɛ is linear in the parameters β0, β1, ..., βk.

Changing variability

The assumption of constant variability of observations often breaks down in studies with cross-sectional data. Consider the model y = β0 + β1x + ɛ, where y is a household's consumption expenditure and x is its disposable income. It may be unreasonable to assume that the variability of consumption is the same across a cross-section of household incomes. This violation is called:
Multiple choice question.
- Changing variability
- Correlated Observations
- Multicollinearity
- Nonlinear Patterns

High R2 and significant F statistic coupled with insignificant predictor variables

The detection methods for multicollinearity are mostly informal. Which of the following indicate a potential multicollinearity issue?
Multiple choice question.
- High R2 plus individually insignificant predictor variables
- High R2 and significant F statistic coupled with insignificant predictor variables
- Individually insignificant predictor variables
- Significant F statistic coupled with individually insignificant predictor variables
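A minimal sketch of this symptom, assuming Python with numpy and statsmodels; the near-collinearity between the simulated predictors is deliberate.

```python
import numpy as np
import statsmodels.api as sm

# Simulated example with two nearly collinear predictors.
rng = np.random.default_rng(5)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)
y = 3 + 2 * x1 + 2 * x2 + rng.normal(size=n)

res = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

print(res.rsquared, res.f_pvalue)  # high R2 and a highly significant F test
print(res.pvalues[1:])             # yet the individual t tests can be insignificant
```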

β1

The simple linear regression model y = β0 + β1x + ɛ implies that if x goes up by one unit, we expect y to change by how much (irrespective of the value of x)?
Multiple choice question.
- β0
- β1
- x
- ɛ
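A minimal sketch of this interpretation, assuming Python with numpy and statsmodels; the intercept and slope values are simulated for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Simulated simple regression; the true slope is 2.
rng = np.random.default_rng(8)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100)

res = sm.OLS(y, sm.add_constant(x)).fit()
b0, b1 = res.params

# A one-unit increase in x shifts the fitted value by exactly b1,
# regardless of the starting value of x.
print(b1)
print((b0 + b1 * 5.0) - (b0 + b1 * 4.0))  # equals b1
```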

One

The variance inflation factor (VIF) is another measure that can detect a high correlation between three or more predictor variables even if no pair of predictor variables has a particularly high correlation. What is the smallest possible value of VIF (indicating the absence of multicollinearity)?
Multiple choice question.
- VIF does not exceed 5 or 10
- VIF exceeds 5 or 10
- Zero
- One
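A minimal sketch of a VIF computation, assuming Python with numpy and statsmodels (its variance_inflation_factor helper); the predictors are simulated, with x2 made nearly collinear with x1 on purpose.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Simulated predictors in which x2 is nearly collinear with x1.
rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)
x3 = rng.normal(size=n)
X = sm.add_constant(np.column_stack([x1, x2, x3]))

# A VIF of 1 means a predictor is uncorrelated with the other predictors;
# values above roughly 5 or 10 are commonly taken to signal multicollinearity.
for j in range(1, X.shape[1]):  # skip the constant column
    print(f"VIF for predictor {j}: {variance_inflation_factor(X, j):.1f}")
```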

