Chapter 30: Multiple Regression
In a multiple regression model, a test of significance for the regression equation is not necessary if all the independent variables prove significant.
FALSE
In multiple regression, each individual coefficient must have an R-squared of 0.05 or greater. If not, the coefficient is not significant.
FALSE
In multiple regression, the intercept must be tested to determine if it is significant.
FALSE
The concept of multicollinearity refers to the situation in which an independent variable is correlated with the dependent variable.
FALSE
The intercept in a multiple linear regression model must always be positive.
FALSE
The only consequence of multicollinearity is that the intercept cannot be considered significant.
FALSE
The value of R-squared represents the unexplained variation for a regression equation.
FALSE
When a linear equation is appropriate, the distributions of data around the regression line should suggest a normal distribution. The data should be centered on the regression line with an increasing level of variance across the range of the chart.
FALSE
When multicollinearity is present, one of the variables must be eliminated.
FALSE
When residuals from a regression plot suggest a parabolic shape around the zero line of the residuals plot, this suggests that a linear fit to the data will produce the highest R-squared.
FALSE
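The statement is false because a parabolic residual pattern signals curvature that a straight line misses; a model with a squared term will typically fit better. A minimal sketch with simulated data (NumPy; the data and variable names are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = 2 + x**2 + rng.normal(0, 0.5, size=x.size)  # true relationship is quadratic

def r_squared(y, fitted):
    ss_res = np.sum((y - fitted) ** 2)      # unexplained variation
    ss_tot = np.sum((y - y.mean()) ** 2)    # total variation
    return 1 - ss_res / ss_tot

# Linear fit: the residuals trace a parabola around the zero line
r2_lin = r_squared(y, np.polyval(np.polyfit(x, y, 1), x))

# Quadratic fit: the residuals scatter randomly and R-squared rises sharply
r2_quad = r_squared(y, np.polyval(np.polyfit(x, y, 2), x))
```

Here the quadratic model, not the linear one, produces the higher R-squared, which is why the statement is marked false.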
Predictions when using a multiple regression equation can take which of the following forms?
Point estimate, Interval estimate
A multiple regression model that has been shown to suffer from multicollinearity can still be used for prediction purposes.
TRUE
As long as the residuals create a random pattern around the zero line in a residual plot, the linear equation can be considered an appropriate fit to the data.
TRUE
In a regression plot of an independent and dependent variable, a residual represents the difference between the actual value and the forecast result for a specific value of the independent variable.
TRUE
In regression, a two-tailed p-value test is used to determine if the coefficient of each independent variable is significant.
TRUE
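The two-tailed test above can be sketched by hand from the regression output. This is a minimal illustration with simulated data (NumPy and SciPy; the data-generating values are assumptions made for the example):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 60
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3 + 2 * x1 + rng.normal(size=n)       # x2 has no real effect in this example

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta
df = n - X.shape[1]                        # residual degrees of freedom
sigma2 = resid @ resid / df                # residual variance estimate
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

t_stats = beta / se
p_values = 2 * stats.t.sf(np.abs(t_stats), df)  # two-tailed p-values
```

A small p-value (conventionally below 0.05) for a coefficient leads to rejecting the hypothesis that the coefficient is zero, i.e. the variable is judged significant.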
The interpolation forecast represents a forecast using the actual data from which the regression equation was computed.
TRUE
To test for multicollinearity, one variable can be plotted against the other.
TRUE
When adding additional variables to the multiple regression model, the variables that were originally in the equation may change from being significant to non-significant.
TRUE
When an independent variable in a multiple regression model includes a value of X raised to a higher power, such as X squared, and this model produces a higher R-squared than a linear model, this suggests that the residual plot for the linear equation would not produce a random pattern around its zero line.
TRUE
When introducing additional variables to the multiple regression model, the R-squared may not increase.
TRUE
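One way R-squared can fail to increase: the added variable carries no information beyond the variables already in the model, so the fitted values, and hence R-squared, are unchanged. A small simulated sketch (NumPy; data and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
x1 = rng.normal(size=n)
y = 2 + 4 * x1 + rng.normal(size=n)

def ols_r2(X, y):
    # Ordinary least squares R-squared for design matrix X (intercept added here)
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

r2_one = ols_r2(x1.reshape(-1, 1), y)

# x2 is an exact linear function of x1, so it adds no new information
x2 = 2 * x1 + 1
r2_two = ols_r2(np.column_stack([x1, x2]), y)
```

Adding the redundant variable leaves R-squared essentially identical.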
When multicollinearity is present and the problem is traced to two independent variables, the coefficients of these variables cannot be relied upon when determining the effect of a unit change of these variables on the dependent variable.
TRUE
When testing for multicollinearity, a regression can be run in which one of the suspected variables is the dependent variable and the other is the independent variable.
TRUE
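A sketch of this diagnostic with simulated data (NumPy; the variables and noise level are assumptions for illustration): a high R-squared when one suspected independent variable is regressed on the other flags collinearity.

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = 0.5 * x1 + rng.normal(0, 0.05, size=100)  # x2 is nearly a linear function of x1

# Regress one suspected variable on the other and check the R-squared
slope, intercept = np.polyfit(x1, x2, 1)
fitted = slope * x1 + intercept
r2 = 1 - np.sum((x2 - fitted) ** 2) / np.sum((x2 - x2.mean()) ** 2)
```

An R-squared near 1 here indicates the two variables are close proxies for each other, the multicollinearity situation described above.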
When two variables show a high level of correlation to one another, they can be said to be proxies for each other.
TRUE
When the variability of the dependent variable is increasing across the range of the independent variable, this is referred to as ______________.
heteroscedasticity
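Heteroscedasticity can be seen by comparing the spread of the residuals across the range of the independent variable. A minimal simulated sketch (NumPy; values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(1, 10, 200)
y = 5 + 2 * x + rng.normal(0, 0.3 * x)    # noise scale grows with x: heteroscedastic

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

# Residual spread in the lower half of the x range vs. the upper half
low_spread = resid[x < 5.5].std()
high_spread = resid[x >= 5.5].std()
```

A markedly larger spread at high x than at low x is the fan shape that signals heteroscedasticity in a residual plot.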
The interval estimate computed from a regression equation makes use of the _______________________.
residual standard deviation
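A simplified sketch of an interval estimate built from the residual standard deviation (NumPy and SciPy; the leverage adjustment is omitted for brevity, so this is an approximation with assumed data, not a full prediction-interval formula):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 80)
y = 1 + 3 * x + rng.normal(0, 2, size=x.size)

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
df = x.size - 2
s = np.sqrt(resid @ resid / df)           # residual standard deviation

# Approximate 95% interval estimate around the point estimate at x = 5
x_new = 5.0
point = slope * x_new + intercept
t_crit = stats.t.ppf(0.975, df)
lower, upper = point - t_crit * s, point + t_crit * s
```

The point estimate is the fitted value alone; the interval estimate widens it by a t multiple of the residual standard deviation, which is why that quantity appears in the blank above.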