ECO6415 Exam 2
In a multiple regression model, the following statistics are given: SSE = 100, R2 = 0.995, k = 5, and n = 15. Then, the coefficient of determination adjusted for degrees of freedom is:
0.992
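Worked check (a minimal Python sketch; note SSE is not needed, only R2, n, and k):

```python
# Adjusted R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1)
r2, k, n = 0.995, 5, 15
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(adj_r2, 3))  # 0.992
```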
When all the actual values of y are equal to their predicted values, the standard error of estimate will be:
0
In the least squares regression line ŷ = 3 - 2x, the predicted value of y equals: 1.0 when x = -1.0; 2.0 when x = 1.0; 2.0 when x = -1.0; 1.0 when x = 1.0
1.0 when x = 1.0
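Worked check (a minimal sketch evaluating the fitted line at each candidate x):

```python
def y_hat(x):
    """Predicted y from the fitted line y-hat = 3 - 2x."""
    return 3 - 2 * x

print(y_hat(-1.0))  # 5.0, so both x = -1.0 options are wrong
print(y_hat(1.0))   # 1.0, matching the answer
```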
In testing the hypotheses H0: β1 = 0 vs. H1: β1 ≠ 0, the following statistics are available: n = 10, b0 = -1.8, b1 = 2.45, sb1 = 1.20, and ŷ = 6. The value of the test statistic is:
2.04
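Worked check: under H0 the test statistic is t = b1 / sb1 with n - 2 degrees of freedom (b0 and ŷ are not needed). A minimal sketch:

```python
b1, sb1, n = 2.45, 1.20, 10
t = b1 / sb1                        # t = (b1 - 0) / s_b1 under H0: beta1 = 0
print(round(t, 2), "df =", n - 2)   # 2.04 df = 8
```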
In a regression problem, if the coefficient of determination is 0.95, this means that:
95% of the variation in y can be explained by the variation in x.
In multiple regression analysis, the ratio MSR/MSE yields the:
F statistic for testing the validity of the regression equation
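Illustration with made-up numbers (the SSR, SSE, k, and n below are assumptions, not from the exam), using MSR = SSR/k and MSE = SSE/(n - k - 1):

```python
ssr, sse = 450.0, 100.0     # hypothetical sums of squares
k, n = 3, 25                # hypothetical number of predictors and sample size
msr = ssr / k               # mean square for regression
mse = sse / (n - k - 1)     # mean square for error
print(round(msr / mse, 2))  # F = 31.5; compare to the F(k, n-k-1) distribution
```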
In testing the significance of a multiple regression model with three independent variables, the null hypothesis is H0: β1 = β2 = β3. T/F
false; the correct null hypothesis is H0: β1 = β2 = β3 = 0
In simple linear regression, most often we perform a two-tail test of the population slope β1 to determine whether there is sufficient evidence to infer that a linear relationship exists. The null hypothesis is stated as:
H0: β1 = 0
A regression analysis between sales (in $1,000) and advertising (in $1,000) resulted in the following least squares line: ŷ = 80 + 5x. This implies that: as advertising increases by $1,000, sales increases by $5,000; as advertising increases by $1,000, sales increases by $80,000; as advertising increases by $5, sales increases by $80; None of these choices.
as advertising increases by $1,000, sales increases by $5,000.
A multiple regression model is assessed to be poor if the error sum of squares SSE and the standard error of estimate sε are both large, the coefficient of determination R2 is close to 0, and the value of the test statistic F is large. T/F
false; a poor model has a small F statistic, not a large one
Multicollinearity is present when there is a high degree of correlation between the dependent variable and any of the independent variables. T/F
false; multicollinearity is a high degree of correlation among the independent variables themselves
Multicollinearity will result in excessively low standard errors of the parameter estimates reported in the regression output. T/F
false; multicollinearity inflates the standard errors of the parameter estimates
There is more error in estimating a mean value of y as opposed to predicting an individual value of y. T/F
false; predicting an individual value of y carries more error (a wider interval) than estimating the mean value
When an explanatory variable is dropped from a multiple regression model, the coefficient of determination can increase. T/F
false
Another name for the independent variable in a regression equation is explained variable. T/F
false; explanatory variable
We check for normality by drawing a pie chart of the residuals. T/F
false; histogram
The spread in the residuals should increase as the predicted value of y increases. T/F
false; should remain constant
In a simple linear regression model the y variable is used to explain the variation in the x variable. T/F
false; x is used to explain y
We check for normality by drawing a(n) ____________________ of the residuals.
histogram
When the variance of the error variable ε is a constant no matter what the value of x is, this condition is called:
homoscedasticity
The confidence interval estimate of the expected value of y for a given value x, compared to the prediction interval of y for the same given value of x and confidence level, will be: wider; the same; narrower; impossible to know
narrower
The standardized residual is defined as:
residual divided by the standard error of estimate
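Illustration (the residuals below are made up; note they sum to zero, as fitted residuals must):

```python
residuals = [2.0, -1.5, 0.5, -1.0]  # hypothetical residuals from a simple regression
n = len(residuals)
s_eps = (sum(e ** 2 for e in residuals) / (n - 2)) ** 0.5  # standard error of estimate
print([round(e / s_eps, 2) for e in residuals])  # [1.03, -0.77, 0.26, -0.52]
```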
In order to estimate with 95% confidence the expected value of y for a given value of x in a simple linear regression problem, a random sample of 10 observations is taken. Which of the following t values listed below would be used?
t = 2.306
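Check: with n = 10 observations, a simple linear regression has n - 2 = 8 degrees of freedom, and a two-sided 95% interval uses the upper 2.5% point of that t distribution. A sketch assuming scipy is available:

```python
from scipy import stats

t_crit = stats.t.ppf(1 - 0.05 / 2, 10 - 2)  # upper 2.5% point, 8 df
print(round(t_crit, 3))                     # 2.306
```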
Given the least squares regression line ŷ = 5 - 2x: the relationship between x and y is positive; the relationship between x and y is negative; as x decreases, so does y; None of these choices.
the relationship between x and y is negative
A high value of the coefficient of determination significantly above 0 in multiple regression, accompanied by insignificant t-statistics on all parameter estimates, very often indicates a high correlation between independent variables in the model. T/F
true
A regression analysis between weight (y in pounds) and height (x in inches) resulted in the following least squares line: ŷ = 135 + 6x. This implies that if the height is increased by 1 inch, the weight is expected to increase by an average of 6 pounds. T/F
true
An outlier is an observation that is unusually small or unusually large. T/F
true
Data that exhibit an autocorrelation effect violate the regression assumption of independence. T/F
true
In multiple regression analysis, the adjusted coefficient of determination is adjusted for the number of independent variables and the sample size. T/F
true
In reference to the equation ŷ = -0.80 + 0.12x1 + 0.08x2, the value 0.12 is the average change in y per unit change in x1, when x2 is held constant. T/F
true
One method of diagnosing heteroscedasticity is to plot the residuals against the predicted values of y, then look for a change in the spread of the plotted values. T/F
true
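A hypothetical sketch of that diagnostic (matplotlib assumed; the data are simulated so the fan shape is visible):

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
y_hat = np.linspace(1, 10, 100)     # simulated predicted values
resid = rng.normal(0, 0.3 * y_hat)  # spread grows with y-hat: heteroscedastic

plt.scatter(y_hat, resid, s=10)
plt.axhline(0, linewidth=1)
plt.xlabel("predicted y")
plt.ylabel("residual")
plt.show()  # a funnel/fan pattern signals heteroscedasticity
```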
The first-order linear model is sometimes called the simple linear regression model. T/F
true
The method of least squares requires that the sum of the squared deviations between actual y values in the scatter diagram and y values predicted by the regression line be minimized. T/F
true
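The minimizing coefficients have closed forms: b1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)² and b0 = ȳ - b1·x̄. A minimal sketch on made-up data:

```python
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]  # hypothetical data
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
sxx = sum((x - x_bar) ** 2 for x in xs)
b1 = sxy / sxx                   # slope minimizing the sum of squared deviations
b0 = y_bar - b1 * x_bar          # intercept
print(round(b0, 2), round(b1, 2))  # 0.05 1.99
```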
The plot of residuals vs. predicted values should show no patterns if the conditions of a regression analysis are met. T/F
true
The regression line ŷ = 2 + 3x has been fitted to the data points (4, 11), (2, 7), and (1, 5). The sum of squares for error will be 10.0. T/F
true
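Worked check:

```python
points = [(4, 11), (2, 7), (1, 5)]
sse = sum((y - (2 + 3 * x)) ** 2 for x, y in points)
print(sse)  # (-3)^2 + (-1)^2 + 0^2 = 10
```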
The residual ei is defined as the difference between the actual value yi and the estimated value ŷi. T/F
true
The value of the sum of squares for regression SSR can never be smaller than 0.0. T/F
true
The variance of the error variable is required to be constant. When this requirement is violated, the condition is called heteroscedasticity. T/F
true
The standard error of estimate sε is a measure of the:
variation of y around the regression line
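In simple linear regression it is computed as sε = sqrt(SSE / (n - 2)); a small sε means the points lie close to the line. A sketch reusing the SSE = 10 three-point example above:

```python
sse, n = 10.0, 3              # from the three-point example above
s_eps = (sse / (n - 2)) ** 0.5
print(round(s_eps, 3))        # 3.162: typical deviation of y around the line
```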