Chapter 13
the standard error of the estimate
measures how the data points vary around the regression line
T/F: two degrees of freedom are lost in simple two-variable regression analysis in computation of error variance because we have to estimate both the intercept and slope-term values.
true
SSE
sum of squares due to error: the unexplained variation of the observed Y values around the regression line
SSR
sum of squares due to regression: the variation in Y explained by the regression line
the best regression line always goes through
the point (x̄, ȳ), the means of X and Y
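This property can be checked numerically. A minimal sketch in plain Python, with made-up data:

```python
# Sketch with illustrative data: the fitted least-squares line passes
# through the point of means (x_bar, y_bar).
xs = [1, 2, 3, 5]
ys = [2, 3, 7, 9]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
# Closed-form OLS slope and intercept for one predictor.
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar
y_hat_at_mean = b0 + b1 * x_bar  # equals y_bar, since b0 = y_bar - b1*x_bar
```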
dummy variable
a special type of variable that takes only the values one or zero, used to represent a categorical factor in a regression.
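A quick illustration of dummy coding (the category names here are invented):

```python
# Hypothetical two-category variable coded as a 0/1 dummy so it can
# enter a regression as a numerical predictor.
regions = ["urban", "rural", "urban", "urban"]
is_urban = [1 if r == "urban" else 0 for r in regions]
```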
interpolating
estimating within the known range of the independent variable.
T/F: If the plot of the residuals is fan shaped, the assumption of independence of errors is violated
false; a fan-shaped residual plot indicates a violation of the equal-variance assumption (homoscedasticity), not of independence
T/F: the Y-intercept (b0) represents the change in estimated average Y per unit change in X
false; that describes the slope (b1). the Y-intercept (b0) is the estimated average value of Y when X = 0
T/F: the least squares method minimizes SSR, "sum of squares regression."
false; it minimizes SSE, the sum of squared errors
T/F: the residual represents the discrepancy between the observed independent variable and its predicted value
false; it is the discrepancy between the observed dependent variable and its predicted value based on the regression intercept and slope terms
T/F: In performing a regression analysis involving two numerical variables, we are assuming that X and Y share a common variance
false; the assumption is that the variation around the regression line is the same for each X value (equal variance)
T/F: In a simple linear regression model, r and b1 can possibly have opposite signs
false; they must have the same sign
coefficient of determination
a measure of the proportion of variability in the dependent variable that is explained by the independent variable(s) (r²)
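The identity r² = SSR/SST (equivalently 1 − SSE/SST) can be verified directly. A sketch with illustrative data:

```python
# Sketch: compute SST, SSR, SSE and r^2 for a simple regression fitted
# by the closed-form OLS formulas; the data are made up.
xs = [1, 2, 3, 4, 5]
ys = [2, 2, 5, 4, 7]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar
y_hats = [b0 + b1 * x for x in xs]
sst = sum((y - y_bar) ** 2 for y in ys)                # total variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hats)          # explained variation
sse = sum((y - yh) ** 2 for y, yh in zip(ys, y_hats))  # unexplained variation
r_squared = ssr / sst
```

Since SST = SSR + SSE, the same r² also falls out of 1 − SSE/SST.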
extrapolating
making a prediction beyond our known range of the independent variable
ordinary least squares regression
the regression line is the line that minimizes the sum of squared deviations between each data point and the regression line
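The minimization claim can be demonstrated by nudging the fitted coefficients and watching the sum of squared errors rise. A sketch with made-up data:

```python
# Sketch: the OLS coefficients minimize SSE; perturbing either one can
# only increase the sum of squared errors. Data are illustrative.
def sse(b0, b1, xs, ys):
    """Sum of squared vertical deviations from the line y = b0 + b1*x."""
    return sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))

xs = [1, 2, 3, 4, 5]
ys = [2, 2, 5, 4, 7]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
     sum((x - x_bar) ** 2 for x in xs)
b0 = y_bar - b1 * x_bar
best = sse(b0, b1, xs, ys)
worse = min(sse(b0 + 0.1, b1, xs, ys), sse(b0, b1 + 0.1, xs, ys))
```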
T/F: Autocorrelation of residuals is exhibited by patterns of plus and minus variation in residuals about the regression line. This indicates that the errors are not independent, an important assumption of regression analysis.
true
T/F: Data that exhibit an autocorrelation effect violate the regression assumption of independence of error terms.
true
T/F: Increasing variation in the dependent variable, Y, as X increases violates an important assumption of regression analysis
true
T/F: When we say that a simple linear regression model is "statistically" useful we mean that the model is a better predictor of Y than the sample mean of Y
true
T/F: the Regression Sum of Squares (SSR) can never be greater than the Total Sum of Squares (SST), but when SSR = SST, r² = 1.0
true
T/F: the coefficient of determination (r2) tells us the proportion of total variation that is explained
true
T/F: the residuals or errors represent the difference between the actual Y values and the predicted Y values.
true
T/F: the slope (b1) is the first derivative of the linear regression equation with respect to X
true
T/F: the slope term of the regression of Y on X times the slope term of the regression of X on Y equals the square of the correlation coefficient of X and Y
true
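This last identity follows from the closed-form slope formulas and can be confirmed numerically. A sketch with illustrative data:

```python
# Sketch: the slope of Y-on-X times the slope of X-on-Y equals r^2,
# since (Sxy/Sxx) * (Sxy/Syy) = Sxy^2 / (Sxx * Syy). Data are made up.
xs = [1, 2, 3, 4, 5]
ys = [2, 2, 5, 4, 7]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
sxx = sum((x - x_bar) ** 2 for x in xs)
syy = sum((y - y_bar) ** 2 for y in ys)
b_y_on_x = sxy / sxx  # slope regressing Y on X
b_x_on_y = sxy / syy  # slope regressing X on Y
r_squared = sxy ** 2 / (sxx * syy)
```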