Chapter 12: Prediction

prediction using z-scores versus raw scores

- You can make predictions based on either Z scores or raw scores: use the Z score on the predictor variable to predict the Z score on the criterion variable, or use the raw score on the predictor variable to predict the raw score on the criterion variable.
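
A minimal Python sketch of the two routes, with made-up SAT/GPA numbers (the means, SDs, and correlation are assumptions, not from this set); both routes give the same prediction once the predicted Z score is converted back:

```python
# Sketch: predicting with Z scores vs. raw scores (illustrative numbers only).
r = 0.30                    # correlation between predictor (X) and criterion (Y)
mean_x, sd_x = 600, 100     # assumed SAT mean and SD
mean_y, sd_y = 2.8, 0.5     # assumed GPA mean and SD

x = 700                     # a person's raw score on the predictor

# Z-score route: standardize X, then the predicted Z on Y follows from the correlation.
z_x = (x - mean_x) / sd_x
z_y_hat = r * z_x

# Raw-score route: build the linear prediction rule Y-hat = a + (b)(X).
b = r * (sd_y / sd_x)       # regression coefficient
a = mean_y - b * mean_x     # regression constant
y_hat = a + b * x

# Converting the predicted Z back to a raw score matches the raw-score route.
print(z_y_hat * sd_y + mean_y, y_hat)   # both about 2.95
```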

how to draw the regression line

1. Draw and label the axes for a scatter diagram, putting the predictor variable on the horizontal axis.
2. Using the linear prediction rule, figure the predicted value on the criterion variable for a low value of the predictor variable and mark that point on the graph.
3. Do the same thing again for a high value of the predictor variable.
4. Draw a line that passes through the two marks.
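
For example, a quick sketch of finding the two points to mark, using the a = .3, b = .004 SAT/GPA rule that appears later in this set (the low and high X values are arbitrary):

```python
# Sketch: the two points to mark before drawing the line, using the
# Y-hat = .3 + (.004)(X) rule from the SAT/GPA example in this set.
a, b = 0.3, 0.004

def predict(x):
    """Linear prediction rule: predicted criterion score for a given predictor score."""
    return a + b * x

low_x, high_x = 200, 800          # one low and one high predictor value (arbitrary)
print(low_x, predict(low_x))      # 200 and about 1.1  -> first mark
print(high_x, predict(high_x))    # 800 and about 3.5  -> second mark; connect the two marks
```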

how to draw the regression line

1. Draw and label the axes for a scatter plot, putting the predictor (i.e., given) variable on the x-axis and the criterion (predicted) variable on the y-axis.
2. On both the x- and y-axes, the values indicated are the mean and plus and minus one and two standard deviations. We use the two extremes around the mean (plus two SD and minus two SD) because they make the dots on the graph fairly far apart, so your drawing will be more accurate.
3. Start with the x-axis. Put a hash mark in the center of the line and label it with the mean value of your given variable.
- Draw two hash marks to the left of the mean and two to the right of the mean. Each hash mark represents one SD away from the mean: moving left, you are subtracting a standard deviation from the mean; moving right, you are adding a standard deviation to the mean.
- Do the same thing for the criterion variable on the y-axis.
4. Predict scores using the Oreo-cookie approach. Your "given" values (the numbers you start with) will be the two most extreme given values on your horizontal axis (the mean minus 2 SD and the mean plus 2 SD).
- The first point will be the mean minus 2 SD; the second point will be the mean plus 2 SD. Plot each as (given, predicted).
5. Plot the points on the graph and connect the dots. Pay attention to the correlation and the slope of the line: if your correlation is negative, your regression line will go down from left to right.
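
A rough plotting sketch of these steps; the means, SDs, and correlation are invented, and matplotlib is used only for illustration:

```python
import matplotlib.pyplot as plt

# Sketch: drawing the regression line through the two extreme "given" values
# (mean - 2 SD and mean + 2 SD on the predictor). Numbers are illustrative.
r = 0.30
mean_x, sd_x = 600, 100
mean_y, sd_y = 2.8, 0.5

b = r * (sd_y / sd_x)            # regression coefficient
a = mean_y - b * mean_x          # regression constant

xs = [mean_x - 2 * sd_x, mean_x + 2 * sd_x]    # the two extreme given values
ys = [a + b * x for x in xs]                   # their predicted criterion values

plt.plot(xs, ys, marker="o")                   # plot the two points and connect the dots
plt.xticks([mean_x + k * sd_x for k in range(-2, 3)])   # mean and +/- 1, 2 SD
plt.yticks([mean_y + k * sd_y for k in range(-2, 3)])
plt.xlabel("Predictor (X)")
plt.ylabel("Criterion (Y-hat)")
plt.show()
```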

how to predict scores: the 3-step or Oreo-cookie approach

Correlations are really a summary of the matched-ness of Z scores, so we can predict a person's score on one variable (e.g., X) if we know their score on the other (Y).
Oreo-cookie approach: the first and third steps are basically the same formulas (the chocolate parts of the cookie), and the second step (the cream center) is the most important. You can either 1. predict a Y value when given an X value, or 2. predict an X value when given a Y value.
The 3 steps:
1. Find the Z score for your given variable.
2. Find the predicted Z score for the variable you're predicting (multiply the given Z score by the correlation).
3. Convert the predicted Z score back into a predicted raw score.
Prediction: if we know Xi --> Yi'; if we know Yi --> Xi'.
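
A small sketch of the three steps with invented numbers (the means, SDs, and correlation are assumptions), including the reverse direction:

```python
# Sketch: the 3-step (Oreo-cookie) prediction with invented numbers.
r = 0.30                          # correlation between X and Y
mean_x, sd_x = 600, 100           # assumed predictor mean and SD
mean_y, sd_y = 2.8, 0.5           # assumed criterion mean and SD

x = 700                           # the given raw score on X
z_x = (x - mean_x) / sd_x         # step 1 (chocolate): Z score for the given variable
z_y_hat = r * z_x                 # step 2 (cream center): predicted Z score on Y
y_hat = z_y_hat * sd_y + mean_y   # step 3 (chocolate): convert back to a raw predicted score
print(y_hat)                      # about 2.95

# Reverse direction: if we know Yi, predict Xi'.
y = 3.3
z_y = (y - mean_y) / sd_y
x_hat = (r * z_y) * sd_x + mean_x
print(x_hat)                      # about 630.0
```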

prediction in research articles

It is rare for bivariate linear prediction rules to be reported in psychology research articles; instead, simple correlations are reported.
- Sometimes you will see regression lines from bivariate predictions, usually when there is more than one group and the researcher wants to illustrate the difference in the linear prediction rule between the groups.
- Multiple regression results are common in research articles and are often reported in tables.
- Usually the table lays out the regression coefficient for each predictor variable; the table may also give the correlation coefficient (r) of each predictor variable with the criterion variable.

regression line

Line on a graph showing the predicted value of the criterion variable for each value of the predictor variable; a visual display of the linear prediction rule.
- The regression line is drawn on a graph in which the horizontal axis is for values of the predictor variable (X) and the vertical axis is for predicted scores on the criterion variable (Y hat).
- It shows the relation between values of the predictor variable and the predicted values of the criterion variable.

the linear prediction rule

Or linear prediction model: the formula for making predictions, i.e., the formula for predicting a person's score on a criterion variable based on the person's score on one or more predictor variables.
- Linear: lows go with lows, highs with highs (or, if negative, lows with highs and highs with lows).
Y hat = a + (b)(X)
Y hat = the person's predicted score on the criterion variable ("predicted value of Y")
a = the regression constant: a particular fixed number added into the prediction
b = the regression coefficient: a number multiplied by the person's score on the predictor variable
X = the person's score on the predictor variable
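
A minimal sketch of the rule as code; the a and b values are taken from the SAT/GPA example later in this set:

```python
# Sketch: the linear prediction rule Y-hat = a + (b)(X) as a small function.
def linear_prediction(a, b, x):
    """Predicted criterion score from the regression constant, coefficient, and predictor score."""
    return a + b * x

# a = .3 and b = .004 are the SAT/GPA values from the example in this set.
print(linear_prediction(0.3, 0.004, 700))   # about 3.10
```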

slope of the regression line

Steepness of the angle of the regression line; the number of units the line goes up for every unit it goes across.
- The slope of the line is exactly b, the regression coefficient.
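
A quick check of this, assuming the a = .3, b = .004 rule from the SAT/GPA example:

```python
# Sketch: the slope is exactly b -- each 1-unit step in X raises Y-hat by b.
a, b = 0.3, 0.004   # the example SAT/GPA rule from this set

def predict(x):
    return a + b * x

print(round(predict(501) - predict(500), 3))   # 0.004, which is b
```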

error

The difference between a prediction rule's predicted score on the criterion variable and a person's actual score on the criterion variable.
- We want as little error as possible, so we want the smallest sum of errors.
- Positive error: the rule predicted too low. Negative error: the rule predicted too high.
- The positive and negative errors will cancel each other out.
- To avoid this problem, we use squared errors: take each amount of error and square it (multiply it by itself), then add up the squared errors (the same solution we used to figure the variance and the standard deviation).
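
A tiny sketch with made-up scores showing why raw errors are not enough:

```python
# Sketch: why we square the errors -- raw errors can cancel out. Scores are made up.
actual    = [4, 2, 3]
predicted = [3, 3, 3]

errors = [act - pred for act, pred in zip(actual, predicted)]
print(sum(errors))                   # 0 -- a positive and a negative error cancel
print(sum(e ** 2 for e in errors))   # 2 -- squared errors do not cancel
```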

how good are predictions?

The linear prediction rule you have learned provides a useful and practical way of making predictions on a criterion variable from scores on a predictor variable. Problem: researchers often use slightly different measures or scales, which can make it hard to compare the linear prediction rule for the same type of effect across studies. This is because the scale used for the predictor and criterion variables will affect the value of b (the regression coefficient) in the linear prediction rule. (The value of a, the regression constant, will also be affected by the scales used.)
- We need a type of regression coefficient that can be compared across studies (which may have used different scales for the same variables).
- There is a formula for changing a regression coefficient into what is known as a standardized regression coefficient.
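
A minimal sketch of one common form of that conversion, beta = (b)(SD of X / SD of Y); the numbers below are invented, and with a single predictor the standardized coefficient works out to the correlation r:

```python
# Sketch: one common form of the conversion, beta = (b)(SD of X / SD of Y).
b = 0.0015              # regression coefficient on the raw-score scales (invented)
sd_x, sd_y = 100, 0.5   # standard deviations of the predictor and criterion (invented)

beta = b * (sd_x / sd_y)
print(beta)             # about 0.30 -- with one predictor, beta equals the correlation r
```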

the intercept of the regression line

The point at which the regression line crosses the vertical axis (called the Y intercept).
- The intercept is the predicted score on the criterion variable (Y hat) when the score on the predictor variable (X) is 0.
- The intercept is the SAME as the regression constant: Y hat = a + (b)(X). If X is 0, then whatever the value of b, multiplying it by X gives 0. All that is left is Y hat = a + 0, so Y hat = a.
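
A quick check, again using the a = .3, b = .004 example rule from this set:

```python
# Sketch: the intercept is the same as a -- predict at X = 0.
a, b = 0.3, 0.004   # the example SAT/GPA rule from this set

def predict(x):
    return a + b * x

print(predict(0))   # 0.3, the same as the regression constant a
```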

criterion variable (Y)

the variable BEING predicted. -labeled Y. X predicts Y. ex: college grades

Predictor variable (X)

the variable being predicted FROM. -labeled X. X predicts Y ex: SAT scores

sum of the squared errors

To evaluate how good a prediction rule is, we figure the sum of the squared errors we would make using that rule.
- An error is the difference between a person's predicted score on the criterion variable and the person's actual score on the criterion variable; square each error and add them up.
- Least squares criterion: come up with the linear prediction rule that creates the smallest sum of squared errors between the actual scores on the criterion variable (Y) and the predicted scores on the criterion variable (Y hat).
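
A small sketch with invented scores, computing the sum of squared errors for an arbitrary rule and for the least-squares rule (a and b figured from the standard least-squares formulas):

```python
# Sketch: sum of squared errors, and the least-squares rule that minimizes it.
# The scores below are made up for illustration.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 2.9, 4.2, 4.8, 6.0]

def sum_squared_errors(a, b):
    """Sum of squared differences between actual Y and predicted Y-hat = a + b*X."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Least-squares a and b: b from the cross-products, a from the means.
mean_x = sum(xs) / len(xs)
mean_y = sum(ys) / len(ys)
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

print(sum_squared_errors(0.5, 1.2))   # about 0.67 for an arbitrary rule
print(sum_squared_errors(a, b))       # about 0.09 -- the least-squares rule gives a smaller sum
```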

linear prediction example

Using SAT scores to predict college GPA:
- regression constant (a) = .3
- regression coefficient (b) = .004
Y hat = a + (b)(X) = .3 + (.004)(X)
Predicted GPA = .3 + (.004)(700) = .3 + 2.80 = 3.10

