Econometrics for Finance

What to do with the critical value in the confidence interval approach?

The interval runs from fitted β − (critical value × SE(fitted β)) to fitted β + (critical value × SE(fitted β)). If the hypothesised value of β under H0 lies outside this interval, reject H0.
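
A minimal sketch of the confidence-interval approach in Python; the estimate, standard error, sample size and hypothesised value are made-up numbers for illustration only.

```python
from scipy import stats

beta_hat, se_beta = 0.52, 0.18   # hypothetical coefficient estimate and its standard error
beta_null = 0.0                  # hypothesised value of beta under H0
T, k = 60, 3                     # hypothetical number of observations and estimated parameters

# two-sided 5% critical value from the t distribution with T - k degrees of freedom
crit = stats.t.ppf(0.975, df=T - k)

lower = beta_hat - crit * se_beta
upper = beta_hat + crit * se_beta

# reject H0 if the hypothesised value lies outside the interval
reject = not (lower <= beta_null <= upper)
print(f"95% CI: ({lower:.3f}, {upper:.3f}); reject H0: {reject}")
```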

Gauss-Markov assumptions

Conditions under which OLS is BLUE

Degrees of freedom in t-tests

N − k, where N is the number of observations and k the number of estimated parameters

Heteroscedasticity

A situation in which the variance of y around the regression line (i.e. the variance of the errors) is not the same across the values of x

T-Test in econometrics

t = (fitted β − hypothesised value under H0) / SE(fitted β); when testing H0: β = 0 this reduces to (fitted β)/(SE(fitted β))
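
A short illustration of the test-of-significance approach, again with hypothetical numbers (not taken from any real regression).

```python
from scipy import stats

beta_hat, se_beta = 0.52, 0.18   # hypothetical estimate and standard error
beta_null = 0.0                  # hypothesised value under H0 (often zero)
T, k = 60, 3                     # hypothetical sample size and number of parameters

t_stat = (beta_hat - beta_null) / se_beta
crit = stats.t.ppf(0.975, df=T - k)   # 5% two-sided critical value

print(f"t = {t_stat:.2f}, critical value = {crit:.2f}, reject H0: {abs(t_stat) > crit}")
```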

Steps in the White test

1. Estimate the model, obtaining the residuals (fitted u). 2. Run the auxiliary regression of the squared residuals on the regressors, their squares and cross-products. 3. Obtain R² and multiply it by the number of observations (T): T·R² ∼ χ²(m), where m is the number of regressors in the auxiliary regression excluding the constant term. 4. If the χ² statistic is greater than the corresponding critical value from the table, reject H0 that the disturbances are homoscedastic.
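
A sketch of running White's test with statsmodels; the data are simulated purely for illustration, and the heteroscedastic error structure is an assumption of the example.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white
from scipy import stats

rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=T)
u = rng.normal(size=T) * (1 + np.abs(x))      # heteroscedastic disturbances (assumed form)
y = 1.0 + 0.5 * x + u

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()                      # step 1: estimate the model, get residuals

# steps 2-3: het_white runs the auxiliary regression and returns the LM statistic T*R^2
lm_stat, lm_pval, f_stat, f_pval = het_white(res.resid, X)

m = 2                                         # regressors in the auxiliary regression excl. constant (x, x^2)
crit = stats.chi2.ppf(0.95, df=m)
print(f"T*R^2 = {lm_stat:.2f}, chi2 critical value = {crit:.2f}, p-value = {lm_pval:.4f}")
# step 4: reject homoscedasticity if T*R^2 exceeds the critical value
```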

Parameters of Durbin Watson test

1. 0 ≤ DW ≤ 4. 2. DW has critical values (dU upper, dL lower). 3. There is an inconclusive region between dL and dU. 4. H0: ρ = 0 (no autocorrelation); H1: ρ > 0 (positive first-order autocorrelation).
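
A minimal check of the Durbin-Watson statistic using statsmodels; the AR(1) error process with ρ = 0.7 is simulated only to illustrate the mechanics.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
T = 200
x = rng.normal(size=T)

# build errors with positive first-order autocorrelation (rho = 0.7, an assumption)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + rng.normal()

y = 1.0 + 0.5 * x + u
res = sm.OLS(y, sm.add_constant(x)).fit()

dw = durbin_watson(res.resid)     # lies between 0 and 4; a value near 2 suggests no autocorrelation
print(f"DW = {dw:.2f}")           # compare with the tabulated dL and dU critical values
```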

Which of the following are plausible approaches to dealing with residual autocorrelation?

1. Add lagged values of the variables to the regression equation 2. Try a model in first differenced form rather than in levels

If OLS is used in the presence of autocorrelation, which of the following will be likely consequences?

1. Hypothesis tests could reach the wrong conclusions 2. Standard errors may be inappropriate

Remedies for autocorrelation (ρ is known)

1. Generalised least squares (GLS) if the autocorrelation coefficient ρ is known: multiply the equation lagged one period by ρ and subtract it from the original equation (quasi-differencing)
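
A sketch of the quasi-differencing (GLS) transformation when ρ is taken as known; the value ρ = 0.7 and the simulated series are assumptions made purely for the example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T, rho = 200, 0.7                      # rho assumed known here

x = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x + u

# quasi-difference: subtract rho times the lagged equation from the original equation
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]

res_gls = sm.OLS(y_star, sm.add_constant(x_star)).fit()
print(res_gls.params)                  # note: the intercept here estimates beta0 * (1 - rho)
```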

How can you detect heteroscedasticity?

1. Goldfeld-Quandt (GQ) 2. White's general test

How can you detect autocorrelation?

1. Graphical method 2. Durbin Watson test (1 lag) 3. Breusch-Godfrey test (multiple lags)

Suppose that a test statistic has associated with it a p-value of 0.08. Which one of the following statements is true?

1. If the size of the test were exactly 8%, we would be indifferent between rejecting and not rejecting the null hypothesis 2. The null would be rejected if a 10% size of test were used 3. The null would not be rejected if a 1% size of test were used

What causes autocorrelation?

1. Inertia or persistence 2. Specification bias: an excluded variable (e.g. an omitted quadratic term in Xt ends up in Ut) 3. Omitted variables 4. Cobweb phenomenon (farmers base this year's planting on last year's price)

Parameters of Breusch-Godfrey Test

1. Known as the LM test 2. Allows for nonstochastic regressors, such as lagged values of the regressand 3. Detects higher-order autoregressive schemes, e.g. AR(2) 4. H0: ρ1 = ρ2 = ... = ρr = 0 (no autocorrelation); H1: at least one ρ ≠ 0 (autocorrelation present)
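
A sketch of the Breusch-Godfrey (LM) test via statsmodels; the simulated AR(2) errors and the choice of nlags=2 are assumptions for the example.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(3)
T = 200
x = rng.normal(size=T)
u = np.zeros(T)
for t in range(2, T):
    u[t] = 0.5 * u[t - 1] + 0.3 * u[t - 2] + rng.normal()   # AR(2) errors (assumed)
y = 1.0 + 0.5 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()

# LM statistic ~ chi2(r); here r = 2 lags are tested jointly
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=2)
print(f"LM = {lm_stat:.2f}, p-value = {lm_pval:.4f}")       # small p-value: reject "no autocorrelation"
```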

What are the Gauss-Markov assumptions? (The first 5 have to be true for the estimator to be BLUE)

1. Linear in parameters 2. Zero conditional mean E(Ut) = 0 3. Homoskedasticity Var(Ut) = σ^2 < ∞ 4. No autocorrelation Cov(Ut, Us) = 0 for t ≠ s 5. Error term and explanatory variable are uncorrelated Cov(Ut, Xt) = 0 6. Normality of errors Ut ∼ N(0, σ^2) 7. No perfect collinearity

Consequences of Heteroscedasticity

1. OLS is still linear and unbiased 2. OLS no longer has the minimum variance 3. Incorrect standard errors for least squares 4. R^2 is affected 5. Hypothesis tests and confidence intervals based on these standard errors are wrong

What are the consequences of autocorrelation?

1. OLS is still unbiased and consistent 2. Hypothesis testing is no longer valid 3. OLS will be inefficient and no longer BLUE 4. R^2 will be overestimated and the t-statistics will be lower than they should be

Remedies for autocorrelation (ρ is unknown)

1. Obtain the residuals 2. Estimate ρ by regressing the residuals on their lagged values 3. Transform the original variables into starred (quasi-differenced) variables 4. Run the regression again on the transformed variables 5. Repeat steps 2-4 until successive estimates of ρ change by less than a small tolerance (e.g. 0.001)
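
A rough sketch of this iterative (Cochrane-Orcutt style) procedure on simulated data; the data-generating values and the 0.001 tolerance are assumptions taken from the card.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 200
x = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + rng.normal()
y = 1.0 + 0.5 * x + u

beta0, beta1 = sm.OLS(y, sm.add_constant(x)).fit().params     # step 1: initial OLS estimates
rho_old = 0.0
while True:
    e = y - beta0 - beta1 * x                                 # residuals of the original equation
    rho = sm.OLS(e[1:], e[:-1]).fit().params[0]               # step 2: rho from regressing residuals on their lag
    y_star = y[1:] - rho * y[:-1]                             # step 3: starred (quasi-differenced) variables
    x_star = x[1:] - rho * x[:-1]
    a0, beta1 = sm.OLS(y_star, sm.add_constant(x_star)).fit().params   # step 4: re-estimate
    beta0 = a0 / (1.0 - rho)                                  # recover the intercept of the original model
    if abs(rho - rho_old) < 0.001:                            # step 5: stop when rho has settled down
        break
    rho_old = rho

print(f"estimated rho = {rho:.3f}, beta1 = {beta1:.3f}")
```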

What are the causes of heteroscedasticity?

1. Presence of outliers 2. Incorrect functional form of the regression 3. Incorrect transformation of data (mixing of scale)

What are the causes of the error term (Y − fitted Y)?

1. Randomness of nature 2. Omission of explanatory variables 3. Mis-specification of the model 4. Incorrect functional form of the model 5. Measurement error

What are other names for the dependent variable (y)?

1. Regressand 2. The explained variable

Which of the following are alternative names for the independent variable (usually denoted by x) in linear regression analysis?

1. Regressor 2. The causal variable

Which of the following are plausible approaches to dealing with a model that exhibits heteroscedasticity?

1. Take logarithms of each of the variables 2. Use suitably modified standard errors 3. Use a generalised least squares procedure

What are the two ways to conduct a hypothesis test?

1. Test of significance approach 2. Confidence interval approach. Both always give the same conclusion.

Which are examples of mis-specification of functional forms?

1. Using a linear specification when y scales as a function of the squares of x 2. Using a linear specification when a double-logarithmic model would be more appropriate 3. Modelling y as a function of x when in fact it scales as a function of 1/x

What are the steps for economic model building?

1. Understand finance theory 2. Derive estimable model 3. Collect data 4. Estimate model (with OLS) 5. Evaluate estimation results (carry out diagnostic checks) 6. Satisfactory: interpret / Unsatisfactory: reformulate

Which one of the following is a plausible remedy for near multicollinearity?

1. Use principal components analysis 2. Drop one of the collinear variables 3. Use a longer run of data

Remedies for Heteroscedasticity

1. Weighted least squares (WLS): divide the regression through by σ or another estimate of the error standard deviation 2. Transforming the variables into logs
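
A small illustration of weighted least squares with statsmodels; the assumed weighting scheme (error standard deviation proportional to x) is made up for the example.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 200
x = np.abs(rng.normal(size=T)) + 0.1
u = rng.normal(size=T) * x               # error variance grows with x (assumed known form)
y = 1.0 + 0.5 * x + u
X = sm.add_constant(x)

# WLS: weight each observation by 1/sigma_i^2, i.e. divide the regression through by sigma_i
res_wls = sm.WLS(y, X, weights=1.0 / x**2).fit()
print(res_wls.params)
```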

Cochrane-Orcutt

A procedure (abbreviated C-O) that tries to correct for autocorrelation.

Correlation coefficient (R)

A statistical index of the strength of the relationship between two variables (ranges from −1 to +1)

BLUE

Best Linear Unbiased Estimator

Including relevant lagged values of the dependent variable on the right hand side of a regression equation could lead to which one of the following?

Biased but consistent coefficient estimates

If a Durbin Watson statistic takes a value close to zero, what will be the value of the first order autocorrelation coefficient?

Close to plus 1 (since DW ≈ 2(1 − ρ̂), a DW near zero implies ρ̂ near +1)

R squared

Coefficient of determination - accuracy of prediction. How well the independent variable predicts the dependent variable. Can range between 0 and 1. If r squared = 0.8 then 80% of the variability in Y is "explained" by the variability in X

Differences of financial vs economic data

Financial data: higher frequency, better quality, includes risk, noisy, and non-normal (suffering from excess kurtosis / fat tails)

What are H0 and H1 in the F-Test?

H0: β1 = β2 = ... = βk = 0; H1: not all are zero (if even one coefficient is non-zero, reject H0)

F-Test formula

F = [(RRSS − URSS)/m] / [URSS/(T − k)], where RRSS and URSS are the restricted and unrestricted residual sums of squares, m is the number of restrictions, T the number of observations, and k the number of estimated parameters including the constant.
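
A sketch computing this F-statistic from restricted and unrestricted residual sums of squares; the data are simulated and the restriction tested (β1 = β2 = 0) is only an example.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(6)
T = 120
x1, x2 = rng.normal(size=T), rng.normal(size=T)
y = 1.0 + 0.4 * x1 + 0.3 * x2 + rng.normal(size=T)

X_unres = sm.add_constant(np.column_stack([x1, x2]))
X_res = np.ones((T, 1))                                # restricted model: intercept only

urss = sm.OLS(y, X_unres).fit().ssr                    # unrestricted residual sum of squares
rrss = sm.OLS(y, X_res).fit().ssr                      # restricted residual sum of squares

m, k = 2, 3                                            # number of restrictions; parameters incl. constant
F = ((rrss - urss) / m) / (urss / (T - k))
p_value = stats.f.sf(F, m, T - k)
print(f"F = {F:.2f}, p-value = {p_value:.4f}")
```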

When do we use F-Tests?

In multiple regression models, when we want to test a joint hypothesis involving more than one coefficient at once

Autocorrelation

It refers to the correlation between a random variable and its own lagged values. Autocorrelation occurs in time-series studies when the errors associated with a given time period carry over into future time periods.

What will be the properties of the OLS estimator in the presence of multicollinearity?

It will be consistent, unbiased and efficient

What would be the consequences for the OLS estimator if heteroscedasticity is present in a regression model but ignored?

It will be inefficient

Cross sectional data

Many observations (e.g. on different firms or individuals) at one point in time

Panel data

Observations on many entities (a cross-section) followed over many points in time

OLS

Ordinary Least Squares

Suppose that observations are available on the monthly bond prices of 100 companies for 5 years. What type of data are these?

Panel

Fitted value

Predicted value

Why is R2 not always appropriate in a multivariate regression?

R2 always rises (or at least never falls) when additional variables are added, even if they are irrelevant. In this case adjusted R2 is more meaningful.

If the residuals from a regression estimated using a small sample of data are not normally distributed, which one of the following consequences may arise?

Test statistics concerning the parameters will not follow their assumed distributions

Which of the following could be used as a test for autocorrelation up to third order?

The Breusch-Godfrey test

When there are omitted variables in the regression, which are determinants of the dependent variable, then...?

The OLS estimator is biased if the omitted variable is correlated with the included variable

The residual (error term) from a standard regression model is defined as...?

The difference between the actual value, y, and the fitted value, y-hat

Explain the importance of the error term

The error term picks up all the effects on yt that are not explained by the constant or explanatory variable

Suppose that the Durbin Watson test is applied to a regression containing two explanatory variables plus a constant (e.g. equation 2 above) with 50 data points. The test statistic takes a value of 1.53. What is the appropriate conclusion?

The test result is inconclusive (for T = 50 and two regressors the 5% critical values are roughly dL ≈ 1.46 and dU ≈ 1.63, and 1.53 lies between them)

What is the meaning of heteroscedasticity?

The variance of the errors is not constant

adjusted R squared

This is R2 adjusted for the number of X variables in the model. It is needed because adding more X variables always raises (or at least never lowers) R2, even when the extra variables add little real explanatory power.
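
For reference, the standard adjustment (with T observations and k estimated parameters including the constant) is: adjusted R2 = 1 − [(T − 1)/(T − k)] × (1 − R2).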

Near multicollinearity occurs when..?

Two or more explanatory variables are highly correlated with one another

Vt

Vt = Ut − U(t−1)

Suppose that a researcher wishes to test for autocorrelation using an approach based on an auxiliary regression. Which one of the following auxiliary regressions would be most appropriate?

ût = a0 + a1·û(t−1) + vt

