Quantitative Economics
The Chi-Squared Distribution
The Chi-Squared distribution is the distribution of the sum of the squares of m independent standard normal random variables. Its shape depends on the degrees of freedom (DF), m.
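In standard notation (the symbols here are assumed, not from the original card), the definition reads:

```latex
% Z_1, ..., Z_m are independent standard normal random variables
W = \sum_{i=1}^{m} Z_i^2 \sim \chi^2_m
```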
The t-distribution (or Student t distribution)
The Student t distribution with m degrees of freedom is defined as the distribution of the ratio of a standard normal random variable (Z) to the square root of an independently distributed chi-squared random variable (W) with m degrees of freedom, itself divided by m.
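Written out as a formula, using Z and W as named on the card:

```latex
% Z ~ N(0,1), W ~ chi-squared with m degrees of freedom, Z and W independent
t = \frac{Z}{\sqrt{W/m}} \sim t_m
```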
Effect of the size of m on the t_m distribution
When m ≥ 30, the t distribution is well approximated by the standard normal distribution. When m < 30 it has a similar shape but with more mass in the tails (a fatter bell shape).
1st property of multivariate normal distribution
If n random variables have a multivariate normal distribution, then any linear combination of these variables (such as their sum) is normally distributed. For instance, if X and Y have a bivariate normal distribution with covariance σXY, and a and b are two constants, then aX + bY is normally distributed, as shown below.
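A sketch of the resulting distribution in standard notation (the means μX, μY and variances are assumed symbols, not given on the card):

```latex
aX + bY \sim N\!\left(a\mu_X + b\mu_Y,\; a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\sigma_{XY}\right)
```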
If the correlation of X and Y is zero then...
... the conditional mean of Y may still depend on X
If the conditional mean of Y does not depend on X then...
... they are uncorrelated
Independence
X and Y are independent (independently distributed) if knowing the value of one provides no information about the other; equivalently, the conditional distribution of Y given X equals the marginal distribution of Y.
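One standard way to state independence formally (notation assumed):

```latex
% for all values x and y, the joint distribution factors into the marginals
\Pr(X = x,\, Y = y) = \Pr(X = x)\,\Pr(Y = y)
```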
Standard deviation
The square root of the variance, denoted σY
Bernoulli random variable
A discrete random variable that can take only the value 1 (with probability p) or 0 (with probability 1 − p)
Probability distribution of a discrete random variable
A list of all possible values of the variable and the probability that each value will occur. These probabilities sum to one
Correlation
Another measure of the extent to which two random variables move together. Because the units of X and Y cancel, correlation is unitless. If corr(X, Y) = 0 then the variables are uncorrelated
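The usual formula, with the sigma symbols assumed as on the covariance card:

```latex
\mathrm{corr}(X, Y) = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}, \qquad -1 \le \mathrm{corr}(X, Y) \le 1
```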
Expectation of a continuous random variable
The probability-weighted average of a continuous random variable, computed as an integral over its probability density function (defined formally in Appendix 17.1)
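A minimal sketch of the integral definition (f_Y denotes the probability density function, a symbol assumed here):

```latex
E(Y) = \int_{-\infty}^{\infty} y\, f_Y(y)\, dy
```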
F distribution with m and n degrees of freedom
Denoted Fm,n, it is the distribution of the ratio of a chi-squared random variable (W) with degrees of freedom m, divided by m, to an independently distributed chi-squared random variable (V) with degrees of freedom n, divided by n.
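As a formula, using W and V as named on the card:

```latex
% W ~ chi-squared_m, V ~ chi-squared_n, W and V independent
F = \frac{W/m}{V/n} \sim F_{m,n}
```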
Degrees of Freedom (DF)
A parameter that determines the shape of a Chi-Squared distribution. Degrees of freedom is simply the number of possible outcomes minus one (e.g. flipping a coin has one degree of freedom; rolling a die has five).
rth moment of Y
E(Y^r), the expected value of the rth power of Y
Expected value of a Bernoulli random variable
E(G) = 1 x p + 0 x (1 - p) = p
4th property of multivariate normal distribution
If X and Y have a bivariate normal distribution, then the conditional expectation of Y given X is linear in X. Joint normality implies linearity of conditional expectations (but the reverse is generally not true).
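For the bivariate normal case, the conditional expectation takes the standard linear form below (the μ and σ symbols are assumed notation, not given on the card):

```latex
E(Y \mid X) = \mu_Y + \frac{\sigma_{XY}}{\sigma_X^2}\,(X - \mu_X)
```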
2nd property of multivariate normal distribution
If a set of variables has a multivariate normal distribution, then the marginal distribution of each of the variables is normal.
3rd property of multivariate normal distribution
If variables with a multivariate normal distribution have covariances that equal zero, then the variables are independent. Note: If X and Y are independent, then, regardless of their joint distribution, σXY = 0. The reverse is generally not true and is a special property of the multivariate normal distribution.
Covariance
One measure of the extent to which two random variables move together. Covariance can be difficult to interpret because it is affected by the units of X and Y. If the two variables are independent then cov(X, Y) = 0
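The standard definition as an expectation (notation assumed):

```latex
\mathrm{cov}(X, Y) = \sigma_{XY} = E\big[(X - \mu_X)(Y - \mu_Y)\big]
```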
Conditional Distribution
The distribution of a random variable Y conditional on another random variable X taking on a specific value
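For discrete variables this can be written as (a standard formula, not from the card):

```latex
\Pr(Y = y \mid X = x) = \frac{\Pr(X = x,\, Y = y)}{\Pr(X = x)}
```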
Conditional expectation / mean
The expected value of Y based on the conditional distribution of Y given X
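In the discrete case this is computed as (the values y_1, ..., y_k are assumed notation):

```latex
E(Y \mid X = x) = \sum_{i=1}^{k} y_i \Pr(Y = y_i \mid X = x)
```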
Expected value
The long-run average value of the random variable (Y) over many repeated trials or occurrences. It is denoted E(Y) and is also referred to as the expectation of Y, the mean of Y, and μY
Implication of the law of iterated expectations
The mean of Y is the expectation of the conditional expectation of Y given X
Mean according to the law of iterated expectations
The mean of Y will equal the weighted average of the conditional expectation of Y given X, weighted by the probability distribution of X.
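Both statements of the law in symbols (the values x_1, ..., x_l are assumed notation):

```latex
E(Y) = E\big[E(Y \mid X)\big] = \sum_{i=1}^{l} E(Y \mid X = x_i)\,\Pr(X = x_i)
```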
The joint probability distribution
The probability that two random variables (X and Y) simultaneously take on particular values, Pr(X = x, Y = y)
The Marginal probability distribution
The same as the probability distribution of the variable Y; the term is used to distinguish it from the joint probability distribution of Y and another variable X, which can take on the values x1, x2, ..., xl
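A sketch of how the marginal is recovered from the joint distribution (standard formula, notation assumed):

```latex
\Pr(Y = y) = \sum_{i=1}^{l} \Pr(X = x_i,\, Y = y)
```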
If n is large enough such that Fm,n can be approximated by Fm,∞
Then the F distribution Fm,∞ is the distribution of a chi-squared random variable with m degrees of freedom, divided by m
Skewness
This describes how much a distribution deviates from symmetry. It is based on the third central moment of the distribution, standardised by the cube of the standard deviation
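The standardised third-central-moment formula (notation assumed):

```latex
\mathrm{Skewness} = \frac{E\big[(Y - \mu_Y)^3\big]}{\sigma_Y^3}
```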
Kurtosis
This is a measure of how much mass is in the 'tails' of a distribution and therefore how much extreme values of Y affect the variance. The greater the kurtosis, the more likely are outliers. For a normal distribution the kurtosis is 3; distributions with kurtosis greater than 3 are called leptokurtic
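The corresponding fourth-central-moment formula (notation assumed):

```latex
\mathrm{Kurtosis} = \frac{E\big[(Y - \mu_Y)^4\big]}{\sigma_Y^4}
```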
Probability distribution of continuous random variables
This is the probability that the random variable falls between two points, calculated as the area under the probability density function between those two points.
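As an integral over the density f_Y (symbol assumed):

```latex
\Pr(a \le Y \le b) = \int_{a}^{b} f_Y(y)\, dy
```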
Mean of the sum and variance of the sum of two random variables
The mean of the sum equals the sum of their means; the variance of the sum equals the sum of their variances plus twice their covariance.
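Both results in symbols (sigma notation assumed):

```latex
E(X + Y) = \mu_X + \mu_Y, \qquad \mathrm{var}(X + Y) = \sigma_X^2 + \sigma_Y^2 + 2\sigma_{XY}
```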
Standard normal distribution
The normal distribution with μ = 0 and σ^2 = 1
Variance of Bernoulli Random Variable
p(1-p)
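A one-line derivation, using G for the Bernoulli variable as on the expected-value card:

```latex
\mathrm{var}(G) = E(G^2) - [E(G)]^2 = p - p^2 = p(1 - p)
```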
Cumulative probability distribution of a discrete or continuous random variable
The probability that the random variable is less than or equal to a particular value
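In symbols (F_Y is assumed notation for the cumulative distribution function):

```latex
F_Y(y) = \Pr(Y \le y)
```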
Conditional variance
The variance of the conditional distribution of Y given X