Exam 1

Linear function of random variables

1. Let random variable X have mean μ_X and variance σ_X^2
2. Let a and b be any constants
3. Let Y = a + bX
Mean: E(a + bX) = a + b·μ_X
Variance: Var(a + bX) = b^2·σ_X^2
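A quick simulation check of these identities (a minimal Python sketch; the normal population and the values of a, b, μ_X, and σ_X are illustrative assumptions, not from the card):

```python
import numpy as np

# Illustrative assumption: X ~ Normal(mu_x, sigma_x); Y = a + b*X
rng = np.random.default_rng(0)
a, b, mu_x, sigma_x = 3.0, -2.0, 5.0, 2.0

x = rng.normal(mu_x, sigma_x, size=1_000_000)
y = a + b * x

print(y.mean(), a + b * mu_x)       # sample mean of Y vs. a + b*mu_X
print(y.var(), b**2 * sigma_x**2)   # sample variance of Y vs. b^2 * sigma_X^2
```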

Density vs Cumulative distribution

1. The total area under the density curve f(x) is 1
2. The area under the density f(x) to the left of a is the cumulative distribution function F(a), where a is any value that the random variable can take.
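A numeric illustration of both facts (a sketch assuming a standard normal density, chosen only for illustration):

```python
import numpy as np
from scipy import integrate, stats

# Area under the standard normal density f(x) to the left of a equals F(a).
a = 1.5
area, _err = integrate.quad(stats.norm.pdf, -np.inf, a)
print(area, stats.norm.cdf(a))   # both ~0.9332

# Total area under the density is 1.
total, _err = integrate.quad(stats.norm.pdf, -np.inf, np.inf)
print(total)                     # ~1.0
```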

mutually exclusive events

A and B are mutually exclusive events if they have no outcomes in common

Standardized random variable

A random variable transformed by subtracting off its expected value and dividing the result by its standard deviation; the new random variable has mean zero and standard deviation one
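A minimal sketch of the transformation (the exponential population is an arbitrary illustrative choice):

```python
import numpy as np

# Draws from a skewed, non-normal distribution (illustrative choice).
rng = np.random.default_rng(1)
x = rng.exponential(scale=3.0, size=100_000)

# Subtract the (sample) expected value, divide by the (sample) standard deviation.
z = (x - x.mean()) / x.std()
print(z.mean(), z.std())   # ~0 and ~1, as the definition promises
```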

Classical Probability

Assumes all outcomes in the sample space are equally likely to occur: P(A) = N_A / N, the number of outcomes in A divided by the total number of outcomes.
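For example (a toy sketch; the die is an illustrative assumption), counting equally likely outcomes directly:

```python
# Classical probability: P(A) = N_A / N with equally likely outcomes.
# Illustrative event A: rolling an even number on one fair six-sided die.
sample_space = [1, 2, 3, 4, 5, 6]
event_a = [s for s in sample_space if s % 2 == 0]
print(len(event_a) / len(sample_space))   # 3/6 = 0.5
```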

Cumulative probability function

Denoted F(x0) shows the probability that X is less than or equal to (i.e., no greater than) some value x0.

Linear function of random variables

Let W = a + bX, where X has mean μ_X and variance σ_X^2. The standard deviation of W is σ_W = |b|σ_X.

inferential statistics

Making statements about a population by examining sample results

Collectively Exhaustive Events

One of the events must occur; the set of events covers the entire sample space.

How a discrete random variable is characterized

1. Probability distribution functions
2. Cumulative distribution functions
3. Moments (mean, standard deviation, correlation)

Discrete random variable

Takes on a countable number of values

cumulative distribution function

The cumulative distribution function F(a) is the area under the probability density function f(x) from the minimum x value up to a: F(a) = ∫_{x_m}^{a} f(x) dx, where x_m is the minimum value of the random variable X (x_m = −∞ if X has no lower bound).

Cumulative distribution function

The cumulative distribution function, F(x), for a continuous random variable X is similar to the discrete case: F(x) = P(X ≤ x). This function is non-decreasing and takes on values between 0 and 1.

Market return, Z

The market return, Z, for the portfolio with fraction f invested in X and 1-f in Y is given by: Z= fX+(1-f)Y

Combinations

The number of combinations of x objects chosen from n is the number of possible selections that can be made . n!/[x!(n-x)!]
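In Python this count is available directly (the values of n and x below are illustrative):

```python
import math

# math.comb computes n! / (x! * (n - x)!) without building the factorials by hand.
n, x = 10, 3
print(math.comb(n, x))                                                   # 120
print(math.factorial(n) // (math.factorial(x) * math.factorial(n - x)))  # also 120
```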

Sample mean

The sample mean is unbiased: E(X̄) = μ

marginal probability

the values in the margins of a joint probability table that provide the probabilities of each event separately

Bernoulli Distribution

1. Consider only two outcomes: "success" or "failure"
2. Let P denote the probability of success
3. Let 1 − P be the probability of failure
4. Define random variable X: X = 1 if success, X = 0 if failure
P(0) = 1 − P, P(1) = P
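A short simulation of this distribution (a sketch; P = 0.3 is an illustrative assumption, and each Bernoulli draw is generated as a binomial with a single trial):

```python
import numpy as np

# Bernoulli(P): X = 1 with probability P, X = 0 with probability 1 - P.
rng = np.random.default_rng(2)
P = 0.3
x = rng.binomial(n=1, p=P, size=100_000)   # Bernoulli = Binomial with one trial

print(x.mean())   # ~P, the mean of a Bernoulli random variable
print(x.var())    # ~P * (1 - P), its variance
```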

A continuous random variable is a variable that can assume any value in an interval

1. Driving distance
2. Time required to complete a task
3. Height, in inches

Why Sample?

1. Less time consuming than a census
2. Less costly to administer than a census
3. It is possible to obtain statistical results of sufficiently high precision based on samples

Binomial Distribution key assumptions

1. Two mutually exclusive and collectively exhaustive categories. These are generally called "success" and "failure", with the probability of a success equal to P.
2. Constant probability for each observation (e.g., the probability of getting an answer correct is the same across all questions).
3. Observations are independent (i.e., the outcome of one observation does not affect the outcome of the other).

Bivariate Probabilities

1. Two distinct sets of events, A1, ..., Ak and B1, ..., Bj
2. Within their sets, the events are mutually exclusive (no overlap) and exhaustive (cover the entire sample space)
Special two-by-two case: events A1 and A2 are mutually exclusive and exhaustive, and likewise for B1 and B2.

A population vs a sample

A Population is the set of all items or individuals of interest vs. A Sample is a subset of the population

joint probability distribution

A joint probability distribution is used to express the probability that simultaneously X takes the specific value x and Y takes the value y, as a function of x and y

Standard Error of the Mean

A measure of the variability in the mean from sample to sample is given by the Standard Error of the Mean: σ_X̄ = σ / √n

Random experiment

A process leading to an uncertain outcome

Event (E)

Any subset of outcomes from the sample space

Correlation between X and Y

ρ = Cov(X, Y) / (σ_X σ_Y), and −1 ≤ ρ ≤ 1
- ρ = 0: no (linear) relationship between X and Y
- ρ > 0: positive (linear) relationship between X and Y; when X is high (low), then Y is likely to be high (low)
- ρ = +1: perfect positive (linear) dependency
- ρ < 0: negative (linear) relationship between X and Y; when X is high (low), then Y is likely to be low (high)
- ρ = −1: perfect negative (linear) dependency
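A numeric sketch of the definition (the linear-plus-noise data below are an illustrative assumption):

```python
import numpy as np

# Simulated X, Y with a positive linear relationship, so rho should be near +1.
rng = np.random.default_rng(3)
x = rng.normal(size=10_000)
y = 2.0 * x + rng.normal(size=10_000)

cov_xy = np.cov(x, y)[0, 1]                      # sample covariance
rho = cov_xy / (x.std(ddof=1) * y.std(ddof=1))   # Cov(X, Y) / (sigma_X * sigma_Y)
print(rho, np.corrcoef(x, y)[0, 1])              # both ~0.89, always within [-1, 1]
```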

Continuous random variable

Can take on any numerical value in an interval. Possible values are measured on a continuum

Central Limit Theorem

Even if the population is not normal, sample means from the population will be approximately normal as long as the sample size is large enough.

Statistical independence

Events A and B are independent when the probability of one event is not affected by the other event; equivalently, P(A ∩ B) = P(A)P(B).

Intersection of event

If A and B are two events in a sample space S, then the intersection, A ∩ B, is the set of all outcomes in S that belong to both A and B

Union of events

If A and B are two events in a sample space S, then the union, A U B, is the set of all outcomes in S that belong to either A or B

Probability Distributions for Discrete Random Variables

Let X be a discrete random variable and x be one of its possible values.
- The probability that random variable X takes specific value x is denoted P(X = x)
- The probability distribution function of a random variable is a representation of the probabilities for all the possible outcomes
- 0 ≤ P(X = x) ≤ 1 for all x
- Individual probabilities must sum to one
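A minimal sketch of these requirements (the two-coin-flip distribution is an illustrative assumption):

```python
# A discrete probability distribution as a mapping from each value x to P(X = x).
# Illustrative example: number of heads in two fair coin flips.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

assert all(0 <= p <= 1 for p in pmf.values())   # 0 <= P(X = x) <= 1 for all x
assert abs(sum(pmf.values()) - 1.0) < 1e-12     # individual probabilities sum to one
print(pmf[1])                                   # P(X = 1) = 0.5
```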

the binomial distribution

P(x) = [n! / (x!(n − x)!)] · P^x (1 − P)^(n−x) = probability of x successes in n trials, with probability of success P on each trial
x = number of 'successes' in sample (x = 0, 1, 2, ..., n)
n = sample size (number of independent trials or observations)
P = probability of "success"
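The formula translates directly to code (a sketch; the quiz numbers n = 10, P = 0.8 are illustrative assumptions):

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    # P(x successes in n independent trials, success probability p on each trial)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Illustrative check: 10 questions, 0.8 chance of answering each one correctly.
print(binomial_pmf(8, n=10, p=0.8))                      # ~0.302
print(sum(binomial_pmf(x, 10, 0.8) for x in range(11)))  # probabilities sum to 1.0
```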

The addition rule

The addition rule: the probability of the union of two events is P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

The complement of an event

The complement of an event A is the set of all outcomes in the sample space that do not belong to A. The complement is denoted Ā.

Conditional probability distribution

The conditional probability distribution of the random variable Y expresses the probability that Y takes the value y when the value x is specified for X: P(y | x) = P(x, y) / P(x).

covariance

The expected value of (X − μ_X)(Y − μ_Y) is called the covariance between X and Y: Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y.

independent

The jointly distributed random variables X and Y are said to be independent if and only if their joint probability distribution is the product of their marginal probability functions: P(x, y) = P(x)P(y) for all possible pairs of values x and y.

permutations

The number of possible arrangements when x objects are to be selected from a total of n objects and arranged in order (with (n - x) objects left over) n!/(n-x)!

Bayes' Theorem

The probability of an event occurring based upon other event probabilities. Suppose that we know Pr(B1 | A1), Pr(B1 | A2), and Pr(A1). Using these, how do we compute Pr(A1 | B1)? Bayes' theorem gives Pr(A1 | B1) = Pr(B1 | A1) Pr(A1) / [Pr(B1 | A1) Pr(A1) + Pr(B1 | A2) Pr(A2)].
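Worked through with made-up numbers (an illustrative sketch; the probabilities are assumptions, and A1, A2 are taken to be mutually exclusive and exhaustive as in the bivariate card):

```python
# Bayes' theorem: Pr(A1 | B1) = Pr(B1 | A1) * Pr(A1) / Pr(B1),
# where Pr(B1) = Pr(B1 | A1) * Pr(A1) + Pr(B1 | A2) * Pr(A2).
p_a1 = 0.4
p_a2 = 1 - p_a1               # A1, A2 mutually exclusive and exhaustive
p_b1_given_a1 = 0.9
p_b1_given_a2 = 0.2

p_b1 = p_b1_given_a1 * p_a1 + p_b1_given_a2 * p_a2
print(p_b1_given_a1 * p_a1 / p_b1)   # Pr(A1 | B1) = 0.36 / 0.48 = 0.75
```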

The uniform distribution

The uniform distribution is a probability distribution that has equal probabilities for all equal-width intervals within the range of the random variable. The total area under the uniform probability density function is 1.0.

Properties of good estimators

Unbiased: on average, the sample statistic should equal the population parameter.
Precise: all else equal, estimators should have low variance; this minimizes estimation error.

Derived relationship between probability distribution and the cumulative probability distribution

Consider the relationship between the probability distribution and the cumulative probability distribution. Let X be a random variable with probability distribution P(x) and cumulative probability distribution F(x0). Then F(x0) = Σ P(x), summed over all x ≤ x0: the cumulative probability is the sum of the probabilities of all values no greater than x0.

Central Limit Theorem (CLT)

Let X1, X2, . . . , Xn be a set of n independent random variables having identical distributions with mean μ, variance σ^2, and X̄ as the mean of these random variables. As n becomes large, the central limit theorem states that the distribution of Z = (X̄ − μ) / (σ / √n) approaches the standard normal distribution.
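A simulation sketch of the theorem (the exponential population, n, and the repetition count are illustrative assumptions):

```python
import numpy as np

# Means of n draws from a skewed Exponential(1) population (mu = sigma = 1),
# standardized as Z = (Xbar - mu) / (sigma / sqrt(n)).
rng = np.random.default_rng(4)
mu = sigma = 1.0
n, reps = 100, 50_000

xbars = rng.exponential(mu, size=(reps, n)).mean(axis=1)
z = (xbars - mu) / (sigma / np.sqrt(n))
print(z.mean(), z.std())   # ~0 and ~1, matching the standard normal
```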

Probability density function

The probability density function f(x) of random variable X has the following properties:
1. f(x) > 0 for all values of x in its range
2. The area under the probability density function f(x), over all values of the random variable X within its range, is equal to 1.0
3. The probability that X lies between two values a and b is the area under the density function between those values: P(a < X < b) = ∫_a^b f(x) dx

The probability of a joint event

P(A1 ∩ B1) = (number of outcomes satisfying both A1 and B1) / (total number of outcomes)

Random Variable

Represents a possible numerical value from a random experiment

Z-Value for Sampling Distribution of the Mean

Z = (X̄ − μ) / (σ / √n), where X̄ = sample mean, μ = population mean, σ = population standard deviation, and n = sample size

Probability

The chance that an uncertain event will occur; always between 0 and 1: 0 ≤ P(A) ≤ 1

Sample Space (S)

the collection of all possible outcomes of a random experiment

Variance of uniform distribution

σ^2 = (b − a)^2 / 12, where a = minimum value of x and b = maximum value of x

continuous uniform distribution

f(x) = 1 / (b − a) for a ≤ x ≤ b (and 0 otherwise), where f(x) = value of the density function at any x value, a = minimum value of x, and b = maximum value of x
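A quick check of the density height, mean, and variance (a sketch; the endpoints a = 2, b = 8 are illustrative assumptions):

```python
import numpy as np

# Uniform(a, b): density 1/(b - a), mean (a + b)/2, variance (b - a)^2 / 12.
a, b = 2.0, 8.0
rng = np.random.default_rng(5)
x = rng.uniform(a, b, size=1_000_000)

print(1 / (b - a))               # constant density height ~0.1667
print(x.mean(), (a + b) / 2)     # both ~5.0
print(x.var(), (b - a)**2 / 12)  # both ~3.0
```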

Conditional Probability

The probability of one event given that another event has occurred. The conditional probability of A given that B has occurred is P(A | B) = P(A ∩ B) / P(B).

