Section 3: Probability and Discrete Random Variables


Variance Scale

- Compact distributions have values close to the mean, so the variance is smaller.
- Distributions whose values lie further from the mean have a larger variance.

Discrete Probability Distributions: Prerequisites

- Every value p(x) is a probability and must therefore lie between 0 and 1.
- The probabilities over all values of the random variable must sum to 1.
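As a quick illustration of these two prerequisites, here is a minimal Python sketch that checks whether a proposed distribution is valid, using the toss-three-coins distribution that appears later in this section:

```python
# Check that a proposed discrete probability distribution is valid:
# every probability lies in [0, 1] and the probabilities sum to 1.
from math import isclose

def is_valid_pmf(pmf: dict) -> bool:
    """pmf maps each value of the random variable to its probability."""
    probs = pmf.values()
    return all(0.0 <= p <= 1.0 for p in probs) and isclose(sum(probs), 1.0)

# Example: X = number of heads in 3 fair coin tosses.
p_heads = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
print(is_valid_pmf(p_heads))  # True
```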

Section 3 Summary: 2 Probability distributions

- Expected values, variances, standard deviations
- Laws of expected values
- Laws of variances

Variance

- Measures the spread/dispersion of a distribution.
- Let X be a discrete random variable with values xi that occur with probability p(xi), and E(X) = μ. Then V(X) = σ² = Σ (xi − μ)² p(xi).
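A minimal Python sketch of the mean and variance calculation, again using the toss-three-coins distribution (X = number of heads):

```python
# Expected value and variance of a discrete random variable,
# computed directly from its probability distribution.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}  # X = number of heads in 3 tosses

mean = sum(x * p for x, p in pmf.items())                    # E(X) = sum of x*p(x)
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # V(X) = sum of (x-mu)^2 * p(x)
std_dev = variance ** 0.5                                    # positive square root

print(mean, variance, std_dev)  # 1.5 0.75 0.866...
```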

Section 3 Summary: 4 Binomial Distributions

- Binomial experiment
- Bernoulli trial
- Binomial formula
- Mean and variance of a binomial variable
- Binomial tables

Bivariate Distributions

1. Distribution of a single variable: univariate.
2. Distribution of two variables together: bivariate.
3. So, if X and Y are discrete random variables, then we say p(x,y) = P(X=x and Y=y) is the joint probability that X=x and Y=y.

Correlation

1. If X and Y have a correlation of 0, it does not mean that they are independent!
2. If two variables are independent, then they will always have a correlation of 0.
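A minimal numeric sketch of point 1: the distribution below is chosen for illustration (it is not from the slides). X is uniform on {-1, 0, 1} and Y = X², so Y is completely determined by X, yet their covariance, and hence correlation, is 0.

```python
# X uniform on {-1, 0, 1}, Y = X^2: dependent, but covariance/correlation = 0.
xs = [-1, 0, 1]
pmf = {(x, x * x): 1/3 for x in xs}   # joint distribution of (X, Y)

ex = sum(x * p for (x, y), p in pmf.items())       # E(X) = 0
ey = sum(y * p for (x, y), p in pmf.items())       # E(Y) = 2/3
exy = sum(x * y * p for (x, y), p in pmf.items())  # E(XY) = 0

cov = exy - ex * ey
print(cov)  # 0.0 -- yet Y is completely determined by X, so they are not independent
```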

Variance, need to know

1. Var(X) = σ²(X) = σX² (different notations for the same quantity).
2. Variance is always non-negative (positive or zero): it is a sum of squared terms times probabilities.
3. The positive square root of the variance is the standard deviation.

Compound Events

An event is a collection of one or more simple (individual) outcomes. E.g. roll a die; event A = an odd number comes up; then A = {1, 3, 5}. In general, the sample space is S = {E1, E2, ..., En}, where there are n possible outcomes. The probability of an outcome Ei occurring on a single trial is written as P(Ei).

Joint Event

An event that has two or more characteristics

Positive Covariance

As X increases, Y tends to increase.

Negative Covariance

As X increases, Y tends to decrease.

Section 3 Summary: 3 Bivariate Distributions

Bivariate distributions:
- Marginal distributions
- Independence
- Sum of two random variables
- Covariance and correlation
- Linear combinations of variables

Probabilities of Combined Events (unions)

Consider two events, A and B. P(A or B) = P(A U B) = P(A occurs, or B occurs, or both occur). The addition rule: P(A or B) = P(A) + P(B) − P(A and B). On a Venn diagram this is the entire shaded area; the intersection is subtracted so that it is not double counted.
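As a worked instance of the addition rule, using the fund-manager probabilities that appear later in this section:

$$P(A_1 \cup B_1) = P(A_1) + P(B_1) - P(A_1 \cap B_1) = 0.40 + 0.17 - 0.11 = 0.46$$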

The sum of two random variables

Consider two real estate agents. X = the number of houses sold by Albert in a week Y = the number of houses sold by Beatrice in a week Bivariate distribution of X and Y shown on next slide

Covariance and Correlation

Covariance gives you the direction; correlation gives you the strength.

Binomial Random Variable Example

Define a random variable X as the number of successes in n trials. We want to be able to calculate P(X=k) for k = 0, 1, 2, ..., n.
Start with n = 1:
P(X=0) = P(failure) = 1 − p
P(X=1) = P(success) = p

Event

Each possible outcome of a variable

Centre of a Probability Distribution

Expected Value (mean)

A Bernoulli Process example

For example, ask a voter which of the two major parties they are most likely to vote for in an approaching election:
- Two possible outcomes for each trial, "ALP" or "Lib/Nat"
- Probability of "ALP" is p, probability of "Lib/Nat" is (1 − p)
- Trials are independent: the response of one voter should not affect any other voter's response

Complement Rule

Given an event A and its complement Ā, A ∪ Ā = S. We know that P(S) = 1, so P(A) + P(Ā) = 1, and therefore P(Ā) = 1 − P(A). Interpretation: either A happens, or it doesn't.

Rules for Expectations - 5. E(XY)=E(X)*E(Y) only if X and Y are independent

This only holds if the variables are independent. You must either be told that the random variables are independent, or prove that they are. NEVER ASSUME independence.

Variance Law - 4. V(X+Y)=V(X)+V(Y) if X and Y are independent

Only works if independent!

Variance Law - 5. V(X-Y)=V(X)+V(Y) if X and Y are independent

Only works if independent!

Probabilities of Combined Events (Intersections)

P(A and B) = P(A ∩ B) = P(A intersection with B) = P(A and B both occur)

Probability of A (come back to)

P(A) = P(A ∩ B) + P(A ∩ Bc): either A happens together with B, or A happens without B. On a Venn diagram, the area of the A circle is the part that overlaps B plus the part that does not.

Marginal Probabilities Cont

P(A1) = P(A1 and B1) + P(A1 and B2) = 0.11 + 0.29 = 0.40
P(A2) = P(A2 and B1) + P(A2 and B2) = 0.06 + 0.54 = 0.60
P(B1) = P(B1 and A1) + P(B1 and A2) = 0.11 + 0.06 = 0.17
P(B2) = P(B2 and A1) + P(B2 and A2) = 0.29 + 0.54 = 0.83
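A minimal Python sketch of the same marginal calculation from the joint probabilities above (results are approximate because of floating-point arithmetic):

```python
# Marginal probabilities from a joint probability table.
joint = {("A1", "B1"): 0.11, ("A1", "B2"): 0.29,
         ("A2", "B1"): 0.06, ("A2", "B2"): 0.54}

def marginal(event: str) -> float:
    """Sum the joint probabilities over every cell that involves the event."""
    return sum(p for (a, b), p in joint.items() if event in (a, b))

print(marginal("A1"), marginal("A2"))  # ~0.40 ~0.60
print(marginal("B1"), marginal("B2"))  # ~0.17 ~0.83
```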

Probabilities of Combined events (conditional)

P(B|A)=P(B occurs given that A has occurred)

Probability Facts

P(S) = 1: something has to happen.
0 ≤ P(A) ≤ 1: all probabilities lie between 0 and 1.

Probabilities of Combined Events (complements)

P(Ā)=P(Ac)= P(A complement) = P(A does not occur) (what does this look like?)

Section 3 Summary: Probability Concepts

- Probability concepts and rules
- Random variables; discrete r.v.s

Multiplication Rule

P(A and B) = P(A|B) × P(B) = P(B|A) × P(A). Read Chapter 4, Sections 1 and 2 in the text; also read Section 4 on counting rules (needed later!).

Frequentist Probabilities

Relative frequency of occurrences can be used to estimate probability.

Probability Rule 1

Rule: the probability of any event lies between 0 (it never happens) and 1 (it happens every time), i.e. 0 ≤ P(A) ≤ 1.

Probability Rule 2

Rule: the probabilities of all events in the sample space sum to 1. Something always happens.

4 Successes in 10 Trials

So, 4 successes can be distributed among 10 trials in how many distinct ways?

Binomial Random Variable

The number of successes, X, in n trials.
- For example, you ask 1500 voters which of the two major parties they plan to vote for in an approaching election
- We define a choice of "ALP" as a "success"
- Possible values of the random variable are any whole number from 0 to 1500, inclusive

Rules for Expectations - 1.E(c)=c

The expected value of a constant is the constant itself: the centre of a constant is the constant.

Variance Law - 1. V(c)=0

A constant has no variation, so its variance is always zero.

Mutually Exclusive

Two events are mutually exclusive if they cannot occur at the same time. An example is tossing a coin once, which can result in either heads or tails, but not both. P(A ∩ B) = 0: the two events do not intersect, so if A happens, B cannot happen.

Variance Calculation Formula

Use the general formula σ² = Σ (x − μ)² p(x) = Σ x² p(x) − μ² when you are given a probability distribution. If you are instead given n and p for a binomial variable, use V(X) = np(1 − p). Do not mix the two up!

Notation Cont

When X ~ Bin(n, p), P(X = k) means the probability that we have k successes and (n − k) failures, and we know how many ways those successes and failures can be arranged among the n trials.
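Putting those two pieces together gives the binomial formula listed in the Section 3 summary: the number of arrangements times the probability of each arrangement.

$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}, \qquad k = 0, 1, \ldots, n$$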

Describing Probability Distributions

Where is it centred? How spread out is it?

Binomial Random Variable Example (2)

X = number of successes in 2 trials.
P(X=0) = P(fail, fail) = P(fail) × P(fail) = (1 − p)²
P(X=1) = P(success, fail OR fail, success) = P(SF) + P(FS) = 2p(1 − p)
P(X=2) = P(success, success) = P(success) × P(success) = p²

Binomial Random Variable Example (4)

X = number of successes in 4 trials.
P(X=0) = P(f,f,f,f) = P(f) × P(f) × P(f) × P(f) = (1 − p)⁴
P(X=1) = P(sfff OR fsff OR ffsf OR fffs) = 4p(1 − p)³
P(X=2) = P(ssff OR sfsf OR sffs OR fssf OR ffss OR fsfs) = 6p²(1 − p)²
and so on...

Random Variable Notation

X and Y use UPPER CASE

Binomial Random Variable Example (10)

X is the number of successes in 10 trials.
P(X=0) = P(ffffffffff) = (1 − p)¹⁰
P(X=10) = P(ssssssssss) = p¹⁰
What about X = 4? How many ways can 4 successes be arranged among 10 trials? Think of placing 4 identical balls into 10 boxes, one ball per box:
- 10 choices of box for the first ball, then 9 choices for the second ball, 8 for the third and 7 for the last.
- But the balls are all identical, so swapping them among the chosen boxes has no effect; divide by the 4! orderings.
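A quick Python check of that counting argument (math.comb is from the standard library):

```python
# Number of ways to arrange 4 successes among 10 trials.
from math import comb, factorial

ordered = 10 * 9 * 8 * 7            # ordered choices of 4 distinct trial slots
distinct = ordered // factorial(4)  # identical successes: divide by the 4! orderings

print(distinct, comb(10, 4))  # 210 210 -- both give "10 choose 4"
```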

Notation

X ~ Bin(n, p): X has a binomial distribution with parameters (trials, probability of success).
X = the binomial random variable
n = number of trials
p = probability of success

A Bernoulli Process

A finite or infinite sequence of binary random variables; a discrete-time stochastic process that takes only two values, canonically 0 and 1. The component Bernoulli variables Xi are identically distributed and independent.

Random Variable

a function that assigns a numeric value to each simple event in a sample space

Sample space

a list of all possible outcomes; the outcomes must be collectively exhaustive and mutually exclusive, e.g. roll a die: S = {1, 2, 3, 4, 5, 6}

Correlation Coefficient

Measured on a standardised scale from −1 to +1, so the strength of the relationship can be compared across pairs of variables.

Conditional Probability

The probability that event A occurs, given information about the occurrence of another event B.

Binomial Experiment (Special Case)

A binomial experiment has the following properties:
1. A fixed number of trials, n
2. Two possible outcomes for each trial, labelled success and failure
3. Probability of success is p, probability of failure is (1 − p)
4. Trials are independent: the outcome of one trial does not affect the outcomes of any other trials

Subjective Probability

A probability derived from an individual's personal judgment about whether a specific outcome is likely to occur. Subjective probabilities contain no formal calculations and only reflect the subject's opinions and past experience.

Discrete Probability Distributions

A table or formula listing all possible values that a discrete r.v. can take, together with the associated probabilities.
- There must be a numeric value for every possible outcome.
- E.g. for our toss-three-coins example, with X = the number of heads: p(0) = 1/8, p(1) = 3/8, p(2) = 3/8, p(3) = 1/8.

Cumulative Binomial Distribution Table - appendix

Use the table only for the values of n and p it actually lists; do not add or interpolate entries.
- Note that P(X ≤ 0) = P(X = 0)
- P(X ≤ k): read directly from the tables
- P(X = k) = P(X ≤ k) − P(X ≤ k−1)
- P(X < k) = P(X ≤ k−1)
- P(X ≥ k) = 1 − P(X ≤ k−1)
- P(X > k) = 1 − P(X ≤ k)
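A minimal Python sketch of these identities, computing the cumulative probabilities directly rather than reading them from the appendix table; n = 10, p = 0.5 and k = 4 are chosen purely for illustration:

```python
# Cumulative binomial probabilities and the identities used with the tables.
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_cdf(k, n, p):          # plays the role of the appendix table: P(X <= k)
    return sum(binom_pmf(i, n, p) for i in range(k + 1))

n, p, k = 10, 0.5, 4             # illustrative values only

p_eq = binom_cdf(k, n, p) - binom_cdf(k - 1, n, p)   # P(X = k)
p_lt = binom_cdf(k - 1, n, p)                        # P(X < k)
p_ge = 1 - binom_cdf(k - 1, n, p)                    # P(X >= k)
p_gt = 1 - binom_cdf(k, n, p)                        # P(X > k)

print(round(p_eq, 4), round(binom_pmf(k, n, p), 4))     # 0.2051 0.2051
print(round(p_lt, 4), round(p_ge, 4), round(p_gt, 4))   # 0.1719 0.8281 0.623
```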

Mean Calculation with probability

The centre of this distribution is E(X) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5. The mean does not have to be an observable value (you cannot get 1.5 heads in 3 coin tosses); do not round it off to an observable value.

Complement

Complement of event A includes all events that are not part of A

Marginal Probabilities

Computed by adding across rows or down columns of the joint probability table; the results sit in the margins of the table (summing the joint probabilities over all categories of the other variable).

Conditional Probability Equation

P(A|B) = P(A ∩ B) / P(B): the conditional probability that A occurs, given that B has occurred.

Covariance

Consider the r.v.s X and Y with joint probability function p(x,y); x = x1,...,xm; y = y1,...,yn. If E(X) = µx and E(Y) = µy, then the covariance between X and Y is given by
COV(X,Y) = σxy = Σi Σj (xi − µx)(yj − µy) p(xi, yj) = E(XY) − µxµy
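A minimal Python sketch of the covariance and correlation calculation; the small joint distribution used here is made up for illustration and is not the one from the slides:

```python
# Covariance and correlation of two discrete r.v.s from their joint distribution.
from math import sqrt

joint = {(0, 0): 0.3, (0, 1): 0.2,   # illustrative joint pmf p(x, y)
         (1, 0): 0.1, (1, 1): 0.4}

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())

cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())

var_x = sum((x - mu_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mu_y) ** 2 * p for (x, y), p in joint.items())
corr = cov / (sqrt(var_x) * sqrt(var_y))   # direction from cov, strength from corr

print(round(cov, 3), round(corr, 3))  # ~0.1 ~0.408
```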

Multiplication rule for independent Events

If A and B are independent, the probability of A and B occurring together is the product of their individual probabilities: P(A and B) = P(A) × P(B).

Independence

Knowing that A has occurred does not change the probability of B occurring: P(B|A) = P(B).

Binomial distribution, Size of P

If p is large, E(X) = np sits toward the top of the range of possible values; if p is small, E(X) sits close to 0; when p = 0.5 the distribution is centred in the middle and the variance np(1 − p) is at its largest.

Variance Laws

If X and Y are r.v.s and c is a constant:
1. V(c) = 0
2. V(cX) = c²V(X)
3. V(X + c) = V(X)
4. V(X + Y) = V(X) + V(Y) if X and Y are independent
5. V(X − Y) = V(X) + V(Y) if X and Y are independent

Rules for Expectations

If X and Y are random variables, and c is any constant, then the following hold:
1. E(c) = c
2. E(cX) = cE(X)
3. E(X − Y) = E(X) − E(Y)
4. E(X + Y) = E(X) + E(Y)
5. E(XY) = E(X)·E(Y) only if X and Y are independent

Proving Independence of variables

If the random variables X and Y are independent, then P(X=x and Y=y) = P(X=x) · P(Y=y), i.e. p(x,y) = px(x) · py(y) for every pair (x, y).
In the previous example, X and Y are clearly not independent:
p(0, 0) = 1/8
px(0) · py(0) = 1/8 × 2/8 = 1/32
p(0, 0) ≠ px(0) · py(0)
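A minimal Python sketch of this independence check, testing the product condition for every pair (x, y); the joint distribution is again illustrative, not the one from the slides:

```python
# Check whether two discrete r.v.s are independent: p(x, y) must equal
# p_X(x) * p_Y(y) for EVERY pair (x, y).
from math import isclose

joint = {(0, 0): 0.3, (0, 1): 0.2,   # illustrative joint pmf
         (1, 0): 0.1, (1, 1): 0.4}

def marginal_x(x):
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def marginal_y(y):
    return sum(p for (xi, yi), p in joint.items() if yi == y)

independent = all(isclose(p, marginal_x(x) * marginal_y(y))
                  for (x, y), p in joint.items())
print(independent)  # False: e.g. p(0,0)=0.3 but p_X(0)*p_Y(0)=0.5*0.4=0.2
```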

Joint probabilities Cont

Joint probabilities:
P(B1 and A1) = 0.11
P(B1 and A2) = 0.06
P(B2 and A1) = 0.29
P(B2 and A2) = 0.54
"AND" corresponds to the intersection sign ∩.

Joint Probabilities

Let the events be defined as follows:
A1 = fund manager graduated from a top-20 MBA program
A2 = fund manager did not graduate from a top-20 MBA program
B1 = fund outperforms the market
B2 = fund does not outperform the market
(Use Venn diagrams.)

Mean and Variance of a Binomial Distribution

The mean does not have to be an observable value, and it is not a value you should expect to see on any single trial; the expected value is the infinitely-long-run average.
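For reference, the standard mean and variance of a binomial random variable (not written out on this card) are:

$$X \sim \text{Bin}(n, p): \quad E(X) = np, \qquad V(X) = np(1 - p)$$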

Variance Law - 3. V(X+c)=V(X)

adding or subtracting a constant doesn't change the spread

Collectively exhaustive

at least one of the events must occur. E.g. in a coin toss, heads and tails are collectively exhaustive, because together they encompass the entire range of possible outcomes.

Simple Event

described by a single characteristic

Rules for Expectations - 2. E(cX)=cE(X)

the expected value of a constant times a random variable is the constant times the expected value of the random variable: multiplying by c simply rescales the centre by the same factor c.

Discrete Random Variables

has a countable number of possible values
- e.g. number of heads, number of sales
- does not have to be finite
- must be able to strictly order (list) the values

Continuous Random Variables

has an infinite number of possible values; the number of elements in the sample space is infinite as a result of continuous variation, e.g. height and weight. The values cannot be listed exhaustively: a more precise instrument may produce new values between those already observed.

A Binomial Experiment

occurs when property 1 (a fixed number of trials, n) is added to a Bernoulli process.

Empirical probability

probabilities are based on observed data, not on prior knowledge, e.g. surveys.

Joint Probability

probability of an occurrence involving two or more events

A priori probability

the probability of an occurrence is based on prior knowledge of the process involved, where each outcome is equally likely.

Simple Probability

probability of occurrence of a simple event

Random experiment

process that results in a number of possible outcomes, none of which can be predicted with certainty. eg. Toss a coin (heads or tails)

Complement of an event

the set of all outcomes not belonging to that event e.g. if A = {1, 3, 5}, then the complement of A = Ac = {2, 4, 6}

Rules for Expectations - 3. E(X-Y)=E(X)-E(Y) and 4. E(X+Y)=E(X)+E(Y)

use the individual expected values of X and Y to determine the result; do not create a new probability distribution or a new random variable, just use what is already known about X and Y.

Variance Law - 2. V(cX)=c²V(X)

the variance of a constant times X equals the constant squared times the variance of X (variance is measured in squared units).

Actual Realised Values

x and y use lower case

