Probability + Statistics
memoryless property
if you know the last m outcomes were failures, it doesn't change the probability that the next n are successes: P(x > m+n | x > m) = P(x > n)
poisson distribution
limiting case of the binomial distribution with many trials n and small p; rate parameter = n*p P(x)= e^(-np)*(np)^x/x!, x=0,1,2,...; 0,else mx(t)=e^(np(e^t-1))
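A quick numeric check of the binomial-to-Poisson limit (pure Python; the values of n and p are chosen arbitrarily for illustration):

import math

n, p = 1000, 0.003          # many trials, small success probability
lam = n * p                 # rate parameter = n*p
for x in range(5):
    binom = math.comb(n, x) * p**x * (1 - p)**(n - x)
    poisson = math.exp(-lam) * lam**x / math.factorial(x)
    print(x, round(binom, 6), round(poisson, 6))   # the two columns nearly match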
binomial distribution
n independent, identical bernoulli trials X = number of successes P(x)= (n x)*p^x*(1-p)^(n-x), x=0,1,2,...,n; 0,else
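A minimal sketch of the pmf exactly as written above (pure Python; n=10, p=0.3 are arbitrary):

import math

def binom_pmf(x, n, p):
    # P(X = x) = (n choose x) * p^x * (1-p)^(n-x)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

print(sum(binom_pmf(x, 10, 0.3) for x in range(11)))   # pmf sums to 1.0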
moment generating function
mx(t) = E(e^(tx)) d^k/dt^k mx(t) evaluated at t=0 = E(x^k)
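One way to see the derivative property, assuming sympy is available; the exponential mgf l/(l-t) from the card below is used as the example:

import sympy as sp

t, l = sp.symbols('t l', positive=True)
m = l / (l - t)                        # mgf of the exponential distribution
E1 = sp.diff(m, t, 1).subs(t, 0)       # first derivative at 0 -> E(x) = 1/l
E2 = sp.diff(m, t, 2).subs(t, 0)       # second derivative at 0 -> E(x^2) = 2/l^2
print(E1, sp.simplify(E2 - E1**2))     # 1/l and variance 1/l^2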
factorial and combinatorial
Does order matter? Are repeats allowed? (choosing k from n)
order matters, repeats allowed: n^k
order matters, no repeats: n!/(n-k)!
order doesn't matter, repeats allowed: (n+k-1 k)
order doesn't matter, no repeats: (n k)
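All four counts can be checked by brute force with itertools (small n and k chosen arbitrarily):

from itertools import product, permutations, combinations, combinations_with_replacement
from math import comb, perm

n, k = 5, 3
print(len(list(product(range(n), repeat=k))), n**k)                           # order matters, repeats
print(len(list(permutations(range(n), k))), perm(n, k))                       # order matters, no repeats
print(len(list(combinations_with_replacement(range(n), k))), comb(n+k-1, k))  # no order, repeats
print(len(list(combinations(range(n), k))), comb(n, k))                       # no order, no repeats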
variance of x
Var(x) = E(x^2) - (E(x))^2 = (standard deviation)^2
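A plain-Python check of the formula for a fair six-sided die:

vals = [1, 2, 3, 4, 5, 6]                 # fair die, p(x) = 1/6 for each face
ex  = sum(x / 6 for x in vals)            # E(x)   = 3.5
ex2 = sum(x * x / 6 for x in vals)        # E(x^2) = 91/6
print(ex2 - ex**2)                        # variance = 35/12 ≈ 2.9167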
independent
If A and B are independent then P(A|B) = P(A) and P(AB) = P(A)*P(B)
mutually exclusive
If A and B are mutually exclusive they cannot occur at the same time: AB = empty set, so P(AB) = 0
Bayes rule
If the sample space is partitioned and you want to calculate the probability of A given that B occurred: P(A|B) = P(B|A)*P(A) / [P(B|A)*P(A) + P(B|Ac)*P(Ac)]
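A small worked example with made-up numbers (a hypothetical test with an assumed prior, sensitivity, and false-positive rate):

p_a    = 0.01       # hypothetical P(A): prior probability of the condition
p_b_a  = 0.95       # hypothetical P(B|A): positive test given the condition
p_b_ac = 0.05       # hypothetical P(B|Ac): positive test given no condition

posterior = (p_b_a * p_a) / (p_b_a * p_a + p_b_ac * (1 - p_a))
print(posterior)    # P(A|B) ≈ 0.161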
generalized counting principle
If sets E1,...,Ek have n1,...,nk elements, there are n1*...*nk ways to pick one element from each If set E has n elements and set F has m elements, there are n*m ways to choose an element of E then an element of F
conditional probability
Reduce the sample space by knowing B has occurred P(A|B) = P(AB)/P(B) P(AB) = P(A|B)*P(B)
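A brute-force two-dice example (counting outcomes directly; the events chosen are arbitrary):

from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all 36 rolls of two dice
B  = [o for o in outcomes if sum(o) >= 10]        # B: the sum is at least 10
AB = [o for o in B if o[0] == 6]                  # AB: also the first die shows 6
print(len(AB) / len(B))                           # P(A|B) = P(AB)/P(B) = 3/6 = 0.5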
bernoulli distribution
2 possible outcomes P(x)= 1-p,x=0; p,x=1; 0,else
subset
A is a subset of B if whenever A occurs B also occurs
laws of sets
commutative: AB=BA, AuB=BuA; associative: A(BC)=(AB)C, Au(BuC)=(AuB)uC; distributive: (AB)uC=(AuC)(BuC), (AuB)C=(AC)u(BC); De Morgan: (AB)c=Ac u Bc, (AuB)c=Ac Bc
complement
An event is a complement of A if it occurs whenever A does not, Ac
difference
An event is the difference of A and B if it occurs when A does and B does not, A-B = ABc
intersection
An event is the intersection of A and B if it occurs exactly when both occur, AnB=AB
union
An event is the union of A and B if it occurs whenever at least one occurs, AuB
When some objects are the same and order matters
Number of distinguishable permutations of n objects of k different types, where n1 are alike of one type, n2 of another, ...: n! / (n1!*n2!*...*nk!)
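Brute-force check on a small multiset of letters (the word is arbitrary):

from itertools import permutations
from math import factorial

word = "AABBB"                                    # n = 5 objects: 2 alike, 3 alike
distinct = len(set(permutations(word)))           # count distinguishable orderings
print(distinct, factorial(5) // (factorial(2) * factorial(3)))   # both give 10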
Chebyshev's inequality
P( abs(x-mu) >= t) <= (sigma/t)^2 = Var(x)/t^2
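An empirical check by simulation (standard library only; the exponential sample is an arbitrary choice):

import random, statistics

xs = [random.expovariate(1.0) for _ in range(100_000)]
mu, sigma = statistics.fmean(xs), statistics.pstdev(xs)
t = 2 * sigma
frac = sum(abs(x - mu) >= t for x in xs) / len(xs)
print(frac, (sigma / t) ** 2)   # observed tail mass (~0.05) is within the bound 0.25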
axioms
P(A)>=0 P(S)=1 If A1,...,An are mutually exclusive events, P(UAi)=sum P(Ai) : probability of union is sum of individual probabilities
theorems
For mutually exclusive Ai, P(UAi)=sum P(Ai) P(Ac)=1-P(A) if A is a subset of B, P(B-A)=P(BAc)=P(B)-P(A) P(AuB)=P(A)+P(B)-P(AB) P(A)=P(AB)+P(ABc)
interarrival times
P(x1>t)=e^(-lt) P(x1<=t)=1-e^(-lt)
Markov's inequality
For nonnegative x and t > 0: P(x>=t) <= E(x)/t
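Same kind of simulation check as for Chebyshev (nonnegative exponential sample, arbitrary choice):

import random, statistics

xs = [random.expovariate(1.0) for _ in range(100_000)]   # nonnegative values
t = 3.0
frac = sum(x >= t for x in xs) / len(xs)
print(frac, statistics.fmean(xs) / t)   # ~0.05 observed vs. bound E(x)/t ≈ 0.33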
cumulative distribution function
F(x) = P(X <= x), the probability that X is less than or equal to x nondecreasing, with limit 0 at -infinity and 1 at +infinity
probability distribution function
gives the probability of each possible value of X the probabilities sum to 1
partitions
mutually exclusive subsets whose union fills the sample space P(B)= sum P(B Ai) : if you know the intersections P(B)= sum P(B|Ai)*P(Ai) : if you know the conditional probabilities and the partition
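A small numeric example of the law of total probability with made-up partition probabilities:

# hypothetical partition A1, A2, A3 of the sample space
p_part    = [0.5, 0.3, 0.2]        # P(A1), P(A2), P(A3) - sum to 1
p_b_given = [0.1, 0.4, 0.8]        # P(B|A1), P(B|A2), P(B|A3)

p_b = sum(pb * pa for pb, pa in zip(p_b_given, p_part))
print(p_b)                          # P(B) = 0.05 + 0.12 + 0.16 = 0.33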
expected value of x^n
E(x^n) = sum (x^n)*p(x)
exponential distribution
time between/until successes P(x)= l*e^(-lx), x>=0; 0,else mx(t)= l/(l-t), t<l E(x)=1/l Var(x)= 1/l^2
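A simulation check of the mean and variance (standard library; l = 2 is an arbitrary rate):

import random, statistics

l = 2.0
xs = [random.expovariate(l) for _ in range(200_000)]
print(statistics.fmean(xs), 1 / l)            # sample mean ≈ 1/l = 0.5
print(statistics.pvariance(xs), 1 / l**2)     # sample variance ≈ 1/l^2 = 0.25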
geometric distribution
trial at which 1st success occurs P(x)= p (1-p)^(x-1),x=1,2,...; 0,else mx(t)=pe^t/(1-e^t(1-p)) E(x)=1/p Var(x)=(1-p)/p^2 P(x>a)=(1-p)^a memoryless
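The memoryless identity P(x > a+b | x > a) = P(x > b) follows directly from the tail formula P(x > a) = (1-p)^a; a quick check with arbitrary values:

p, a, b = 0.3, 4, 6                 # arbitrary values for illustration

def tail(k):
    # P(X > k) = (1-p)^k for the geometric distribution
    return (1 - p) ** k

print(tail(a + b) / tail(a))        # conditional tail P(X > a+b | X > a)
print(tail(b))                      # equals P(X > b): both are (1-p)^6 ≈ 0.1176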