Math 120 - Chapter 5 - Probability

3 Methods for Determining the Probability of an Event [5.1]

1) the empirical method; 2) the classical method; 3) the subjective method

2 Rules of Probability [5.1]

1) the probability of any event E, called P(E), must be greater than or equal to zero and less than or equal to 1 (i.e., 0 ≤ P(E) ≤ 1); 2) the sum of the probabilities of all outcomes must equal 1 (i.e., for a sample space S = { e₁, e₂, ..., eₙ }, ∑ P(eᵢ) = 1)
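
A minimal Python sketch (illustrative only, not part of the original card set; the loaded-coin probabilities are made up) showing how a probability model can be checked against both rules:

    # hypothetical model for a loaded coin (assumed values)
    model = {"heads": 0.6, "tails": 0.4}

    # Rule 1: every probability is between 0 and 1
    rule1 = all(0 <= p <= 1 for p in model.values())

    # Rule 2: the probabilities of all outcomes sum to 1
    rule2 = abs(sum(model.values()) - 1) < 1e-9

    print(rule1, rule2)  # True True -> a valid probability model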

Tree Diagram [5.1]

a diagram used to determine the sample space of a probability experiment; to generate one: 1) start with a node for the first choice; 2) from that node draw an arrow for each possible value of that choice; 3) if there is another choice to be represented, add a node at the end of each arrow drawn, then repeat step 2 for each of those nodes; 4) if there are no further choices to be represented, draw an outcome node containing all choices made to reach that outcome [example: the image shows the tree diagram for determining the sample space for the sexes of the kids in a family with 3 kids]
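
As a rough sketch (assuming Python and B/G labels for boy/girl, which the card does not specify), the same sample space the tree diagram produces for three kids can be enumerated directly:

    from itertools import product

    # each child is B (boy) or G (girl); three children -> 2 * 2 * 2 = 8 outcomes
    sample_space = ["".join(kids) for kids in product("BG", repeat=3)]
    print(sample_space)
    # ['BBB', 'BBG', 'BGB', 'BGG', 'GBB', 'GBG', 'GGB', 'GGG']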

Fair Die [5.1]

a die in which each possible outcome is equally likely (i.e., rolling a 2 is just as likely as rolling a 5)

Probability [5.1]

a measure of the likelihood of a random phenomenon or chance behavior occurring; the long-term proportion in which a certain outcome is observed is the probability of that outcome [example: the probability of an outcome of heads when flipping a coin is 1/2 or 50%]

Subjective Method (of Obtaining Probabilities) [5.1]

a subjective probability is based on personal judgment, i.e., a prediction made by someone based on their knowledge [example: a prediction by an economist that there is a 20% chance we will enter a recession in the next year; this isn't based on relative frequency, nor can we conduct an experiment n times to obtain the frequency]

Simulation [5.1]

a technique used to re-create a random event; simulations can be tactile (e.g., physically flipping a coin), or virtual (having a computer pretend to flip a coin); the goal of simulation is to measure how often a certain outcome is observed
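
A minimal Python sketch of a virtual simulation (the 1,000-flip count is an arbitrary choice for illustration):

    import random

    # have the computer "flip a coin" 1000 times and measure how often heads is observed
    flips = [random.choice(["heads", "tails"]) for _ in range(1000)]
    print(flips.count("heads") / len(flips))  # close to 0.5, varies from run to run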

Certainty Events [5.1]

an event is a certainty if the probability of the event is 1

Impossible Events [5.1]

an event is impossible if the probability of the event is 0

Unusual Events [5.1]

an event that has a low probability of occurring; a probability under 5% is a common cutoff, but the researcher must determine the cutoff point for which events are considered unusual

Equally Likely Outcomes [5.1]

an experiment has equally likely outcomes when each outcome has the same probability of occurring; this is required in order to use the classical method of determining probabilities [example: when a fair die is thrown, each of the six sides has the same probability of being rolled]

Event (usually denoted by capital letters, such as E) [5.1]

any collection of outcomes from a probability experiment; it consists of one outcome or more than one outcome; events with one outcome are called simple events, eᵢ [example: in a probability experiment consisting of rolling a fair die, one event could be E = "roll an even number"]

Probability Experiment [5.1]

any process with uncertain results that can be repeated; the result of a single trial is unknown ahead of time; however, the results over many trials produce regular patterns that allow accurate predictions

Law of Large Numbers [5.1]

as the number of repetitions of a probability experiment increases, the proportion with which a certain outcome is observed gets closer to the probability of that outcome [example: the image shows a graph of two experiments with 100 coin flips each, the graphs show the proportion moving closer to 50% as the number of flips increases]
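
A small Python sketch of the idea (the flip counts and seed are arbitrary choices, not from the card): the running proportion of heads settles toward 0.5 as the number of flips grows.

    import random

    random.seed(1)  # fixed seed so the run is reproducible
    heads = 0
    for n in range(1, 10001):
        heads += random.random() < 0.5      # one simulated coin flip
        if n in (10, 100, 1000, 10000):
            print(n, heads / n)             # proportion drifts toward 0.5 as n grows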

Complement Rule [5.2]

because E and Eᶜ are disjoint, P(E or Eᶜ) = P(E) + P(Eᶜ) = P(S) = 1; simplifying we get the rule: P(Eᶜ) = 1 - P(E); the image shows a Venn Diagram illustrating the rule
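
A one-line worked instance of the rule (the fair-die event is my own illustration, not from the card): if E = "roll a 6" on a fair die, then P(Eᶜ) = 1 - 1/6 = 5/6 ≈ 0.833.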

Empirical Method (of Approximating Probabilities) [5.1]

calculating probabilities based on empirical data such as the outcomes of a probability experiment, using the idea of relative frequency; the probability of an event, P(E), is approximated by the relative frequency of E = (frequency of E / number of trials of the experiment)
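
A tiny Python sketch of the relative-frequency calculation (the roll counts are hypothetical):

    # hypothetical data: 1000 rolls of a die, 172 of them showed a six
    frequency_of_E = 172
    trials = 1000
    print(frequency_of_E / trials)  # empirical P(E) = 0.172, near the classical 1/6 ≈ 0.167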

Disjoint Events vs. Independent Events [5.3]

disjoint events and independent events are different concepts; but if events are disjoint, then knowing that one of the events occurred means knowing the other event did not occur, which means the events are not independent [example: take a single die roll, with the events "roll an even number", E, and "roll an odd number", O; both P(E) and P(O) = 1/2; however, if we know a roll is going to be even, what is the probability P(O)? P(O) becomes zero, so these events are not independent]

Complement of an Event (symbol is Eᶜ) [5.2]

for an event E, the complement, Eᶜ, is defined as all outcomes in the sample space, S, which are not in E

Multiplication Rule for Independent Events [5.3]

for two independent events, E and F, the probability of E and F equals the probability of E times the probability of F: P(E and F) = P(E) * P(F); this rule can be extended to any number of independent events E₁, E₂, ..., Eₙ: P(E₁ and E₂ and ... and Eₙ) = P(E₁) * P(E₂) * ... * P(Eₙ)
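
A short worked instance (my own example, not from the card): for three independent flips of a fair coin, P(all heads) = 0.5 * 0.5 * 0.5.

    # three independent fair-coin flips, each with P(heads) = 0.5
    p_heads = 0.5
    print(p_heads * p_heads * p_heads)  # P(H and H and H) = 0.125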

General Addition Rule [5.2]

formula used when you need to combine the probabilities of two events that are not disjoint (i.e., which share at least some common outcomes); for any two events E and F, the probability of E or F equals the probability of E plus the probability of F minus the probability of E and F (i.e., P(E or F) = P(E) + P(F) - P(E and F)) [example: given the sample space of the cards in a deck, the probability P(K or D) of drawing a King, P(K), or a Diamond, P(D), would be calculated as: P(K or D) = P(K) + P(D) - P(K and D); this gives P(K or D) = 4 kings/52 + 13 diamonds/52 - 1 king of diamonds/52 = 16/52 = 4/13]
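
The King-or-Diamond example can be checked by brute-force enumeration; a minimal Python sketch (the rank/suit labels are my own choice):

    from itertools import product

    ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
    suits = ["clubs", "diamonds", "hearts", "spades"]
    deck = list(product(ranks, suits))                    # 52 equally likely cards

    favorable = [c for c in deck if c[0] == "K" or c[1] == "diamonds"]
    print(len(favorable), len(favorable) / len(deck))     # 16 cards, 16/52 = 4/13 ≈ 0.308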

Addition Rule for Disjoint Events [5.2]

if E and F are disjoint (or mutually exclusive) events, then the probability of E or F equals the probability of E plus the probability of F (i.e., P(E or F) = P(E) + P(F)); this rule can be extended for any number of events as long as they all are disjoint

Classical Method (of Computing Probabilities) [5.1]

if an experiment has n equally likely outcomes and if the number of ways that an event E can occur is m, then the probability of E, called P(E), equals (m / n) [example: when rolling 2 dice, there are 36 possible results, n, (i.e., 1+1, 1+2, ..., 1+6, 2+1, 2+2, ..., 6+6), and 6 of those results give a total of 7, m, so the probability of getting a 7 is 6/36 = 1/6 ≈ 0.167]
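
The two-dice example worked out by enumeration; a minimal Python sketch (not part of the original card):

    from itertools import product

    rolls = list(product(range(1, 7), repeat=2))    # all 36 equally likely results, n
    m = sum(1 for a, b in rolls if a + b == 7)      # ways to total 7, m
    print(m, len(rolls), m / len(rolls))            # 6 36 0.1666... (about 16.7%)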

Small Random Samples from a Large Population [5.4]

if small random samples are taken from a large population without replacement, it is reasonable to assume independence of the events; as a rule of thumb, if the sample size is less than 5% of the population size, we treat the events as independent; this is because for small samples (relative to the population), the math for the General Multiplication Rule and the Multiplication Rule for Independent Events ends up being very close in value
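
A quick numeric comparison (the population numbers are assumed purely for illustration): draw 2 people from a population of 1,000 that contains 500 women, and let the event be "both are women".

    exact  = (500 / 1000) * (499 / 999)   # General Multiplication Rule (without replacement)
    approx = (500 / 1000) * (500 / 1000)  # treating the two draws as independent
    print(exact, approx)                  # 0.2497... vs 0.25 -- nearly identical; sample is 0.2% of population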

Probability Model [5.1]

lists the possible outcomes of a probability experiment and each outcome's probability; it must satisfy the 2 rules of probability

Law of Averages is not a Real Law [5.1]

people often misinterpret the law of large numbers, calling it the "law of averages": the idea being that the more times you flip a coin and keep getting tails, the more likely (higher probability) it is that the next flip will be heads; this is not true; the next flip is independent of what happened before

Random [5.1]

random suggests an unpredictable result or outcome

Random Process [5.1]

represents a scenario where the outcome of any particular trial of an experiment is unknown, but the proportion (or relative frequency) with which a particular outcome is observed approaches a specific value as the number of trials increases [example: flipping a coin over and over; as the number of flips goes up, the proportion of heads to total flips will approach 50% (i.e., 0.5); the image shows a graph of two simulations with 100 coin flips each, the graphs are representative of a random process]

Comparing Empirical and Classical Methods [5.1]

the classical method will determine the actual probabilities; the empirical method values may be somewhat different, but as the number of repetitions/observations increases the empirical probability should approach the classical probability

Sample Space of a Probability Experiment (symbol is S) [5.1]

the collection of all possible outcomes

Conditional Probability [5.4]

the notation P(F | E) is read "the probability of event F given event E"; it is the probability that event F occurs given that the event E has occurred [example: when rolling one die, P(3 | outcome is odd) = 1/3]

At-Least Probabilities [5.3]

the phrase at least means "greater than or equal to"; probabilities using the phrase at least usually use the Complement Rule [example: compute the probability that at least one male out of 1000 males aged 24 will die during the course of the year, given that the probability a randomly selected 24-year-old male survives the year is 0.9986; what is being asked could be formulated as P(1 dies or 2 die or 3 die or ... 1000 die), but calculating all those probabilities would be time consuming; instead we can formulate it as the complement of "no one dies": 1 - P(none die); each man selected is an independent event (whether one lives or dies has no impact on the other men selected); since the probability a randomly selected man survives is 0.9986, the probability that no one dies is 0.9986^1000; taking the complement, 1 - 0.9986^1000, gives the probability that at least one man dies]
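
The arithmetic from that example as a short Python sketch:

    p_survive = 0.9986
    p_none_die = p_survive ** 1000     # all 1000 independent men survive the year
    print(1 - p_none_die)              # ≈ 0.754: probability at least one man dies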

Conditional Probability Rule [5.4]

the probability of event F occurring given the occurrence of event E is found by dividing the probability of (E and F) by the probability of E, which equals the number of outcomes in (E and F) divided by the number of outcomes in E: P(F | E) = P(E and F)/P(E) = N(E and F)/N(E)
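
The earlier die example, P(3 | outcome is odd), worked out by counting; a minimal Python sketch:

    die = [1, 2, 3, 4, 5, 6]
    E = [x for x in die if x % 2 == 1]      # event E: outcome is odd -> {1, 3, 5}
    E_and_F = [x for x in E if x == 3]      # event (E and F): odd and equal to 3 -> {3}
    print(len(E_and_F) / len(E))            # P(F | E) = N(E and F)/N(E) = 1/3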

General Multiplication Rule [5.4]

the probability that two events E and F both occur is the probability of E times the probability of F given the occurrence of E: P(E and F) = P(E) * P(F | E)
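
A short worked instance (my own example, not from the card): drawing two cards from a deck without replacement and getting two aces.

    p_first_ace = 4 / 52             # P(E): first card is an ace
    p_second_given_first = 3 / 51    # P(F | E): second is an ace, one ace already removed
    print(p_first_ace * p_second_given_first)   # P(E and F) = 1/221 ≈ 0.0045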

Disjoint Events (aka Mutually Exclusive Events) [5.2]

two events are disjoint (or mutually exclusive) if they have no outcomes in common; in a Venn diagram in which the rectangle represents the sample space and two circles represent events E and F, if E and F are disjoint then the circles will not overlap or touch

Non-Disjoint Events [5.2]

two events are not disjoint (not mutually exclusive) if they share one or more outcomes; in a Venn diagram in which the rectangle represents the sample space and two circles represent events E and F, if the circles touch at a point or overlap, then E and F are not disjoint events

Independence Expressed Using Conditional Probabilities [5.4]

two events, E and F are independent if P(E | F) = P(E), or, equivalently, if P(F | E) = P(F)

Independent vs Dependent Events [5.3]

two events, E and F, are independent if the occurrence of event E in a probability experiment does not affect the probability of event F; they are dependent if the occurrence of E affects the probability of event F [example 1: if the experiment is flipping a coin, does seeing a heads on the first flip affect the probability of seeing a tails on the second flip? no, so these are independent] [example 2: are the events "earned a bachelor's degree" and "earn more than $100,000" independent? no, they are dependent because earning a bachelor's degree affects the likelihood that an individual will earn over $100,000]

What is Probability [5.0]

we can think of probability of an outcome as the likelihood of observing that outcome; if something has a high likelihood of happening it also has a high probability (close to 1, or 100%); if something has a low likelihood of happening it also has a low probability (close to 0, or 0%)

Benford's Law [5.2]

when examining a set of data consisting of numbers, the probability that the first (most significant) digit is a particular digit is defined in the table shown in the attached image
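
The attached table is not reproduced here, but Benford's law is standardly stated as P(first digit = d) = log₁₀(1 + 1/d); a minimal Python sketch that generates the table's values under that assumption:

    from math import log10

    # P(first digit = d) = log10(1 + 1/d), for d = 1..9
    for d in range(1, 10):
        print(d, round(log10(1 + 1 / d), 3))
    # 1 0.301  2 0.176  3 0.125  4 0.097  5 0.079  6 0.067  7 0.058  8 0.051  9 0.046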

