Chapter 5


Relative Frequency Definition

Probabilities are based on empirical data.

Subjective Definition

Probabilities are based on judgment and experience.

Probabilities may be defined from one of three perspectives:

1. Classical Definition 2. Relative Frequency Definition 3. Subjective Definition

Goodness of Fit

A better approach than simply visually examining a histogram and summary statistics is to analytically fit the data to the best type of probability distribution. Three statistics measure goodness of fit: chi-square (needs at least 50 data points), Kolmogorov-Smirnov (works well for small samples), and Anderson-Darling (puts more weight on the differences between the tails of the distributions).
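A minimal sketch (not from the text) of how the Kolmogorov-Smirnov and Anderson-Darling tests could be run in Python with scipy; the sample data and the normal distribution being tested are illustrative assumptions:
import numpy as np
from scipy import stats
data = np.random.normal(loc=10, scale=2, size=200)   # hypothetical sample data
mu, sigma = data.mean(), data.std(ddof=1)             # fit a normal distribution by moments
ks_stat, ks_p = stats.kstest(data, "norm", args=(mu, sigma))  # Kolmogorov-Smirnov test
ad_result = stats.anderson(data, dist="norm")                 # Anderson-Darling test
# A chi-square test would compare binned observed counts to expected counts via stats.chisquare.
print(ks_stat, ks_p, ad_result.statistic)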

Continuous Random Variable

A continuous random variable has outcomes over one or more continuous intervals of real numbers. Examples of continuous random variables: the weekly change in the DJIA, the daily temperature, and the time between machine failures.

Cumulative Distribution Function

A cumulative distribution function, F(x), specifies the probability that the random variable X assumes a value less than or equal to a specified value, x; that is, F(x) = P(X ≤ x)
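A quick illustration (not from the card): for a single fair die, F(4) = P(X ≤ 4) = 4/6 ≈ 0.67.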

Discrete Random Variable

A discrete random variable is one for which the number of possible outcomes can be counted. Examples: the outcomes of dice rolls, whether a customer likes or dislikes a product, and the number of hits on a Web site link today.

Probability Density Function

A probability density function is a mathematical function that characterizes a continuous random variable.

Random Sampling from Probability Distributions

A random number is one that is uniformly distributed between 0 and 1. Excel function: =RAND()
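An equivalent sketch in Python (not part of the text): the standard library's random() plays the same role as =RAND().
import random
u = random.random()  # uniformly distributed on [0, 1), analogous to Excel's =RAND()
print(u)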

Random Variables

A random variable is a numerical description of the outcome of an experiment.

Standard Normal Distribution

A standard normal distribution is a normal distribution with a mean of 0 and a standard deviation of 1. A standard normal random variable is denoted by Z. The scale along the z-axis represents the number of standard deviations from the mean of zero. The Excel function =NORM.S.DIST(z, TRUE) finds cumulative probabilities for the standard normal distribution.
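A comparable calculation in Python (a sketch, not from the text), using scipy's standard normal:
from scipy.stats import norm
p = norm.cdf(1.96)  # P(Z <= 1.96) for a standard normal, roughly 0.975
print(p)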

Sampling from Common Probability Distributions

A value randomly generated from a specified probability distribution is called a random variate. Example: a uniform random variate between a and b can be generated as a + (b - a)*RAND().
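A minimal Python sketch of the same idea (the bounds a and b are illustrative assumptions):
import random
a, b = 10, 20                        # assumed minimum and maximum
x = a + (b - a) * random.random()    # uniform random variate on [a, b)
print(x)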

Discrete Uniform Distribution

A variation of the uniform distribution is one for which the random variable is restricted to integer values between a and b (also integers); this is called a discrete uniform distribution. Example: the roll of a single die, where each of the numbers 1 through 6 has a 1/6 probability of occurrence.
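A one-line simulation sketch in Python (not from the card):
import random
roll = random.randint(1, 6)  # discrete uniform on the integers 1 through 6
print(roll)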

Probabilities Associated with Events

An event is a collection of one or more outcomes from a sample space.
Rule 1. The probability of any event is the sum of the probabilities of the outcomes that comprise that event.
Complement of an event: if A is any event, the complement of A, denoted Ac, consists of all outcomes in the sample space not in A.
Rule 2. The probability of the complement of any event A is P(Ac) = 1 - P(A).
Union of events: the union of two events contains all outcomes that belong to either of the two events. If A and B are two events, the probability that some outcome in either A or B (that is, the union of A and B) occurs is denoted as P(A or B). Two events are mutually exclusive if they have no outcomes in common.
Rule 3. If events A and B are mutually exclusive, then P(A or B) = P(A) + P(B).
Non-mutually exclusive events: the notation (A and B) represents the intersection of events A and B, that is, all outcomes belonging to both A and B.
Rule 4. If two events A and B are not mutually exclusive, then P(A or B) = P(A) + P(B) - P(A and B).
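A quick numeric illustration (numbers are hypothetical): if P(A) = 0.5, P(B) = 0.4, and P(A and B) = 0.2, then Rule 4 gives P(A or B) = 0.5 + 0.4 - 0.2 = 0.7.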

Experiment

An experiment is the process that results in an outcome.

Conditional Probability

Conditional probability is the probability of occurrence of one event A, given that another event B is known to be true or has already occurred.
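The standard computing formula (not stated on this card) is P(A | B) = P(A and B) / P(B), provided P(B) > 0.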

Probability Mass Function

For a discrete random variable X, the probability distribution of the discrete outcomes is called a probability mass function and is denoted by a mathematical function, f(x). The symbol xi represents the ith value of the random variable X, and f(xi) is the probability that X takes the value xi. Properties: the probability of each outcome must be between 0 and 1, and the sum of all probabilities must add to 1.

Evaluating Capital Budgeting Projects

In finance, one way of evaluating capital budgeting projects is to compute a profitability index: PI = PV / I, where PV is the present value of future cash flows and I is the initial investment.
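A worked example with hypothetical numbers: if PV = $120,000 and I = $100,000, then PI = 120,000 / 100,000 = 1.2, meaning the project returns $1.20 of present value per dollar invested; projects with PI greater than 1 are attractive.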

Binomial Distribution

Models n independent replications of a Bernoulli experiment, each with a probability p of success. X represents the number of successes in these n experiments. Probability mass function: f(x) = [n! / (x!(n - x)!)] * p^x * (1 - p)^(n - x), for x = 0, 1, ..., n.
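A sketch of evaluating this pmf in Python (the parameter values are illustrative):
from scipy.stats import binom
p = binom.pmf(3, n=10, p=0.2)  # P(X = 3) with n = 10 trials and success probability p = 0.2
print(p)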

Poisson Distribution

Models the number of occurrences in some unit of measure (often time or distance). There is no limit on the number of occurrences. The average number of occurrences per unit is a constant denoted as λ. Probability mass function: f(x) = (λ^x * e^(-λ)) / x!, for x = 0, 1, 2, ...
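A similar sketch for the Poisson pmf in Python (the rate value is illustrative):
from scipy.stats import poisson
p = poisson.pmf(2, mu=4)  # P(X = 2) when the mean number of occurrences per unit is 4
print(p)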

Classical Definition

Probabilities can be deduced from theoretical arguments.

Probability

Probability is the likelihood that an outcome occurs. Probabilities are expressed as values between 0 and 1.

Expected Value of a Discrete Random Variable

The expected value of a random variable corresponds to the notion of the mean, or average, for a sample. For a discrete random variable X, the expected value, denoted E[X], is the weighted average of all possible outcomes, where the weights are the probabilities: E[X] = Σ xi f(xi), summed over all values xi of X.
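For example, for a single fair die, E[X] = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 3.5.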

Outcome

The outcome of an experiment is a result that we observe.

Probability Rules and Formulas

The probability associated with any outcome must be between 0 and 1. The sum of the probabilities over all possible outcomes must be equal to 1.

Marginal Probability

The probability of an event, irrespective of the outcome of the other joint event, is called a marginal probability. Rule 5. If event A is comprised of the outcomes {A1, A2, ..., An} and event B is comprised of the outcomes {B1, B2, ..., Bn}, then P(Ai) = P(Ai and B1) + P(Ai and B2) + ... + P(Ai and Bn)

Joint Probability

The probability of the intersection of two events is called a joint probability.

Sample Space

The sample space is the collection of all possible outcomes of an experiment.

Uniform Distribution

The uniform distribution characterizes a continuous random variable for which all outcomes between a minimum (a) and a maximum (b) are equally likely.
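Its density function (a standard result, not shown on the card) is f(x) = 1/(b - a) for a ≤ x ≤ b, and 0 otherwise.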

Other Useful Distributions

Triangular distribution, lognormal distribution, and beta distribution.

Bernoulli Distribution

Two possible outcomes, "success" and "failure," each with a constant probability of occurrence; p is the probability of a success and 1 - p is the probability of a failure. Typically, x = 1 represents "success" and x = 0 represents "failure." Probability mass function: f(1) = p and f(0) = 1 - p.

Data Modeling and Distribution Fitting

Using sample data may limit our ability to predict uncertain events because potential values outside the range of the sample data are not included. A better approach is to identify the underlying probability distribution from which the sample data come by "fitting" a theoretical distribution to the data and verifying the goodness of fit statistically. Examine a histogram for clues about the distribution's shape, and look at summary statistics such as the mean, median, standard deviation, coefficient of variation, and skewness.

Empirical Probability Distribution

We can calculate the relative frequencies from a sample of empirical data to develop a probability distribution. Because this is based on sample data, we usually call this an empirical probability distribution. An empirical probability distribution is an approximation of the probability distribution of the associated random variable, whereas the probability distribution of a random variable, such as one derived from counting arguments, is a theoretical model of the random variable.

Subjective Probability Distribution

We could simply specify a probability distribution using subjective values and expert judgment. This is often done in creating decision models for phenomena for which we have no historical data.

Normal Distribution

f(x) is a bell-shaped curve, characterized by two parameters: μ (the mean) and σ (the standard deviation). Properties: symmetric; mean = median = mode; the range of X is unbounded; the empirical rules apply.

Probability Distribution

A probability distribution is a characterization of the possible values that a random variable may assume, along with the probability of assuming these values. We may develop a probability distribution using any one of the three perspectives of probability: classical, relative frequency, and subjective.

