Stat Chapter 10-1 Concepts of Estimation
Knowing that an estimator is unbiased only assures us that (A); it does not tell us (B)
(A) its expected value equals the parameter (B) how close the estimator is to the parameter
The sample median is an __(A)__ estimator, but its __(B)__ is greater than that of the sample mean (when the population is normal)
(A) unbiased (B) variance
Why is P-hat considered a consistent estimator of p?
B/c it is unbiased and the variance of P-hat is p(1 - p)/n, which grows smaller as n grows larger
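A minimal simulation sketch (assuming Python with NumPy; the value of p and the sample sizes are illustrative choices, not from these notes) showing the spread of P-hat shrinking as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.4           # assumed true population proportion (illustrative value)
reps = 10_000     # number of simulated samples per sample size

for n in (25, 100, 400, 1600):
    # Each simulated sample of size n yields one value of P-hat = X / n.
    p_hats = rng.binomial(n, p, size=reps) / n
    # The spread of P-hat shrinks roughly like sqrt(p(1-p)/n).
    print(f"n={n:5d}  mean of P-hat={p_hats.mean():.4f}  "
          f"sd of P-hat={p_hats.std():.4f}  "
          f"sqrt(p(1-p)/n)={np.sqrt(p * (1 - p) / n):.4f}")
```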
What are two unbiased estimators that we already know? and why are they unbiased?
(1) The sample mean (X-Bar) is an unbiased estimator of the population mean (Mu) - This is b/c E(X-Bar) = Mu (2) The sample proportion (P-hat) is an unbiased estimator of the population proportion (p) - This is b/c E(P-hat) = p
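A short sketch of the standard expectation argument behind both facts, assuming X_1, ..., X_n are independent draws from a population with mean Mu, and the success count X in n trials satisfies E(X) = np:

```latex
E(\bar{X}) = E\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
           = \frac{1}{n}\sum_{i=1}^{n} E(X_i)
           = \frac{1}{n}(n\mu) = \mu,
\qquad
E(\hat{p}) = E\!\left(\frac{X}{n}\right) = \frac{E(X)}{n} = \frac{np}{n} = p
```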
As a consequence, why do we use the second method of estimating a (A), the interval estimator? (B)
(A) population parameter (B) B/c of the three drawbacks to using point estimators, most importantly that point estimators don't have the capacity to reflect the effects of larger sample sizes
We can use (A) to estimate a population parameter in two ways. The two ways are (B) [10-1a]
(A) sample data (B) point estimators and interval estimators
What are the three drawbacks to using point estimators?
(1) it is virtually certain that the estimate will be wrong (2) we often need to know how close the estimator is to the parameter (3) in drawing inferences about a population, it is intuitively reasonable to expect that a large sample will produce more accurate results - BUT, point estimators don't have the capacity to reflect the effects of larger sample sizes
The __(A)__ is relatively more efficient than the __(B)__ when estimating the population mean.
(A) sample mean (B) sample median
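A rough simulation sketch (Python with NumPy; Mu, sigma, n, and the number of replications are illustrative assumptions) comparing the two estimators when the population is normal - both center on Mu, but the sample median varies more:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 50.0, 10.0, 100, 20_000   # illustrative values

# Draw 'reps' samples of size n from a normal population.
samples = rng.normal(mu, sigma, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# Both estimators are (approximately) unbiased, but the median varies more.
print("average of sample means:   ", means.mean())
print("average of sample medians: ", medians.mean())
print("variance of sample means:  ", means.var())    # close to sigma^2 / n = 1.0
print("variance of sample medians:", medians.var())  # noticeably larger
```

With these values sigma-squared/n = 1, so the larger printed variance for the sample medians is what makes the sample mean the relatively more efficient estimator here.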
What are the three desirable qualities of an estimator?
Unbiasedness, consistency, and relative efficiency
What is a consistent estimator that we already know? Why?
X-Bar is a consistent estimator of Mu b/c the variance of X-Bar is sigma-squared/n (the population variance divided by n) - This implies that as n grows larger, the variance of X-Bar grows smaller
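Assuming the observations are independent with common variance sigma-squared, the calculation behind "sigma-squared/n" is:

```latex
V(\bar{X}) = V\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
           = \frac{1}{n^2}\sum_{i=1}^{n} V(X_i)
           = \frac{1}{n^2}(n\sigma^2)
           = \frac{\sigma^2}{n}
           \;\longrightarrow\; 0 \quad \text{as } n \to \infty
```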
An unbiased estimator of a population parameter is
an estimator whose expected value (long-run average) is equal to that parameter
Basically, with a point estimator we can
compute the value of the estimator and consider that value as the estimate of the parameter
The objective of estimation is to [10-1] For example
determine the approximate value of a population parameter on the basis of a sample statistic. Ex) the sample mean is employed to estimate the population mean
In this chapter (chapter 10), we learn how to
estimate a population mean using sample data
A point estimator draws inferences about a population by
estimating the value of an unknown parameter using a single value or point
An interval estimator draws inferences about a population by
estimating the value of an unknown parameter using an interval.
The unbiased estimator basically means that
on average, the sample statistic is equal to the parameter
The interval estimator is affected by the
sample size
When presenting the statistical inference of a number of different population parameters, we select a (A). What happens if there is more than one such statistic? (B)
(A) sample statistic that is unbiased and consistent (B) we choose the one that is relatively efficient to serve as the estimator
The selection of the sample statistic to be used as an estimator, however, depends on
the characteristics of that statistic
An unbiased estimator is said to be consistent if
the difference between the estimator and the parameter grows smaller as the sample size grows larger.
Once the sample mean has been computed, its value is called
the estimate
We refer to the sample mean as
the estimator of the population mean
The measure we use to gauge closeness is
the variance (or the standard deviation)
Relative efficiency compares
two unbiased estimators of a parameter
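One standard way to make the comparison concrete (a common textbook convention, assumed here rather than taken from these notes): for two unbiased estimators of theta, the one with the smaller variance is relatively more efficient.

```latex
\text{If } E(\hat{\theta}_1) = E(\hat{\theta}_2) = \theta \text{ and } V(\hat{\theta}_1) < V(\hat{\theta}_2),
\text{ then } \hat{\theta}_1 \text{ is relatively more efficient than } \hat{\theta}_2
```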