An estimator is a sequence of statistics $T_n = T_n(X_1, \dots, X_n)$, $n \in \mathbb{N}$, where $X_1, X_2, \dots$ are independent, identically distributed random variables.
An estimator $(T_n)$ of a parameter $\theta$ is called
- unbiased if $\mathbb{E}[T_n] = \theta$ for all $n$,
- consistent if $T_n \to \theta$ in probability as $n \to \infty$,
- strongly consistent if $T_n \to \theta$ almost surely, and
- efficient if it is unbiased and has minimal variance among all unbiased estimators.
For example, the sample mean $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is an unbiased, strongly consistent estimator of the expectation $\mu$ of the underlying distribution, and the sample variance $S_n^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X}_n)^2$ is an unbiased, strongly consistent estimator of the variance $\sigma^2$.
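As a quick numerical sketch of this consistency (the normal distribution and the sample sizes below are illustrative choices, not part of the text), the sample mean and the unbiased sample variance approach the true moments as $n$ grows:

```python
# Sketch: empirical check that the sample mean and the unbiased sample
# variance converge to the true moments. Distribution (Gaussian with
# mu = 2, sigma = 3) and sample sizes are illustrative assumptions.
import random
import statistics

random.seed(0)
mu, sigma = 2.0, 3.0  # true mean and standard deviation

for n in (100, 10_000, 1_000_000):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.mean(sample)      # (1/n) * sum(x_i)
    s2 = statistics.variance(sample)    # (1/(n-1)) * sum((x_i - xbar)^2)
    print(n, round(xbar, 3), round(s2, 3))
```

With one million samples both estimates lie very close to $\mu = 2$ and $\sigma^2 = 9$.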
There are two well-known methods for constructing an estimator: the method of moments and the maximum likelihood method.
The Cramér–Rao theorem provides a lower bound for the variance of an unbiased estimator: Let $X$ be a random variable with density $f(x; \theta)$, where $\theta \in \Theta$ is a real parameter and $\Theta \subseteq \mathbb{R}$ is supposed to be an interval. Moreover, the density should be twice differentiable with respect to $\theta$, and both $\left|\frac{\partial f}{\partial \theta}\right|$ and $\left|\frac{\partial^2 f}{\partial \theta^2}\right|$ should be bounded above uniformly in $\theta$ by an integrable function, that is,
\[
\left|\frac{\partial f}{\partial \theta}(x; \theta)\right| \le g(x), \qquad
\left|\frac{\partial^2 f}{\partial \theta^2}(x; \theta)\right| \le h(x), \qquad
\int g(x)\,dx < \infty, \quad \int h(x)\,dx < \infty.
\]
Furthermore, let $T$ be an unbiased estimator of $\theta$. Then
\[
\operatorname{Var}(T) \ge \frac{1}{I(\theta)}, \qquad \text{where} \quad
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta} \ln f(X; \theta)\right)^{\!2}\right]
\]
is the so-called Fisher information.
If we have a sample of size $n$, we interpret this as one $n$-dimensional random variable $(X_1, \dots, X_n)$ with density $\prod_{i=1}^{n} f(x_i; \theta)$, whose Fisher information is $I_n(\theta) = n\,I(\theta)$, and so we get:
\[
\operatorname{Var}(T_n) \ge \frac{1}{n\,I(\theta)}.
\]
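The Fisher information and the resulting bound can be made concrete with the Bernoulli model (an illustrative choice; the value $p = 0.3$ is assumed here). The score is $\frac{\partial}{\partial p}\ln f(x;p) = \frac{x}{p} - \frac{1-x}{1-p}$, and its second moment can be computed exactly over the two outcomes:

```python
# Sketch: Fisher information of Bernoulli(p) and the Cramer-Rao bound.
# I(p) = E[score^2] is computed exactly over the outcomes x = 0, 1;
# it equals 1/(p(1-p)). The sample mean attains the resulting bound.
p = 0.3  # illustrative parameter value

def score(x):
    # d/dp ln f(x; p) for f(x; p) = p^x (1-p)^(1-x)
    return x / p - (1 - x) / (1 - p)

fisher = p * score(1) ** 2 + (1 - p) * score(0) ** 2
print(fisher, 1 / (p * (1 - p)))  # both equal 1/(p(1-p))

# Cramer-Rao bound for a sample of size n: Var(T_n) >= 1/(n * I(p)).
# Var(sample mean) = p(1-p)/n, so the bound is attained here.
n = 100
bound = 1 / (n * fisher)
print(bound, p * (1 - p) / n)
```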
If $T$ is an unbiased estimator and $S$ a sufficient statistic, then $T' = \mathbb{E}[T \mid S]$ is also an unbiased estimator and has a variance which is not greater than that of $T$ (Rao–Blackwell). This means that if we look for an efficient estimator, we only need to consider functions of $S$.
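The variance reduction from conditioning on a sufficient statistic can be seen in a small simulation (Bernoulli model, with $p$, $n$, and the number of repetitions chosen here for illustration): the crude unbiased estimator $T = X_1$ is replaced by $\mathbb{E}[T \mid S] = S/n$ for $S = \sum_i X_i$, i.e. the sample mean.

```python
# Sketch: Rao-Blackwellization for Bernoulli(p). T = X_1 is unbiased but
# crude; conditioning on the sufficient statistic S = sum(X_i) gives
# E[T | S] = S/n, with variance p(1-p)/n instead of p(1-p).
import random
import statistics

random.seed(2)
p, n, reps = 0.4, 20, 20_000  # illustrative choices

t_naive, t_rb = [], []
for _ in range(reps):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    t_naive.append(xs[0])        # T = X_1
    t_rb.append(sum(xs) / n)     # E[T | S] = S/n

print(statistics.variance(t_naive))  # close to p(1-p) = 0.24
print(statistics.variance(t_rb))     # close to p(1-p)/n = 0.012
```

Both estimators have mean close to $p$, but the Rao–Blackwellized one has a variance smaller by roughly the factor $n$.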
Finally, if $S$ is a sufficient statistic and has the property that
\[
\mathbb{E}_\theta[g(S)] = 0 \ \text{for all } \theta \in \Theta
\quad \Longrightarrow \quad
g(S) = 0 \ \text{almost surely (completeness)},
\]
then every estimator of the form $T = h(S)$ that is unbiased is efficient (Lehmann–Scheffé).
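As a worked instance (the standard Bernoulli example, chosen here for illustration), this is how the pieces fit together:

```latex
% Bernoulli(p), sample X_1, ..., X_n.
% S = \sum_{i=1}^n X_i is sufficient (factorization theorem) and complete,
% and \bar{X}_n = S/n is an unbiased function of S, hence efficient.
% Its variance even attains the Cramér–Rao bound:
\[
  \operatorname{Var}\!\left(\bar{X}_n\right)
  = \frac{p(1-p)}{n}
  = \frac{1}{n\,I(p)},
  \qquad
  I(p) = \frac{1}{p(1-p)}.
\]
```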