# Basic Theory of Estimation

## Point Estimation

An estimator is a sequence of statistics $\hat{\vartheta}_n = g_n(X_1, \ldots, X_n)$, where $g_n \colon \mathbb{R}^n \to \Theta$ is a measurable function of the sample.

An estimator is called

• (weakly) consistent if $\hat{\vartheta}_n \to \vartheta$ in probability with respect to $P_\vartheta$ for every $\vartheta \in \Theta$.
• strongly consistent if $\hat{\vartheta}_n \to \vartheta$ with probability one.
• unbiased if $\mathbb{E}_\vartheta[\hat{\vartheta}_n] = \vartheta$ for every $n$ and every $\vartheta \in \Theta$.
• asymptotically unbiased if $\mathbb{E}_\vartheta[\hat{\vartheta}_n] \to \vartheta$ as $n \to \infty$.
• efficient if it is unbiased and has the smallest variance among all unbiased estimators.

For example, the sample mean $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is an unbiased, strongly consistent estimator of the expectation of the underlying distribution, and the sample variance $s_n^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X}_n)^2$ is an unbiased, strongly consistent estimator of the variance $\sigma^2$.
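
As a quick numerical sanity check (a sketch added here, not part of the original text; the normal distribution, seed, and sample sizes are arbitrary choices), the following Python snippet verifies on simulated data that these two estimators are close to the true mean and variance and improve as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many samples from a distribution with known mean 2 and variance 4,
# and check that the sample mean and the (n-1)-normalized sample variance
# are (approximately) unbiased and converge as n grows.
mu, sigma2 = 2.0, 4.0
for n in (10, 100, 10_000):
    samples = rng.normal(mu, np.sqrt(sigma2), size=(5_000, n))
    x_bar = samples.mean(axis=1)        # sample mean of each simulated sample
    s2 = samples.var(axis=1, ddof=1)    # unbiased sample variance (divide by n-1)
    print(n, x_bar.mean(), s2.mean())   # averages approach mu and sigma2
```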

There are two well-known methods for calculating an estimator:

1. Method of moments:
Let us suppose that the expectation $m(\vartheta) = \mathbb{E}_\vartheta[X]$ is a function of the parameter that has a continuous inverse $m^{-1}$. So we have $\vartheta = m^{-1}(\mathbb{E}_\vartheta[X])$. If we replace $\mathbb{E}_\vartheta[X]$ by its estimator $\bar{X}_n$, we get $\hat{\vartheta}_n = m^{-1}(\bar{X}_n)$.
If there is more than one parameter, use the higher moments $\mathbb{E}_\vartheta[X^k]$ and replace them by $\frac{1}{n}\sum_{i=1}^{n} X_i^k$.
2. Likelihood method:
Simply use the value of $\vartheta$ that maximizes the likelihood function $L(\vartheta) = \prod_{i=1}^{n} f(X_i; \vartheta)$ as an estimator. This estimator is called the maximum likelihood estimator.
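
To make both methods concrete, here is a small sketch assuming the uniform distribution $U[0, \vartheta]$ as the model (my own choice of example, not taken from the text): since $\mathbb{E}_\vartheta[X] = \vartheta/2$, the method of moments gives $\hat{\vartheta} = 2\bar{X}_n$, while the likelihood $L(\vartheta) = \vartheta^{-n}\,\mathbf{1}\{\max_i X_i \le \vartheta\}$ is maximized at $\hat{\vartheta} = \max_i X_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 3.0                               # true parameter of U[0, theta]
x = rng.uniform(0.0, theta, size=1_000)   # one sample of size n = 1000

# Method of moments: E[X] = theta / 2, so invert the first moment.
theta_mm = 2.0 * x.mean()

# Maximum likelihood: L(theta) = theta^(-n) on {theta >= max(x)} is decreasing
# in theta, so it is maximized at the sample maximum.
theta_ml = x.max()

print(theta_mm, theta_ml)                 # both should be close to 3.0
```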

The Cramér-Rao theorem provides a lower bound for the variance of an unbiased estimator: Let $X$ be a random variable with distribution $P_\vartheta$ and density $f(x; \vartheta)$, where $\vartheta$ is a real parameter and the parameter set $\Theta$ is supposed to be an interval. Moreover, the density should be twice differentiable with respect to $\vartheta$, and both $\frac{\partial f}{\partial \vartheta}$ and $\frac{\partial^2 f}{\partial \vartheta^2}$ should be bounded above uniformly in $\vartheta$ by an integrable function, that means

$$\left|\frac{\partial f(x; \vartheta)}{\partial \vartheta}\right| \le g(x) \quad\text{and}\quad \left|\frac{\partial^2 f(x; \vartheta)}{\partial \vartheta^2}\right| \le h(x) \qquad \text{with } \int g(x)\,\mathrm{d}x < \infty,\ \int h(x)\,\mathrm{d}x < \infty.$$

Furthermore, let $T(X)$ be an unbiased estimator of $\vartheta$. Then

$$\operatorname{Var}_\vartheta\big(T(X)\big) \ge \frac{1}{I(\vartheta)},$$

where

$$I(\vartheta) = \mathbb{E}_\vartheta\!\left[\left(\frac{\partial}{\partial \vartheta} \ln f(X; \vartheta)\right)^{2}\right]$$

is the so-called Fisher information.

If we have a sample of size $n$, we interpret this as one $n$-dimensional random variable with density $\prod_{i=1}^{n} f(x_i; \vartheta)$, whose Fisher information is $n\,I(\vartheta)$, and so we get:

$$\operatorname{Var}_\vartheta(\hat{\vartheta}_n) \ge \frac{1}{n\,I(\vartheta)}.$$
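
As an illustration of the bound (a sketch assuming the normal location model, which is my own example): for $X \sim \mathcal{N}(\mu, \sigma^2)$ with $\sigma^2$ known, $\frac{\partial}{\partial \mu}\ln f(X; \mu) = (X - \mu)/\sigma^2$, so $I(\mu) = 1/\sigma^2$ and the bound for a sample of size $n$ is $\sigma^2/n$, which the sample mean attains. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 1.0, 2.0, 50

# Empirical variance of the sample mean over many repetitions,
# compared with the Cramér-Rao bound 1 / (n * I(mu)) = sigma^2 / n.
x_bar = rng.normal(mu, sigma, size=(20_000, n)).mean(axis=1)
print(x_bar.var(), sigma**2 / n)   # the two numbers should nearly agree
```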

If $\hat{\vartheta}$ is an unbiased estimator and $T$ a sufficient statistic, then $\tilde{\vartheta} = \mathbb{E}[\hat{\vartheta} \mid T]$ is also an unbiased estimator and has a variance which is not greater than that of $\hat{\vartheta}$ (this is the Rao-Blackwell theorem). This means that if we look for an efficient estimator, we only need to consider functions of $T$.
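
A minimal sketch of this conditioning step, assuming a Bernoulli($\vartheta$) sample (my own example, not from the text): $\hat{\vartheta} = X_1$ is unbiased, $T = \sum_i X_i$ is sufficient, and $\mathbb{E}[X_1 \mid T] = T/n = \bar{X}_n$, so conditioning turns the crude estimator into the sample mean and reduces the variance from $\vartheta(1-\vartheta)$ to $\vartheta(1-\vartheta)/n$:

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, reps = 0.3, 20, 50_000

x = rng.binomial(1, p, size=(reps, n))   # many Bernoulli(p) samples of size n
crude = x[:, 0]                          # unbiased but crude estimator: X_1
rao_blackwell = x.mean(axis=1)           # E[X_1 | sum X_i] = sample mean

# Both are unbiased, but conditioning on the sufficient statistic shrinks
# the variance from p(1-p) to p(1-p)/n.
print(crude.mean(), rao_blackwell.mean())   # both close to 0.3
print(crude.var(), rao_blackwell.var())     # about 0.21 vs. about 0.0105
```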

Finally, if $T$ is a sufficient statistic and has the property

$$\mathbb{E}_\vartheta[g(T)] = 0 \ \text{for all } \vartheta \in \Theta \quad\Longrightarrow\quad P_\vartheta\big(g(T) = 0\big) = 1 \ \text{for all } \vartheta \in \Theta$$

(i.e. $T$ is complete), then if an estimator $g(T)$ is unbiased, this estimator is efficient.
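
Continuing the Bernoulli sketch from above (an added example, not from the text): $T = \sum_{i=1}^{n} X_i$ is sufficient and complete for $\vartheta$, and $\bar{X}_n = T/n$ is an unbiased function of $T$, so $\bar{X}_n$ is the efficient estimator of $\vartheta$ in this sense.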