ABSTRACT

In this section, we prove the information inequality, which relates the variance of an arbitrary statistic to the Fisher information. When applied to an estimate of a parameter based on a sample from a distribution, this inequality is known as the Cramér-Rao lower bound. If an unbiased estimate attains the Cramér-Rao bound, it is automatically a best unbiased estimate. We will see, however, that the bound for unbiased estimates may not be attainable, and that even when it is attained, the attaining estimate need not be admissible. In investigating the inequality for a vector parameter, we note the effect of not knowing the values of nuisance parameters.
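As a concrete illustration of an estimate attaining the bound (a Monte Carlo sketch, not part of the text): for a sample of size n from N(θ, σ²) with σ known, the Fisher information for θ is n/σ², so the Cramér-Rao lower bound is σ²/n, and the sample mean, which is unbiased with variance σ²/n, attains it. The particular values of θ, σ, n, and the number of replications below are arbitrary choices for the demonstration.

```python
import random
import statistics

# Monte Carlo check: the sample mean of a N(theta, sigma^2) sample
# attains the Cramér-Rao lower bound sigma^2 / n when sigma is known.
random.seed(0)
theta, sigma, n, n_reps = 2.0, 1.0, 25, 20000  # illustrative values

estimates = []
for _ in range(n_reps):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    estimates.append(sum(sample) / n)  # unbiased estimate of theta

emp_var = statistics.variance(estimates)  # empirical variance of the estimator
crlb = sigma**2 / n                       # Fisher information is n / sigma^2

print(emp_var, crlb)  # the two values should be close
```

The empirical variance of the estimates should agree with the bound σ²/n = 0.04 up to Monte Carlo error, consistent with the sample mean being a best unbiased estimate in this model.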