ABSTRACT

This chapter discusses several types of convergence (consistency) of estimators. Since consistency in mean squared error (MSE) is often too demanding, a more realistic approach is to relax it to the weaker consistency in probability as a minimal asymptotic requirement for a “good” estimator. The chapter first recalls the general definition of convergence in probability and then addresses the question of the asymptotic distribution of estimators, showing in particular why in many situations the limiting distribution is normal. Asymptotic normality of an estimator can be used to construct an approximate normal confidence interval when its exact distribution is complicated or not completely known. Asymptotically normal consistency can also be used to derive asymptotic tests in hypothesis testing, thus exploiting the duality between hypothesis testing and confidence intervals.
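The use of asymptotic normality mentioned above can be illustrated with a small simulation sketch (not from the chapter itself): for exponential data, whose sample mean has an awkward exact distribution, the CLT-based interval mean ± 1.96·s/√n attains roughly its nominal 95% coverage. The sample size, number of replications, and generator seed here are arbitrary illustrative choices.

```python
import numpy as np

def normal_ci(sample):
    """Approximate 95% CI for the mean, justified by asymptotic normality (CLT)."""
    n = len(sample)
    xbar = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n)  # estimated standard error of the mean
    z = 1.96  # approximate 97.5% quantile of the standard normal distribution
    return xbar - z * se, xbar + z * se

rng = np.random.default_rng(0)
true_mean = 1.0  # Exp(1) data: skewed, so the exact distribution of the mean is complicated
reps = 2000
covered = 0
for _ in range(reps):
    sample = rng.exponential(true_mean, size=200)
    lo, hi = normal_ci(sample)
    covered += lo <= true_mean <= hi
print(covered / reps)  # empirical coverage, close to the nominal 0.95
```

The point of the simulation is that no exact sampling distribution is needed: the interval relies only on the limiting normal distribution of the standardized estimator.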