ABSTRACT

This chapter focuses on mixture and classification models and on the asymptotic properties of their likelihood estimators. Section 1.1 covers convergence theorems general enough to apply to various nonparametric and parametric statistical models, not only to mixtures. The nonparametric framework for consistency has the advantage of requiring no parameter space and hence no identifiability assumptions. Many models, however, possess a natural parameter space, which is often Euclidean. In such cases we are also interested in convergence w.r.t. the topology of this space, which leads to the parametric theory. Asymptotic normality requires, in addition, differentiable models.