ABSTRACT

This chapter reviews classical parametric inference, introducing its main ideas and discussing maximum-likelihood inference in some detail. It examines the problem of overfitting and regularized maximum-likelihood inference. In the regularized maximum-likelihood method, one maximizes not the log-likelihood itself but an objective function that is the sum of the log-likelihood and a term that penalizes nonsmooth distributions. The chapter also discusses the minimum relative entropy (MRE) method with equality constraints, i.e., the standard MRE problem, and with relaxed inequality constraints, i.e., the relaxed MRE problem. It shows that the dual of the standard MRE problem is a maximum-likelihood problem and that the dual of the relaxed MRE problem is a regularized maximum-likelihood problem for a family of exponential distributions. The chapter presents the main ideas behind the classical parametric approach and the maximum-likelihood method, the most commonly used method for parameter inference, and discusses the Bayesian method for estimating probabilistic models.
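
As an illustrative sketch of the regularized maximum-likelihood objective described above (the symbols $\lambda$ for the penalty weight and $R(p)$ for a generic nonsmoothness penalty are assumed notation, not taken from the chapter), given samples $x_1,\dots,x_n$ and a candidate distribution $p$, one maximizes
\[
  \hat{p} \;=\; \arg\max_{p}\;\sum_{i=1}^{n}\log p(x_i)\;-\;\lambda\,R(p),
  \qquad \lambda \ge 0,
\]
where $R(p)$ penalizes nonsmooth distributions; setting $\lambda = 0$ recovers ordinary maximum-likelihood estimation.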