ABSTRACT

Estimation of the unknown parameters of a distribution from data is one of the key problems in statistics. This chapter presents several methods of estimation and then defines and discusses how to assess their goodness. Maximum likelihood estimation is probably the most widely used method of estimation. Its underlying idea is simple and intuitively clear. Another popular method of estimation is the method of moments. Its main idea is to express the population moments of the distribution of the data in terms of its unknown parameter(s) and to equate them to the corresponding sample moments. The parameter(s) are then estimated by the solutions of the resulting equations. The method of least squares plays a key role in regression and analysis of variance. The Cramér–Rao lower bound only allows one to evaluate the goodness of a proposed unbiased estimator; it does not provide a constructive way to derive one.
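
For instance, as a brief sketch of the method of moments, assume the observations $X_1, \ldots, X_n$ are i.i.d. from an Exponential$(\lambda)$ distribution (an illustrative choice, not taken from the chapter). The first population moment is
\[
  \mathbb{E}[X] = \frac{1}{\lambda},
\]
and equating it to the first sample moment $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ and solving for the parameter yields the method-of-moments estimator
\[
  \hat{\lambda} = \frac{1}{\bar{X}_n}.
\]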