ABSTRACT

This chapter offers a rudimentary introduction to the topic of estimation theory. An estimator provides a rule for obtaining an optimal value of the parameter of interest. Since the observations are, by nature, corrupted by noise, they are random variables, and any operation on these random variables results in a new random variable. The chapter examines ways to assess the goodness of a particular estimator. It provides a cursory look at three types of estimators: the maximum a posteriori (MAP), maximum likelihood, and Bayes’ estimators. As with the detection approaches, one can choose to work with the a priori probability density, with a cost function, or with neither. MAP estimation is an optimization technique that maximizes the a posteriori probability; that is, it finds the most likely value of the parameter α given the observation. The Bayes’ estimator minimizes an average cost function.
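
As a minimal sketch of the three criteria named above (the notation here is assumed, not fixed by the abstract: α is the parameter, x the observation, and C(α, α̂) the cost function), the estimators can be written as

    % Maximum likelihood: maximize the likelihood of the observation
    \hat{\alpha}_{\mathrm{ML}} = \arg\max_{\alpha} \, p(x \mid \alpha)

    % MAP: maximize the a posteriori density p(\alpha \mid x) \propto p(x \mid \alpha)\,p(\alpha)
    \hat{\alpha}_{\mathrm{MAP}} = \arg\max_{\alpha} \, p(\alpha \mid x)
                                = \arg\max_{\alpha} \, p(x \mid \alpha)\, p(\alpha)

    % Bayes: minimize the average (expected) cost given the observation
    \hat{\alpha}_{\mathrm{B}} = \arg\min_{\hat{\alpha}} \, \mathbb{E}\!\left[ C(\alpha, \hat{\alpha}) \mid x \right]
                              = \arg\min_{\hat{\alpha}} \int C(\alpha, \hat{\alpha})\, p(\alpha \mid x)\, d\alpha

These expressions are only illustrative of the optimization problems the chapter develops; the chapter itself defines the densities and cost functions it uses.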