ABSTRACT

In this chapter we introduce the reader to the concept of Bayesian Estimation, through which an enhanced estimate of a probability distribution, called the Posterior Probability Distribution, is obtained by “merging” an initial Prior Probability Distribution with additional information, often derived from data observed or collected empirically in the specific situation under consideration and encoded in a so-called Likelihood Function. The process of “merging” these two sources of information is termed Bayesian Estimation because it derives from Bayes’ Rule, which establishes an important relationship between two random variables and their conditional probabilities. A clear understanding of the concepts stated at the end of this chapter will be key to the discussion that leads to the Kalman Filtering algorithm in Chapter 6.
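The “merging” of prior and likelihood described above can be summarized in a single equation. As a sketch of the idea, writing the unknown quantity as $x$ and the observed data as $z$ (notation chosen here for illustration, not drawn from the chapter), Bayes’ Rule expresses the posterior as the likelihood times the prior, normalized by the evidence:

```latex
% Bayes' Rule: posterior = (likelihood x prior) / evidence
% p(x|z): posterior distribution of x given the data z
% p(z|x): likelihood function of the data z
% p(x):   prior distribution of x
% p(z):   evidence (normalizing constant)
p(x \mid z) \;=\; \frac{p(z \mid x)\, p(x)}{p(z)},
\qquad
p(z) \;=\; \int p(z \mid x)\, p(x)\, dx .
```

The posterior on the left is the “enhanced estimate” the abstract refers to: it combines what was believed before observing the data (the prior) with what the data themselves support (the likelihood).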