ABSTRACT

One of the cornerstones of probability and statistics is the use of averages. The most important type of average for random variables involves a weighting by the PDF or PMF and is known as expectation. This chapter discusses expectation and how it applies to the analysis of random variables. It is shown that some of the most important descriptors of a random variable X are not just its expectation, but also the expectations of powers of the random variable, such as X², X³, and so on. These averages are called moments, and collectively the complete set of moments for a random variable is equivalent to knowledge of the density function itself. This last idea is manifested through what are known as “generating functions,” which are also discussed in this chapter. Generating functions are transformations (Laplace, Fourier, or z) of the density function and thus provide a different but equivalent probabilistic description of a random variable. The name “generating function” comes from the fact that moments are easily derived from these functions by taking derivatives.
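The closing claim, that moments are obtained from a generating function by differentiation, can be illustrated with a small numerical sketch. The fair-die example and the finite-difference check below are illustrative assumptions, not drawn from the chapter itself:

```python
# Sketch of the "generating function" idea: for a random variable X, the
# moment generating function M(t) = E[e^{tX}] has n-th derivative at t = 0
# equal to the n-th moment E[X^n]. We use a fair six-sided die and central
# finite differences to approximate M'(0) and M''(0), then compare with the
# moments computed directly from the PMF.
import math

pmf = {k: 1 / 6 for k in range(1, 7)}  # fair die: P(X = k) = 1/6

def mgf(t):
    """M(t) = sum over k of P(X = k) * exp(t * k)."""
    return sum(p * math.exp(t * k) for k, p in pmf.items())

h = 1e-4
m1_approx = (mgf(h) - mgf(-h)) / (2 * h)            # ~ M'(0)  = E[X]
m2_approx = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ~ M''(0) = E[X^2]

m1_exact = sum(p * k for k, p in pmf.items())        # E[X]   = 3.5
m2_exact = sum(p * k**2 for k, p in pmf.items())     # E[X^2] = 91/6

print(m1_approx, m1_exact)
print(m2_approx, m2_exact)
```

The finite differences recover the first two moments of the die (3.5 and 91/6) to several decimal places, mirroring how exact differentiation of a closed-form generating function yields the moments exactly.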