ABSTRACT

This chapter reviews the basic terminology and notation used in presenting material about stochastic processes. It gives the definition of a stochastic process and describes the various types of stochastic processes. The chapter focuses on defining the various types of states of a stochastic process and discusses several examples of discrete and continuous Markov processes. It is also devoted to explaining several types of normal processes. The chapter presents the fundamental ideas that describe a Markov chain, a class of important stochastic processes that have proved relevant to many problems in science. This brief review of stochastic processes gives the reader the background essential for making Bayesian inferences about the unknown parameters of a stochastic process. The chapter concludes with explanations of stationary and evolutionary processes and an introduction to stochastic calculus, including the integral and derivative of a stochastic process.