ABSTRACT

This chapter presents stochastic control approaches for planning and decision making under uncertainty. Markov Decision Processes (MDPs) are controlled stochastic processes in which a decision-maker is uncertain about the exact effect of executing a given action. A description of the use of MDPs for infrastructure management is provided, with the main focus on Partially Observable MDPs (POMDPs), where observations do not reveal the true state of the system with certainty. POMDPs provide a flexible and mathematically sound decision-making framework for partially observable environments, and the chapter shows how to compute the Bellman backup of a particular belief point together with its supporting vector. The transition matrices can be based on a wide variety of stochastic processes, and the framework accommodates stationary and nonstationary environments, infinite and finite horizons, periodic and aperiodic inspection intervals, history-dependent actions, and actions' durations.
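The point-based Bellman backup mentioned above can be sketched as follows. This is a minimal illustration under standard POMDP conventions, not the chapter's own implementation; the tiny two-state maintenance model (states {good, failed}, actions {do-nothing, repair}, a noisy inspection signal) and all its numbers are hypothetical, chosen only to make the code runnable.

```python
import numpy as np

def bellman_backup(b, Gamma, T, Z, R, gamma):
    """One point-based Bellman backup at belief b.

    b:     belief vector over states, shape (nS,)
    Gamma: current set of alpha-vectors (the piecewise-linear value function)
    T:     T[a, s, s'] transition probabilities
    Z:     Z[a, s', o] observation probabilities
    R:     R[a, s] immediate rewards
    Returns the supporting alpha-vector of b after the backup.
    """
    nA = T.shape[0]
    nO = Z.shape[2]
    best_vec, best_val = None, -np.inf
    for a in range(nA):
        g_a = R[a].astype(float).copy()
        for o in range(nO):
            # Back-project every alpha-vector through action a, observation o:
            # g_{a,o}(s) = gamma * sum_{s'} T(s'|s,a) Z(o|s',a) alpha(s')
            cands = [gamma * T[a] @ (Z[a, :, o] * alpha) for alpha in Gamma]
            # Keep the candidate that is best at this particular belief point.
            g_a += cands[int(np.argmax([b @ g for g in cands]))]
        if b @ g_a > best_val:
            best_vec, best_val = g_a, b @ g_a
    return best_vec

# Hypothetical 2-state, 2-action, 2-observation maintenance model.
T = np.array([[[0.9, 0.1], [0.0, 1.0]],   # do nothing: component may degrade
              [[1.0, 0.0], [1.0, 0.0]]])  # repair: back to the good state
Z = np.array([[[0.8, 0.2], [0.3, 0.7]],   # noisy inspection outcome
              [[0.8, 0.2], [0.3, 0.7]]])
R = np.array([[1.0, -2.0],                # operating reward vs. failure cost
              [-1.0, -1.0]])              # flat repair cost
Gamma = [np.zeros(2)]                     # start from the zero value function
b = np.array([0.6, 0.4])                  # current belief over {good, failed}

alpha = bellman_backup(b, Gamma, T, Z, R, gamma=0.95)
print(alpha, b @ alpha)
```

Repeating this backup over a set of sampled belief points and collecting the resulting vectors is the core of point-based value iteration; the supporting vector returned for each belief simultaneously encodes its value and the maximizing action.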