ABSTRACT

This chapter focuses on furthering the development of Markov decision process (MDP)-based system-level maintenance, rehabilitation, and replacement (MR&R) decision-making frameworks in the context of transportation infrastructure management, and in particular pavement management systems. The MDP-based approaches in the infrastructure management literature can be broadly categorized as either top-down or bottom-up. Bottom-up MDP-based optimization frameworks for infrastructure management have focused on providing facility-specific policies to decision-makers. The linear programming formulation provides an optimal and computationally attractive framework for solving the constrained MDP problem. The aggregation of policies allows budget constraints to be imposed on all future actions while maintaining the Markovian evolution of the state of the system. The contribution of simultaneous network optimization lies in facilitating a comparison between the top-down and bottom-up methodologies in MDP-based MR&R decision-making frameworks. Incorporating economies of scale and accounting for the impact of traffic disruptions are important issues that should be represented within system-level MR&R decision-making.
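
For orientation, a minimal sketch of the kind of budget-constrained linear program referred to above is given here; the notation (state set $S$, action set $A$, transition probabilities $P$, costs $c$, agency expenditures $m$, budget $B$, and decision variables $w_{sa}$) is illustrative rather than taken from the chapter itself.

\[
\begin{aligned}
\min_{w \ge 0} \quad & \sum_{s \in S} \sum_{a \in A} c(s,a)\, w_{sa} \\
\text{s.t.} \quad & \sum_{a \in A} w_{s'a} - \gamma \sum_{s \in S} \sum_{a \in A} P(s' \mid s,a)\, w_{sa} = \beta_{s'} \qquad \forall\, s' \in S, \\
& \sum_{s \in S} \sum_{a \in A} m(s,a)\, w_{sa} \le B,
\end{aligned}
\]

where $w_{sa}$ is the discounted expected frequency of occupying state $s$ and applying action $a$, $\beta$ is the initial state distribution, and $\gamma$ is the discount factor. An optimal MR&R policy can be recovered from an optimal solution by taking action $a$ in state $s$ with probability $w_{sa} / \sum_{a'} w_{sa'}$.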