ABSTRACT

This paper presents a practical methodology that can determine optimum life-cycle cost policies based on uncertain structural data and stochastic models. More precisely, the method moves away from conventional optimization approaches and relies on stochastic control techniques in the form of Partially Observable Markov Decision Processes (POMDPs). POMDP-based policies can suggest where, when and what type of inspection and repair should be performed, taking into account real-time structural state estimates obtained through Bayesian principles. Thus, the inherent uncertainties of inspection efforts and/or monitoring systems can be naturally combined with the stochastic models, and unnecessary mathematical assumptions that inevitably lead to sub-optimality are avoided. In this work, the POMDP framework is formed using a stochastic, physically based model, resulting in large state spaces that can adequately describe real-life problems. Specific examples of finding optimum policies for the maintenance and management of a corroding structure are presented, based on non-stationary POMDPs for both infinite and finite horizon cases, with 332 and 14,009 states respectively. Results from the two cases are compared and discussed, demonstrating the capabilities of the method.
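As a minimal illustration of the Bayesian state estimation that underlies POMDP-based policies, the sketch below updates a belief (a probability distribution over structural states) after an action and a noisy observation. It is a generic sketch only; the transition matrix, observation matrix, and three-state deterioration example are hypothetical placeholders, not the paper's corrosion model or its 332- and 14,009-state formulations.

import numpy as np

def belief_update(belief, T, O, action, observation):
    """Posterior belief b'(s') after taking `action` and seeing `observation`.

    belief : (S,)      prior probability over states
    T      : (A, S, S) transition probabilities T[a, s, s']
    O      : (A, S, Z) observation probabilities O[a, s', z]
    """
    predicted = belief @ T[action]                      # sum_s T(s'|s,a) b(s)
    posterior = predicted * O[action][:, observation]   # weight by observation likelihood
    return posterior / posterior.sum()                  # normalize to a distribution

# Hypothetical example: 3 deterioration states, one "do nothing" action,
# and a two-outcome inspection of limited accuracy.
b = np.array([0.7, 0.2, 0.1])
T = np.array([[[0.9, 0.1, 0.0],
               [0.0, 0.8, 0.2],
               [0.0, 0.0, 1.0]]])
O = np.array([[[0.8, 0.2],
               [0.3, 0.7],
               [0.1, 0.9]]])
print(belief_update(b, T, O, action=0, observation=1))

This recursive update is how inspection or monitoring outcomes, together with their uncertainties, are folded into the state estimate on which a POMDP policy conditions its inspection and repair choices.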