ABSTRACT

This chapter presents the Pontryagin Minimum Principle (PMP) and the related topics of dynamic programming and the Hamilton-Jacobi-Bellman (HJB) equation. The PMP is at the heart of optimal control theory. The chapter considers practical limitations on controls and states, and explores how the principle of optimality in dynamic programming can be used to obtain the optimal control of a system. It shows that dynamic programming is a computationally intensive technique, especially as the order of the system and the number of stages increase. The chapter derives the optimal feedback control of a discrete-time system using the principle of optimality of dynamic programming, and then describes the dynamic programming technique as applied to finding the optimal control of continuous-time systems. It also presents an alternative method of obtaining the closed-loop optimal control, using the principle of optimality and the HJB equation.
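As a minimal illustration of the backward recursion that the principle of optimality yields for a discrete-time system (a sketch, not an example from the chapter; the scalar plant, weights, and function name `dp_lqr` are assumed purely for illustration):

```python
# Sketch: backward dynamic-programming recursion for a scalar
# discrete-time linear-quadratic problem,
#   x[k+1] = a*x[k] + b*u[k],
# with stage cost q*x^2 + r*u^2 and terminal cost q_f*x^2.
# The principle of optimality reduces the N-stage problem to a
# one-stage minimization at each step, giving a Riccati-like
# recursion for the cost-to-go weight p and a feedback gain K,
# so the closed-loop optimal control is u[k] = -K[k]*x[k].

def dp_lqr(a, b, q, r, q_f, N):
    """Backward pass: returns feedback gains K[0..N-1]."""
    p = q_f                                  # cost-to-go weight at the final stage
    gains = []
    for _ in range(N):
        K = (b * p * a) / (r + b * p * b)    # u minimizing the one-stage cost-to-go
        p = q + a * p * (a - b * K)          # propagate cost-to-go weight backward
        gains.append(K)
    gains.reverse()                          # index gains forward in time
    return gains

# Closed-loop simulation from x0 = 1.0 for an unstable plant (a > 1)
a, b, q, r, q_f, N = 1.2, 1.0, 1.0, 1.0, 1.0, 20
gains = dp_lqr(a, b, q, r, q_f, N)
x = 1.0
for K in gains:
    x = a * x + b * (-K * x)
print(abs(x) < 1e-3)  # state driven toward the origin
```

Note that the feedback gains are computed offline by a single backward sweep, which is what makes this a closed-loop (feedback) law rather than a precomputed open-loop trajectory; for higher-order systems the scalars become matrices and the same recursion grows correspondingly more expensive.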