ABSTRACT

This chapter provides insight into the optimal control branch of engineering. The need for optimal control in modern control theory is explained by comparing it with classical control theory. Basic definitions, including functions, maxima and minima of a function, functionals, the increment and variation of a functional, and the performance indices associated with optimal control problems, are first explained. The chapter covers time-varying cases, and both finite-time and infinite-time problems are discussed. The design of the linear quadratic regulator (LQR) using the Hamilton–Jacobi equation and Pontryagin's principle is presented, along with linear quadratic tracking (LQT) and the linear quadratic Gaussian (LQG) controller. The matrix and algebraic Riccati equations are derived and their numerical solutions are put forth. Application examples illustrating the finite-time problem, the infinite-time problem, and LQR design are worked out in detail.