ABSTRACT

The field of control has been evolving since its inception, with methods ranging from the relatively simple PID controller to more sophisticated optimal control techniques. In this chapter, the authors mathematically explain a few essential control methods in simple terms. Optimal control methods are presented in more detail, covering performance measures, dynamic programming, and Pontryagin's minimum principle. There are two main approaches to solving an optimal control problem: the dynamic programming principle and Pontryagin's minimum principle. Using the concepts of dynamic programming and the principle of optimality, the authors derive the Hamilton-Jacobi-Bellman equation, a partial differential equation that establishes the basis for optimal control theory. Pontryagin's minimum principle is a prominent approach to optimal control based on the calculus of variations; a candidate trajectory is not optimal if it fails to satisfy the necessary conditions laid down by the minimum principle. A discussion of the application of optimal control methods to social media systems concludes the chapter.
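
For reference, the Hamilton-Jacobi-Bellman equation mentioned above has the following standard form (the notation here is an assumption, not taken from the chapter: $J^{*}(x,t)$ is the optimal cost-to-go, $g$ the running cost, and $\dot{x} = a(x,u,t)$ the system dynamics):

```latex
0 = \frac{\partial J^{*}}{\partial t}(x,t)
  + \min_{u(t)} \left[ \, g\big(x(t),u(t),t\big)
  + \left(\frac{\partial J^{*}}{\partial x}(x,t)\right)^{\!\top} a\big(x(t),u(t),t\big) \right]
```

Solving this partial differential equation backward from the terminal cost yields the optimal cost-to-go and, through the minimizing control $u^{*}$, an optimal feedback law.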