ABSTRACT

The variety of suitable solutions complicates the design process, as it increases the number of dilemmas facing the control designer; as a result, the question “what is the optimal solution?” naturally arises. In structural control, most problems are concerned only with reducing the process cost, while the terminal cost has no significance. A fundamental optimal control problem is to find the optimal control trajectories when no constraints are imposed on the desired optimum. Pontryagin’s minimum principle has been applied to many such optimal control problems. A method for the optimal control of bilinear systems with a quadratic performance index is presented, with emphasis on the finite-horizon problem.
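As a sketch of the setting considered (assuming the standard formulation; the paper’s exact notation may differ), a bilinear system with a quadratic finite-horizon performance index can be written as

\[
\dot{x}(t) = A x(t) + N x(t)\,u(t) + B u(t), \qquad x(0) = x_0,
\]
\[
J = \tfrac{1}{2}\, x^{T}(t_f)\, S\, x(t_f) + \tfrac{1}{2}\int_{0}^{t_f} \left( x^{T} Q x + u^{T} R u \right) dt,
\]

where the terminal weight \(S\) may be taken as zero when, as noted above, the terminal cost is of no significance. Pontryagin’s minimum principle then provides necessary optimality conditions through the Hamiltonian

\[
H = \tfrac{1}{2}\left( x^{T} Q x + u^{T} R u \right) + \lambda^{T}\left( A x + N x u + B u \right),
\]

minimized with respect to \(u\) along the optimal state and costate trajectories.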