Applications of Optimal Control Theory to Computer Controller Design
Over the past two decades, since control-system design was first approached as an optimization problem, a steady flow of intensive research has produced analytical design procedures that give great insight into the nature of such systems. There remains, however, a large gap between the theoretical progress of optimal control theory in the literature and the analytical design tools actually used by control-system engineers. The intent of this book is to develop and exploit linear quadratic cost optimal control theory, together with modern digital computers, in the design of practical feedback-control systems. Many examples illustrate the suggested techniques and demonstrate the success of the approaches.
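As a point of reference for the linear quadratic cost framework the book builds on, the sketch below (not taken from the book) computes a steady-state optimal feedback gain for a discrete-time system by iterating the Riccati difference equation; the double-integrator plant and weights are made-up illustrative values.

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500, tol=1e-10):
    """Iterate the discrete-time Riccati difference equation and return
    the steady-state gain K so that u = -K x minimizes the
    infinite-horizon cost sum of x'Qx + u'Ru for x[k+1] = A x[k] + B u[k]."""
    P = Q.copy()
    for _ in range(iters):
        # Gain for the current cost-to-go matrix P.
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update: P <- Q + A'P(A - BK).
        P_next = Q + A.T @ P @ (A - B @ K)
        if np.max(np.abs(P_next - P)) < tol:
            P = P_next
            break
        P = P_next
    return np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Hypothetical example: a double integrator sampled at T = 0.1 s.
T = 0.1
A = np.array([[1.0, T], [0.0, 1.0]])
B = np.array([[T**2 / 2], [T]])
Q = np.eye(2)          # state weighting (illustrative choice)
R = np.array([[1.0]])  # control weighting (illustrative choice)
K = dlqr_gain(A, B, Q, R)
# The closed-loop matrix A - B K is then stable (spectral radius < 1).
```

The iteration converges for any stabilizable, detectable pair, which is the usual setting assumed in the literature the book surveys.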
After formulating the problem of computer control of a continuous stochastic process, the author proposes a systematic procedure for evaluating the performance of any linear control computer program, whether optimal or not, followed by a review of the details of optimal controller synthesis and a discussion of its applications to the design of computer-limited controllers. In this connection, the Joseph and Tou synthesis is shown to yield an optimal controller that is not unique, and it is demonstrated that the optimal controller computer program can be transformed into a canonical form that minimizes computation. Several indirect applications of optimal controller synthesis to computer-limited designs are discussed, and the final chapter offers modifications to the design philosophy of Phillips, permitting better convergence to an optimized design. Throughout, the author gives realistic numerical examples to support his theories, and the appendixes include a package of computer programs useful in the design of linear sampled-data controllers.
The book should be of great value to engineers who wish to apply the latest analytical and numerical techniques to the design of computer controllers.