This work explores the fundamental problems of estimation, identification, and control of dynamic systems from the viewpoint of modern control theory. The book reviews existing concepts of optimal control and estimation theory in order to ascertain their relative merits and limitations. The "Bayesian approach" to stochastic estimation is established and then applied to both Gaussian and non-Gaussian problems; a scheme for identifying nonlinear system parameters in a stochastic environment is developed and verified by experiments. Finally, the problems of generating closed-loop control laws are discussed, and a simplified closed-loop philosophy for a rather general class of dynamic systems is developed.
Optimal Estimation, Identification, and Control is well suited as a graduate-level reference for the student interested in, but not yet familiar with, modern control theory, since the study offers a bird's-eye view of current concepts.
Aeronautical, mechanical, and electrical engineers interested in automatic control and optimization will also find the book useful.