
Optimization and Control
by Richard Weber
Publisher: University of Cambridge 2010
Number of pages: 70
Topics: Dynamic Programming; Examples of Dynamic Programming; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; Stabilizability and Observability; Kalman Filter and Certainty Equivalence; etc.
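The course's opening topic, finite-horizon dynamic programming, can be sketched as backward induction. The problem, states, and costs below are illustrative assumptions, not taken from the notes themselves.

```python
# Minimal sketch of finite-horizon dynamic programming (backward induction).
# The toy problem (a walk on states 0..4 paying |x| per step) is an
# illustrative assumption, not an example from Weber's notes.

def backward_induction(T, states, actions, cost, step):
    """Return the optimal value function V[t][x] and policy[t][x]
    for a deterministic finite-horizon problem of horizon T."""
    V = {T: {x: 0.0 for x in states}}  # terminal cost taken as zero (assumption)
    policy = {}
    for t in range(T - 1, -1, -1):
        V[t], policy[t] = {}, {}
        for x in states:
            # Bellman recursion: minimise stage cost plus cost-to-go.
            value, a_best = min(
                ((cost(x, a) + V[t + 1][step(x, a)], a) for a in actions(x)),
                key=lambda pair: pair[0],
            )
            V[t][x], policy[t][x] = value, a_best
    return V, policy

# Toy instance: states 0..4, moves -1/0/+1 that stay in range, stage cost |x|.
states = range(5)
actions = lambda x: [a for a in (-1, 0, 1) if 0 <= x + a <= 4]
cost = lambda x, a: abs(x)
step = lambda x, a: x + a

V, policy = backward_induction(3, states, actions, cost, step)
# The optimal policy drives the state toward 0, where the stage cost vanishes.
```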
Download or read it online for free (500KB, PDF).
Similar books
Distributed-Parameter Port-Hamiltonian Systems by Hans Zwart, Birgit Jacob - CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.
An Introduction to Mathematical Optimal Control Theory by Lawrence C. Evans - University of California, Berkeley
Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.
Optimal Control: Linear Quadratic Methods by B.D.O. Anderson, J.B. Moore - Prentice-Hall
Numerous examples highlight this treatment of the use of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.
Linear Optimal Control by B.D.O. Anderson, J.B. Moore - Prentice Hall
This book builds a bridge between familiar classical control results and those of modern control theory, emphasizing modern results with practical engineering significance rather than purely applied-mathematical interest.