An Introduction to Mathematical Optimal Control Theory
by Lawrence C. Evans
Publisher: University of California, Berkeley 2010
Number of pages: 126
Description:
Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.
Download or read it online for free here (690KB, PDF).
Similar books

by B.D.O. Anderson, J.B. Moore - Prentice Hall
This book constructs a bridge between familiar classical control results and those of modern control theory. Many modern control results have practical engineering significance, as distinct from applied mathematical significance.

by Dimitri P. Bertsekas, Steven E. Shreve - Athena Scientific
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues.

by Hans Zwart, Birgit Jacob - CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.

by Richard Weber - University of Cambridge
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.