An Introduction to Mathematical Optimal Control Theory

Publisher: University of California, Berkeley
Number of pages: 126

Description:
Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.

Download or read it online for free (PDF, 690KB).

Similar books

Distributed-Parameter Port-Hamiltonian Systems
Publisher: CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.
Optimization and Control
Publisher: University of Cambridge
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.
Linear Optimal Control
Publisher: Prentice Hall
This book builds a bridge between familiar classical control results and those of modern control theory, showing that many modern control results have practical engineering significance, not just applied-mathematical interest.
Optimal Control: Linear Quadratic Methods
Publisher: Prentice-Hall
Numerous examples highlight this treatment of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.