Stochastic Optimal Control: The Discrete-Time Case
by Dimitri P. Bertsekas, Steven E. Shreve
Publisher: Athena Scientific 1996
ISBN/ASIN: 1886529035
Number of pages: 331
Description:
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the treatment of the intricate measure-theoretic issues.
Download or read it online for free here:
Download link (multiple PDF files)
Similar books
Optimal Control: Linear Quadratic Methods by B.D.O. Anderson, J.B. Moore - Prentice-Hall
Numerous examples highlight this treatment of the use of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.
Distributed-Parameter Port-Hamiltonian Systems by Hans Zwart, Birgit Jacob - CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.
Modeling, Simulation and Optimization: Tolerance and Optimal Control by Shkelzen Cakaj - InTech
Topics covered: parametric representation of shapes, modeling of dynamic continuous fluid flow processes, optimal plant layout plot plans, atmospheric modeling, cellular automata simulations, thyristor switching characteristics simulation, etc.
Optimization and Control by Richard Weber - University of Cambridge
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.