Welcome to E-Books Directory
This page lists freely downloadable books.


E-books in this category

An Introduction to Mathematical Optimal Control Theory
by Lawrence C. Evans - University of California, Berkeley, 2010
Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.

Optimization and Control
by Richard Weber - University of Cambridge, 2010
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.
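Finite-horizon dynamic programming, the opening topic of these notes, can be sketched with a short backward-induction routine. The two-state, two-action MDP below is a made-up illustration, not an example from the book:

```python
# Backward induction for a finite-horizon MDP: work from the final
# stage toward stage 0, computing the optimal value and policy at each
# step. The MDP used here is an assumed toy example.

def backward_induction(P, R, horizon):
    """Optimal values and stage policies for a finite-horizon MDP.

    P[a][s][t] : probability of moving s -> t under action a
    R[a][s]    : immediate reward for taking action a in state s
    """
    n_states = len(R[0])
    n_actions = len(R)
    V = [0.0] * n_states              # terminal value is zero
    policies = []
    for _ in range(horizon):          # step backwards from the horizon
        Q = [[R[a][s] + sum(P[a][s][t] * V[t] for t in range(n_states))
              for a in range(n_actions)]
             for s in range(n_states)]
        policy = [max(range(n_actions), key=lambda a: Q[s][a])
                  for s in range(n_states)]
        V = [Q[s][policy[s]] for s in range(n_states)]
        policies.append(policy)
    policies.reverse()                # policies[k] is the rule at stage k
    return V, policies

# Usage: a hypothetical MDP where action 0 stays in the current state
# and action 1 switches state.
P = [[[1, 0], [0, 1]],                # action 0: stay put
     [[0, 1], [1, 0]]]                # action 1: switch state
R = [[1.0, 0.0],                      # rewards for action 0
     [0.0, 2.0]]                      # rewards for action 1
V, policies = backward_induction(P, R, horizon=2)
```

The same recursion, run over an infinite horizon with discounting, is the value-iteration theme the later chapters build on.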

Distributed-Parameter Port-Hamiltonian Systems
by Hans Zwart, Birgit Jacob - CIMPA, 2009
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.

Modeling, Simulation and Optimization: Tolerance and Optimal Control
by Shkelzen Cakaj - InTech, 2010
Topics covered: parametric representation of shapes, modeling of dynamic continuous fluid flow process, plant layout optimal plot plan, atmospheric modeling, cellular automata simulations, thyristor switching characteristics simulation, etc.

Stochastic Optimal Control: The Discrete-Time Case
by Dimitri P. Bertsekas, Steven E. Shreve - Athena Scientific, 1996
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the treatment of the intricate measure-theoretic issues.

Optimal Control: Linear Quadratic Methods
by B.D.O. Anderson, J.B. Moore - Prentice-Hall, 1989
Numerous examples highlight this treatment of the use of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.
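The core computation behind linear quadratic methods is the backward Riccati recursion for the discrete-time regulator. The sketch below uses an assumed double-integrator model and unit cost weights, purely for illustration:

```python
# Finite-horizon discrete-time LQR via backward Riccati recursion.
# Minimizes sum_k (x'Qx + u'Ru) subject to x_{k+1} = A x_k + B u_k.
import numpy as np

def dlqr_finite_horizon(A, B, Q, R, horizon):
    """Return the stage feedback gains K_k, so that u_k = -K_k x_k."""
    P = Q.copy()                      # terminal cost weight
    gains = []
    for _ in range(horizon):
        # K = (R + B'PB)^{-1} B'PA
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update: P = Q + A'P(A - BK)
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    gains.reverse()                   # gains[k] applies at stage k
    return gains

# Usage: a hypothetical double-integrator plant with unit weights.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)
R = np.eye(1)
gains = dlqr_finite_horizon(A, B, Q, R, horizon=50)
# For a long horizon, the stage-0 gain approaches the steady-state LQR
# gain, and the closed loop A - B K_0 has spectral radius below 1.
```

Over a long horizon the gains converge to the solution of the algebraic Riccati equation, which is the steady-state regulator the book's design methods center on.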

Linear Optimal Control
by B.D.O. Anderson, J.B. Moore - Prentice Hall, 1971
This book constructs a bridge between the familiar classical control results and those of modern control theory. Many modern control results do have practical engineering significance, as distinct from applied mathematical significance.

Linear Optimal Control Systems
by H. Kwakernaak, R. Sivan - Wiley-Interscience, 1972
One of the major concerns of this text is to present design methods, employing modern techniques, for obtaining control systems that stand up to the requirements that have been so well developed in the classical expositions of control theory.

Finite Dimensional Linear Systems
by Roger W. Brockett - John Wiley and Sons, 1970
This book is based on a course on dynamical systems given at MIT. The topics covered form the core for advanced work in such fields of study as optimal control, estimation, stability, electrical networks, and the control of distributed systems.