Adaptive Control: Stability, Convergence, and Robustness
by Shankar Sastry, Marc Bodson
Publisher: Prentice Hall 1994
ISBN/ASIN: 0130043265
ISBN-13: 9780130043269
Number of pages: 378
Description:
The objective of this book is to present the major results, techniques of analysis, and new directions of research in adaptive systems. The authors give a clear, conceptual presentation of adaptive methods to enable a critical evaluation of these techniques and to suggest avenues of further development. The book presents the deterministic theory of identification and adaptive control, focusing on linear, continuous-time, single-input single-output systems.
Download or read it online for free here:
Download link
(multiple PDF files)
Similar books
![Book cover: Control Engineering: An introduction with the use of Matlab](images/3407.jpg)
by Derek Atherton - BookBoon
The book covers the basic aspects of linear single-loop feedback control theory. The mathematical concepts used in classical control, such as root loci, frequency response, and stability methods, are explained with the help of MATLAB.
(16800 views)
![Book cover: Discrete-Event Control of Stochastic Networks: Multimodularity and Regularity](images/7037.jpg)
by Eitan Altman, Bruno Gaujal, Arie Hordijk - Springer
Opening new directions of research in stochastic control, this book focuses on a wide class of control and optimization problems over sequences of integers. The theory is applied to the control of stochastic discrete-event dynamic systems.
(10693 views)
![Book cover: Dynamic System Modeling and Control](images/1103.jpg)
by Hugh Jack
Dynamic System Modeling and Control introduces the basic concepts of system modeling with differential equations. Supplemental materials at the end of this book include a writing guide, summary of math topics, and a table of useful engineering units.
(23467 views)
![Book cover: Control Theory with Applications to Naval Hydrodynamics](images/9063.jpg)
by R. Timman
The lectures present an introduction to modern control theory. The calculus of variations is used to study the problem of determining the optimal control for a deterministic system, both without constraints and with them.
(11616 views)