e-books in Control Systems category
by Panos J. Antsaklis, Kevin M. Passino (eds) - Kluwer Academic Publishers , 1993
An introduction to the field of intelligent control with a broad treatment of topics by several authors (including hierarchical/distributed intelligent control, fuzzy control, expert control, neural networks, planning systems, and applications).
by Roy D. Byrd - Delmar Publishers , 1972
These materials are intended to provide a meaningful experience with automatic controls for students of modern technology. The topics included provide exposure to basic principles of control systems, transducers, actuators, amplifiers, and controllers.
by Vincent Del Toro, Sydney R. Parker - McGraw-Hill , 1960
This is an integrated treatment of feedback control systems at the senior-graduate level. In order to emphasize the unified approach, the book is divided into five sections. Each section deals with a fundamental phase of control systems engineering.
by Derek P. Atherton - Bookboon , 2013
The purpose of this book is to provide both worked examples and additional problems with answers. A major objective is to enable the reader to develop confidence in analytical work by showing how calculations can be checked using MATLAB/Simulink.
by R. Timman , 1975
The lectures present an introduction to modern control theory. Calculus of variations is used to study the problem of determining the optimal control for a deterministic system without constraints and for one with constraints.
by Ivan Ganchev Ivanov (ed.) - InTech , 2012
The book provides a self-contained treatment on practical aspects of stochastic modeling and calculus including applications in engineering, statistics and computer science. Readers should be familiar with probability theory and stochastic calculus.
by Ginalber Luiz de Oliveira Serra (ed.) - InTech , 2012
This book brings the state-of-art research results on advanced control from both the theoretical and practical perspectives. The fundamental and advanced research results and technical evolution of control theory are of particular interest.
by M. H. A. Davis - Tata Institute of Fundamental Research , 1984
There are actually two separate series of lectures, on controlled stochastic jump processes and nonlinear filtering respectively. They are united, however, by the common philosophy of treating Markov processes by methods of stochastic calculus.
by Derek Atherton - BookBoon , 2011
The book is concerned with the effects of nonlinearity in feedback control systems and techniques which can be used to design feedback loops containing nonlinear elements. The material is of an introductory nature but hopefully gives an overview.
by Meral Altinay - InTech , 2012
Nonlinear control systems have been an active area of investigation over the last few decades. This book includes topics such as Feedback Linearization, Lyapunov Based Control, Adaptive Control, Optimal Control and Robust Control.
by Eitan Altman, Bruno Gaujal, Arie Hordijk - Springer , 2003
Opening new directions in research in stochastic control, this book focuses on a wide class of control and of optimization problems over sequences of integer numbers. The theory is applied to the control of stochastic discrete-event dynamic systems.
by Tao Zheng - InTech , 2011
Model Predictive Control refers to a class of control algorithms in which a dynamic process model is used to predict and optimize process performance. From simple loops to complex process plants, MPC has been accepted in many practical fields.
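The predict-and-optimize loop that MPC performs can be sketched for a toy scalar plant; the model, horizon, and cost weights below are illustrative assumptions, not taken from the book:

```python
import numpy as np

# Toy receding-horizon (MPC) loop for a scalar linear plant
# x[k+1] = a*x[k] + b*u[k]. At each step a finite-horizon quadratic
# cost is minimized in closed form (unconstrained case), and only the
# first input is applied. All numbers here are illustrative.
def mpc_step(x0, a=1.2, b=1.0, N=10, q=1.0, r=0.1):
    # Prediction matrices: stacked states x = F*x0 + G*u over the horizon
    F = np.array([a ** j for j in range(1, N + 1)])
    G = np.zeros((N, N))
    for j in range(N):
        for i in range(j + 1):
            G[j, i] = a ** (j - i) * b
    # Least-squares form of  min  sum q*x_j^2 + r*u_i^2
    A_ls = np.vstack([np.sqrt(q) * G, np.sqrt(r) * np.eye(N)])
    b_ls = np.concatenate([-np.sqrt(q) * F * x0, np.zeros(N)])
    u = np.linalg.lstsq(A_ls, b_ls, rcond=None)[0]
    return u[0]          # receding horizon: apply the first input only

x = 5.0
for _ in range(30):
    x = 1.2 * x + 1.0 * mpc_step(x)   # open loop unstable (a = 1.2)
print(abs(x))                          # driven close to zero by the MPC loop
```

Real MPC adds input and state constraints, which turn the inner problem into a QP solved numerically rather than by least squares.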
by Jean-Michel Coron - American Mathematical Society , 2009
This book presents methods to study the controllability and the stabilization of nonlinear control systems in finite and infinite dimensions. Examples are given where nonlinearities turn out to be essential to get controllability or stabilization.
by Mario Alberto Jordan - InTech , 2011
This book covers the wide area of Discrete-Time Systems. The contents are grouped conveniently in sections according to significant areas, namely Filtering, Fixed and Adaptive Control Systems, Stability Problems and Miscellaneous Applications.
by Tamer Mansour - InTech , 2011
The PID controller is considered the most widely used controller. It has numerous applications varying from industrial to home appliances. This book is an outcome of contributions and inspirations from many researchers in the field of PID control.
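The PID law at the heart of that book can be sketched in a few lines; the plant model, gains, and time step below are illustrative assumptions, not taken from the book:

```python
# Minimal discrete PID loop regulating a first-order plant
# x' = -a*x + b*u, simulated by Euler steps. The gains are
# illustrative, not tuned by any method from the book.
def simulate_pid(kp=2.0, ki=1.0, kd=0.1, setpoint=1.0, dt=0.01, steps=2000):
    a, b = 1.0, 1.0                    # assumed plant parameters
    x, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt           # I term accumulates error
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative
        x += dt * (-a * x + b * u)     # Euler step of the plant
        prev_err = err
    return x

print(simulate_pid())   # settles at the setpoint; the I term removes offset
```

The integral term is what drives the steady-state error to zero; with proportional action alone the plant would settle with a constant offset.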
by Esteban Tlelo-Cuautle - InTech , 2011
This book presents a collection of major developments in chaos systems covering aspects on chaotic behavioral modeling and simulation, control and synchronization of chaos systems, and applications like secure communications.
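The sensitive dependence on initial conditions that makes chaos control and synchronization hard can be illustrated with the logistic map, a standard toy chaotic system (not taken from the book):

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r*x*(1 - x); parameters here are illustrative.
def logistic_orbit(x0, r=3.9, n=100):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)      # perturb by one part in a billion
divergence = max(abs(x - y) for x, y in zip(a[-50:], b[-50:]))
print(divergence)   # the tiny perturbation grows to a macroscopic difference
```

This exponential divergence is exactly why chaos synchronization schemes, such as those used in secure communications, need active feedback coupling between transmitter and receiver.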
by M.R. James - Australian National University , 2005
These notes are an overview of some aspects of optimal and robust control theory considered relevant to quantum control. The notes cover classical deterministic optimal control, classical stochastic and robust control, and quantum feedback control.
by Francesco Bullo, Jorge Cortes, Sonia Martinez - Princeton University Press , 2009
This introductory book offers a distinctive blend of computer science and control theory. The book presents a broad set of tools for understanding coordination algorithms, determining their correctness, and assessing their complexity.
by S. Boyd, L. El Ghaoui, E. Feron, V. Balakrishnan , 1997
The authors reduce a wide variety of problems arising in system and control theory to a handful of optimization problems that involve linear matrix inequalities. These problems can be solved using recently developed numerical algorithms.
by Wilson J. Rugh - The Johns Hopkins University Press , 1981
Contents: Input/Output Representations in the Time and Transform Domain; Obtaining Input/Output Representations from Differential-Equation Descriptions; Realization Theory; Response Characteristics of Stationary Systems; Discrete-Time Systems; etc.
by Stephen Boyd, Craig Barratt - Prentice-Hall , 1991
The book is motivated by the development of high quality integrated sensors and actuators, powerful control processors, and hardware and software that can be used to design control systems. Written for students and industrial control engineers.
by T.T. Tay, I.M.Y. Mareels, J.B. Moore - Birkhauser , 1997
Using the tools of optimal control, robust control and adaptive control, the authors develop the theory of high performance control. Topics include performance enhancement, stabilizing controllers, offline controller design, and dynamical systems.
by Petr Husek - InTech , 2008
The book covers a broad field of theory and applications of many different control approaches applied to dynamic systems. Output and state feedback control methods include, among others, robust control, optimal control and intelligent control.
by Derek Atherton - BookBoon , 2009
The book covers the basic aspects of linear single loop feedback control theory. Explanations of the mathematical concepts used in classical control such as root loci, frequency response and stability methods are explained by making use of MATLAB.
by Jan C. Willems - The MIT Press , 1971
This monograph develops further and refines methods based on input-output descriptions for analyzing feedback systems. Contrary to previous work in this area, the treatment heavily emphasizes and exploits the causality of the operators involved.
by Bruce A. Francis - Springer , 1987
An elementary treatment of linear control theory with an H-infinity optimality criterion. The systems are all linear, time-invariant, and finite-dimensional, and they operate in continuous time. The book has been used in a one-semester graduate course.
by John Doyle, Bruce Francis, Allen Tannenbaum , 1990
The book presents a theory of feedback control systems. It captures the essential issues, can be applied to a wide range of practical problems, and is as simple as possible. Addressed to students who have had a course in signals and systems.
by R. Sepulchre, M. Jankovic, P. Kokotovic - Springer , 1996
Several streams of nonlinear control theory are directed towards a constructive solution of the feedback stabilization problem. Analytic, geometric and asymptotic concepts are assembled as design tools for a wide variety of nonlinear phenomena.
by K. M. Passino, S. Yurkovich - Addison Wesley , 1997
Introduction to fuzzy control with a broad treatment of topics including direct fuzzy control, nonlinear analysis, identification/estimation, adaptive and supervisory control, and applications, with many examples, exercises and design problems.
by P. J. Antsaklis , 1997
Intelligent control describes the discipline where control methods emulate important characteristics of human intelligence. These characteristics include adaptation and learning, planning under large uncertainty and coping with large amounts of data.
by Kwanho You - InTech , 2009
This book discusses the issues of adaptive control application to model generation, adaptive estimation, output regulation and feedback, electrical drives, optical communication, neural estimator, simulation and implementation.
by Eduardo D. Sontag - Springer , 1998
This textbook introduces the basic concepts of mathematical control and system theory in a self-contained and elementary fashion. Written for mathematically mature undergraduate or beginning graduate students, as well as engineering students.
by Shankar Sastry, Marc Bodson - Prentice Hall , 1994
The book gives the major results, techniques of analysis and new directions in adaptive systems. It presents the deterministic theory of identification and adaptive control. The focus is on linear, continuous-time, single-input single-output systems.
by Karl J. Astrom, Richard M. Murray - Princeton University Press , 2008
An introduction to the basic principles and tools for the design and analysis of feedback systems. It is intended for scientists and engineers who are interested in utilizing feedback in physical, biological, information and social systems.
by Richard M. Murray - Society for Industrial Mathematics , 2002
An assessment of the prospects for control in the current and future technological environment. The text describes the role the field will play in commercial and scientific applications over the next decade, and recommends actions required for new breakthroughs.
by Andrew Whitworth - Wikibooks , 2006
An inter-disciplinary engineering text that analyzes the effects and interactions of mathematical systems. This book is for third and fourth year undergraduates in an engineering program. It considers both classical and modern control methods.
by Hugh Jack , 2005
Dynamic System Modeling and Control introduces the basic concepts of system modeling with differential equations. Supplemental materials at the end of this book include a writing guide, summary of math topics, and a table of useful engineering units.