Welcome to E-Books Directory
This page lists freely downloadable books.
Control Engineering Problems with Solutions
by Derek P. Atherton - Bookboon, 2013
The purpose of this book is to provide both worked examples and additional problems with answers. A major objective is to enable the reader to develop confidence in analytical work by showing how calculations can be checked using Matlab/Simulink.
Control Theory with Applications to Naval Hydrodynamics
by R. Timman, 1975
The lectures present an introduction to modern control theory. Calculus of variations is used to study the problem of determining the optimal control for a deterministic system without constraints and for one with constraints.
Stochastic Systems: Estimation, Identification and Adaptive Control
by P.R. Kumar, Pravin Varaiya - Prentice Hall, 1986
This book is concerned with the questions of modeling, estimation, optimal control, identification, and the adaptive control of stochastic systems. The treatment is unified by adopting the viewpoint of one who must make decisions under uncertainty.
Stochastic Modeling and Control
by Ivan Ganchev Ivanov (ed.) - InTech, 2012
The book provides a self-contained treatment on practical aspects of stochastic modeling and calculus including applications in engineering, statistics and computer science. Readers should be familiar with probability theory and stochastic calculus.
Frontiers in Advanced Control Systems
by Ginalber Luiz de Oliveira Serra (ed.) - InTech, 2012
This book brings together state-of-the-art research results on advanced control from both theoretical and practical perspectives. The fundamental and advanced research results and the technical evolution of control theory are of particular interest.
Lectures on Stochastic Control and Nonlinear Filtering
by M. H. A. Davis - Tata Institute of Fundamental Research, 1984
There are actually two separate series of lectures, on controlled stochastic jump processes and on nonlinear filtering respectively. They are united, however, by the common philosophy of treating Markov processes by methods of stochastic calculus.
An Introduction to Nonlinearity in Control Systems
by Derek Atherton - BookBoon, 2011
The book is concerned with the effects of nonlinearity in feedback control systems and with techniques for designing feedback loops that contain nonlinear elements. The material is introductory in nature but aims to give a broad overview of the subject.
Applications of Nonlinear Control
by Meral Altinay - InTech, 2012
Nonlinear control systems have been an active area of investigation over the last few decades. This book includes topics such as Feedback Linearization, Lyapunov-Based Control, Adaptive Control, Optimal Control and Robust Control.
Discrete-Event Control of Stochastic Networks: Multimodularity and Regularity
by Eitan Altman, Bruno Gaujal, Arie Hordijk - Springer, 2003
Opening new directions in research in stochastic control, this book focuses on a wide class of control and optimization problems over sequences of integers. The theory is applied to the control of stochastic discrete-event dynamic systems.
Advanced Model Predictive Control
by Tao Zheng - InTech, 2011
Model Predictive Control refers to a class of control algorithms in which a dynamic process model is used to predict and optimize process performance. From simple loops to complicated process plants, MPC has been adopted in many practical fields.
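As a minimal sketch of the receding-horizon idea described in this blurb, the example below regulates a scalar plant x[k+1] = a·x[k] + b·u[k] by solving an unconstrained finite-horizon quadratic cost at each step and applying only the first control move. The plant model, horizon and weights are illustrative assumptions, not taken from the book.

```python
import numpy as np

def mpc_step(x0, a=1.2, b=1.0, r=0.1, N=10):
    """One receding-horizon step: minimize sum(x^2 + r*u^2) over N steps."""
    # Prediction matrices: stacked states X = F*x0 + G*U over the horizon
    F = np.array([a ** (i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    # Unconstrained quadratic cost -> solve the normal equations
    H = G.T @ G + r * np.eye(N)
    U = np.linalg.solve(H, -G.T @ F * x0)
    return U[0]          # apply only the first move (receding horizon)

x = 5.0                  # open loop is unstable (a > 1); MPC drives x to 0
for _ in range(30):
    x = 1.2 * x + 1.0 * mpc_step(x)
print(abs(x))            # close to zero after 30 steps
```

With constraints on u or x the same setup becomes a quadratic program solved numerically at each step, which is the practical form of MPC the book discusses.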
Control and Nonlinearity
by Jean-Michel Coron - American Mathematical Society, 2009
This book presents methods to study the controllability and the stabilization of nonlinear control systems in finite and infinite dimensions. Examples are given where nonlinearities turn out to be essential to get controllability or stabilization.
Discrete Time Systems
by Mario Alberto Jordan - InTech, 2011
This book covers the wide area of discrete-time systems. Its contents are grouped conveniently into sections according to significant areas, namely Filtering, Fixed and Adaptive Control Systems, Stability Problems, and Miscellaneous Applications.
PID Control: Implementation and Tuning
by Tamer Mansour - InTech, 2011
The PID controller is considered the most widely used controller. It has numerous applications, ranging from industrial processes to home appliances. This book is an outcome of contributions and inspirations from many researchers in the field of PID control.
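For readers new to the topic, here is a minimal sketch of the control law the book is about: a discrete PID loop regulating an illustrative first-order plant. The plant model, gains and sample time are assumptions for the demo, not taken from the book.

```python
def simulate_pid(kp=2.0, ki=1.0, kd=0.1, setpoint=1.0,
                 a=1.0, b=1.0, dt=0.01, steps=2000):
    """Discrete PID loop on an assumed plant x' = -a*x + b*u (Euler-stepped)."""
    x = 0.0                       # process variable
    integral = 0.0                # accumulated error (I term)
    prev_error = setpoint - x
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        x += dt * (-a * x + b * u)   # Euler step of the plant
    return x

final = simulate_pid()
print(final)   # settles near the setpoint 1.0
```

The integral term removes the steady-state error that a pure proportional gain would leave; tuning the three gains is the subject the contributed chapters address.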
by Esteban Tlelo-Cuautle - InTech, 2011
This book presents a collection of major developments in chaos systems, covering aspects of chaotic behavioral modeling and simulation, control and synchronization of chaos systems, and applications such as secure communications.
Control Theory: From Classical to Quantum Optimal, Stochastic, and Robust Control
by M.R. James - Australian National University, 2005
These notes are an overview of some aspects of optimal and robust control theory considered relevant to quantum control. The notes cover classical deterministic optimal control, classical stochastic and robust control, and quantum feedback control.
Distributed Control of Robotic Networks
by Francesco Bullo, Jorge Cortes, Sonia Martinez - Princeton University Press, 2009
This introductory book offers a distinctive blend of computer science and control theory. The book presents a broad set of tools for understanding coordination algorithms, determining their correctness, and assessing their complexity.
Linear Matrix Inequalities in System and Control Theory
by S. Boyd, L. El Ghaoui, E. Feron, V. Balakrishnan, 1997
The authors reduce a wide variety of problems arising in system and control theory to a handful of optimization problems that involve linear matrix inequalities. These problems can be solved using recently developed numerical algorithms.
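The simplest instance of the reduction the blurb mentions is the Lyapunov inequality AᵀP + PA ≺ 0 with P ≻ 0, which certifies stability of ẋ = Ax. The sketch below sidesteps a general LMI solver by solving the Lyapunov *equation* AᵀP + PA = −I via the Kronecker-product identity and checking that P is positive definite; the matrix A is an illustrative example, not from the book.

```python
import numpy as np

def lyapunov_certificate(A):
    """Solve A^T P + P A = -I and report whether P > 0 (A Hurwitz)."""
    n = A.shape[0]
    # vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P)
    M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
    vecP = np.linalg.solve(M, -np.eye(n).flatten())
    P = vecP.reshape(n, n)
    P = (P + P.T) / 2                      # symmetrize against round-off
    eigs = np.linalg.eigvalsh(P)
    return P, bool(np.all(eigs > 0))       # P > 0 iff A is Hurwitz

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # stable: eigenvalues -1 and -2
P, stable = lyapunov_certificate(A)
print(stable)                               # True
```

For genuinely matrix-inequality-shaped problems (multiple LMIs, design variables in the dynamics) one hands the constraints to a semidefinite-programming solver, which is exactly the numerical machinery the book's reductions target.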
Nonlinear System Theory: The Volterra/Wiener Approach
by Wilson J. Rugh - The Johns Hopkins University Press, 1981
Contents: Input/Output Representations in the Time and Transform Domain; Obtaining Input/Output Representations from Differential-Equation Descriptions; Realization Theory; Response Characteristics of Stationary Systems; Discrete-Time Systems; etc.
Linear Controller Design: Limits of Performance
by Stephen Boyd, Craig Barratt - Prentice-Hall, 1991
The book is motivated by the development of high quality integrated sensors and actuators, powerful control processors, and hardware and software that can be used to design control systems. Written for students and industrial control engineers.
High Performance Control
by T.T. Tay, I.M.Y. Mareels, J.B. Moore - Birkhauser, 1997
Using the tools of optimal control, robust control and adaptive control, the authors develop the theory of high performance control. Topics include performance enhancement, stabilizing controllers, offline controller design, and dynamical systems.
Systems Structure and Control
by Petr Husek - InTech, 2008
The book covers the broad field of theory and applications of many different control approaches applied to dynamic systems. Output and state feedback control methods include, among others, robust, optimal and intelligent control.
Control Engineering: An introduction with the use of Matlab
by Derek Atherton - BookBoon, 2009
The book covers the basic aspects of linear single-loop feedback control theory. The mathematical concepts used in classical control, such as root loci, frequency response and stability methods, are explained by making use of MATLAB.
The Analysis of Feedback Systems
by Jan C. Willems - The MIT Press, 1971
This monograph develops further and refines methods based on input-output descriptions for analyzing feedback systems. In contrast to previous work in this area, the treatment heavily emphasizes and exploits the causality of the operators involved.
A Course in H-infinity Control Theory
by Bruce A. Francis - Springer, 1987
An elementary treatment of linear control theory with an H-infinity optimality criterion. The systems are all linear, time-invariant, and finite-dimensional, and they operate in continuous time. The book has been used in a one-semester graduate course.
Feedback Control Theory
by John Doyle, Bruce Francis, Allen Tannenbaum, 1990
The book presents a theory of feedback control systems. It captures the essential issues, can be applied to a wide range of practical problems, and is as simple as possible. Addressed to students who have had a course in signals and systems.
Constructive Nonlinear Control
by R. Sepulchre, M. Jankovic, P. Kokotovic - Springer, 1996
Several streams of nonlinear control theory are directed towards a constructive solution of the feedback stabilization problem. Analytic, geometric and asymptotic concepts are assembled as design tools for a wide variety of nonlinear phenomena.
by K. M. Passino, S. Yurkovich - Addison Wesley, 1997
An introduction to fuzzy control with a broad treatment of topics including direct fuzzy control, nonlinear analysis, identification/estimation, adaptive and supervisory control, and applications, with many examples, exercises and design problems.
An Introduction to Intelligent and Autonomous Control
by P. J. Antsaklis, K. M. Passino - Springer, 1992
An introduction to intelligent control by leading researchers in the field. Approaches to intelligent control, including expert control, planning systems, fuzzy control, neural control and learning control, are studied in detail.
by Kwanho You - InTech, 2009
This book discusses the application of adaptive control to model generation, adaptive estimation, output regulation and feedback, electrical drives, optical communication, neural estimators, simulation and implementation.
Mathematical Control Theory: Deterministic Finite Dimensional Systems
by Eduardo D. Sontag - Springer, 1998
This textbook introduces the basic concepts of mathematical control and system theory in a self-contained and elementary fashion. Written for mathematically mature undergraduate or beginning graduate students, as well as engineering students.
Adaptive Control: Stability, Convergence, and Robustness
by Shankar Sastry, Marc Bodson - Prentice Hall, 1994
The book gives the major results, techniques of analysis and new directions in adaptive systems. It presents the deterministic theory of identification and adaptive control. The focus is on linear, continuous-time, single-input single-output systems.
Feedback Systems: An Introduction for Scientists and Engineers
by Karl J. Astrom, Richard M. Murray - Princeton University Press, 2008
An introduction to the basic principles and tools for the design and analysis of feedback systems. It is intended for scientists and engineers who are interested in utilizing feedback in physical, biological, information and social systems.
Control in an Information Rich World
by Richard M. Murray - Society for Industrial Mathematics, 2002
The book surveys the prospects for control in the current and future technological environment, describes the role the field will play in commercial and scientific applications over the next decade, and recommends actions required for new breakthroughs.
by Andrew Whitworth - Wikibooks, 2006
An interdisciplinary engineering text that analyzes the effects and interactions of mathematical systems. This book is for third- and fourth-year undergraduates in an engineering program. It considers both classical and modern control methods.
Dynamic System Modeling and Control
by Hugh Jack, 2005
Dynamic System Modeling and Control introduces the basic concepts of system modeling with differential equations. Supplemental materials at the end of this book include a writing guide, summary of math topics, and a table of useful engineering units.
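The kind of modeling this last book introduces can be sketched in a few lines: a mass-spring-damper m·x'' + c·x' + k·x = F integrated with the forward-Euler method. The parameter values are illustrative assumptions, not taken from the book.

```python
def simulate_msd(m=1.0, c=0.5, k=2.0, F=1.0, dt=0.001, steps=20000):
    """Forward-Euler integration of m*x'' + c*x' + k*x = F from rest."""
    x, v = 0.0, 0.0                    # position and velocity
    for _ in range(steps):
        a = (F - c * v - k * x) / m    # acceleration from Newton's second law
        x += dt * v                    # Euler update of position...
        v += dt * a                    # ...and of velocity
    return x

print(simulate_msd())  # approaches the static deflection F/k = 0.5
```

The same pattern (write the differential equation, step it numerically) extends directly to the multi-domain systems the book models.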