
Nonlinear and Optimal Control Systems
Contributor(s): Vincent, Thomas L. (Author), Grantham, Walter J. (Author)
ISBN: 0471042358     ISBN-13: 9780471042358
Publisher: Wiley-Interscience
Our Price: $189.95
Product Type: Hardcover
Published: June 1997
Annotation: Nonlinear and Optimal Control Systems offers a self-contained introduction to analysis techniques used in the design of nonlinear and optimal feedback control systems, with a solid emphasis on the fundamental topics of stability, controllability, optimality, and the corresponding geometry. The book develops and presents these key subjects in a unified fashion. An integrated approach is used to develop stability theory, function minimizing feedback controls, optimal controls, and differential game theory.

Starting with a background on differential equations, this accessible text examines nonlinear dynamical systems and nonlinear control systems, including basic results in nonlinear parameter optimization and parametric two-player games. Lyapunov stability theory and control system design are discussed in detail, followed by in-depth coverage of the controllability minimum principle and other important controllability concepts. The optimal control (Pontryagin's) minimum principle is developed and then applied to optimal control problems and the design of optimal controllers.

Nonlinear and Optimal Control Systems features examples and exercises taken from a wide range of disciplines and contexts--from engineering control designs to biological, economic, and other systems. Numerical algorithms are provided for solving problems in optimization and control, as well as simulation of systems using nonlinear differential equations. Readers may choose to develop their own code from these algorithms or solve problems with the help of commercial software programs.

Providing readers with a sturdy foundation in nonlinear and optimal control system design and application, this new resource is a valuable asset to advanced students and professional engineers in many different fields.

An integrated approach to the fundamentals of nonlinear and optimal control systems. This self-contained text provides a solid introduction to the analysis techniques used in the design of nonlinear and optimal feedback control systems. Building on thorough coverage of the basic concepts of stability, controllability, and optimality, the book develops highly effective feedback controllers for stability, function minimizing control, optimal control, and two-player differential games.

Concepts are illustrated throughout with examples that represent a range of disciplines and design contexts, bridging theory and application for advanced students and practicing engineers in many different fields. The book features:
- An accessible introduction to nonlinear system dynamics
- Lyapunov stability theory and control system design
- Controllability of nonlinear systems, including in-depth treatment of the controllability minimum principle and Pontryagin's minimum principle
- Optimal control systems and design using Pontryagin's minimum principle
- Differential games and design concepts: qualitative and quantitative games, Isaacs' min-max principle, and more
- Numerical algorithms for control, optimization, and simulation of nonlinear dynamical systems

Additional Information
BISAC Categories:
- Technology & Engineering | Mechanical
- Science
Dewey: 629.83
LCCN: 96-37129
Physical Information: 1.34" H x 6.57" W x 9.56" (2.23 lbs); 576 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
Control systems are developed in order to manage or control the output of a given process. Feedback control systems continually monitor the output and automatically adjust the process to keep it within a given or set range. For example, the thermostat in a home not only controls the on-off settings but also monitors the room temperature and applies heating or cooling to maintain a balance. Such a simple system can often be modeled as a linear control system. In reality, few systems are truly linear; most involve multiple inputs, outputs, and nonlinear dynamics that must be taken into account if the controls are to work properly or in an optimal range. This book focuses on these nonlinear systems and provides an introduction to analysis techniques used in the design of nonlinear and optimal feedback control systems. The emphasis is on the fundamental topics of stability, controllability, and optimality, and on the corresponding geometry associated with these topics.
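The thermostat described above can be sketched as a simple on-off (bang-bang) feedback loop. This is a minimal illustrative sketch only; the setpoint, switching band, and first-order thermal model parameters below are assumptions chosen for the example, not material from the book.

```python
# Minimal sketch of an on-off (bang-bang) thermostat feedback loop.
# All numeric parameters are illustrative assumptions.

def simulate_thermostat(setpoint=20.0, band=0.5, outside=5.0,
                        heater_power=2.0, loss_rate=0.1,
                        dt=0.1, steps=600, t0=15.0):
    """Heat when temperature falls below setpoint - band;
    switch off when it rises above setpoint + band."""
    temp, heating = t0, False
    history = []
    for _ in range(steps):
        # feedback: the controller acts on the measured output
        if temp < setpoint - band:
            heating = True
        elif temp > setpoint + band:
            heating = False
        # assumed first-order thermal model: heat loss to outside
        # plus heater input when the controller has switched it on
        dT = -loss_rate * (temp - outside) + (heater_power if heating else 0.0)
        temp += dT * dt
        history.append(temp)
    return history

temps = simulate_thermostat()
```

After the initial warm-up, the temperature cycles within the hysteresis band around the setpoint; this switching behavior is itself nonlinear, which is one reason even simple feedback loops fall outside purely linear analysis.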