Used Hardcover
List price: $178.40 (qualifying orders ship free)
Ships in 1 to 3 days; available for shipping or prepaid pickup only
Available for in-store pickup in 7 to 12 days
Qty: 19 | Store: Partner Warehouse | Section: Biology - Genetics

Systems and Control (03 Edition)




Synopses & Reviews

Please note that used books may not include additional media (study guides, CDs, DVDs, solutions manuals, etc.) as described in the publisher comments.

Publisher Comments:

Systems and Control presents modeling, analysis, and control of dynamical systems. Introducing students to the basics of dynamical system theory and supplying them with the tools necessary for control system design, it emphasizes design and demonstrates how dynamical system theory fits into practical applications. Classical methods and the techniques of postmodern control engineering are presented in a unified fashion, demonstrating how the current tools of a control engineer can supplement more classical tools.

Broad in scope, Systems and Control shows the multidisciplinary role of dynamics and control; presents neural networks, fuzzy systems, and genetic algorithms; and provides a self-contained introduction to chaotic systems. The text employs Lyapunov's stability theory as a unifying medium for different types of dynamical systems, using it and its variants to analyze dynamical system models. Specifically, optimal, fuzzy, sliding mode, and chaotic controllers are all constructed with the aid of the Lyapunov method and its extensions, and a class of neural networks is likewise analyzed using Lyapunov's method.
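The Lyapunov machinery the blurb refers to can be illustrated on the simplest case, a linear system (a sketch for orientation, not an excerpt from the book): for x' = Ax with A Hurwitz, solving the Lyapunov equation A^T P + P A = -Q for a positive definite P yields the quadratic Lyapunov function V(x) = x^T P x. A minimal numerical check, assuming only NumPy (the helper name `solve_lyapunov` is ours):

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A^T P + P A = -Q via the column-stacking identity
    vec(A^T P + P A) = (I kron A^T + A^T kron I) vec(P)."""
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(I, A.T) + np.kron(A.T, I)
    vecP = np.linalg.solve(K, (-Q).flatten(order="F"))
    return vecP.reshape((n, n), order="F")

# A has eigenvalues -1 and -2, so it is Hurwitz (asymptotically stable).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
P = solve_lyapunov(A, np.eye(2))

# A positive definite P certifies asymptotic stability of x' = Ax.
print(np.all(np.linalg.eigvalsh(P) > 0))  # prints True
```

For this A and Q = I the unique solution is P = [[1.25, 0.25], [0.25, 0.25]], whose eigenvalues are both positive, so V(x) = x^T P x decreases along every trajectory.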

Ideal for advanced undergraduate and beginning graduate courses in systems and control, this text can also be used for introductory courses in nonlinear systems and modern automatic control. It requires working knowledge of basic differential equations and elements of linear algebra; a review of the necessary mathematical techniques and terminology is provided.


This textbook deals with modeling, analysis, and control of dynamical systems. Its objective is to familiarize students with the basics of dynamical system theory while equipping them with the tools needed for control system design.

About the Author

Stanislaw H. Zak is Professor of Electrical and Computer Engineering at Purdue. He has worked in various areas of control, optimization, and neural networks. He is coauthor of Selected Methods of Analysis of Linear Dynamical Systems (1984, in Polish) and An Introduction to Optimization, 2/e (2001). Dr. Zak has also contributed to the Comprehensive Dictionary of Electrical Engineering (1999) and is a past associate editor of Dynamics and Control and the IEEE Transactions on Neural Networks.

Table of Contents

Each chapter ends with Notes and Exercises


1. Dynamical Systems and Modeling

1.1. What Is a System?

1.2. Open-Loop Versus Closed-Loop

1.3. Axiomatic Definition of a Dynamical System

1.4. Mathematical Modeling

1.5. Review of Work and Energy Concepts

1.6. The Lagrange Equations of Motion

1.7. Modeling Examples

1.7.1. Centrifugal Governor

1.7.2. Ground Vehicle

1.7.3. Permanent Magnet Stepper Motor

1.7.4. Stick Balancer

1.7.5. Population Dynamics

2. Analysis of Modeling Equations

2.1. State Plane Analysis

2.1.1. Examples of Phase Portraits

2.1.2. The Method of Isoclines

2.2. Numerical Techniques

2.2.1. The Method of Taylor Series

2.2.2. Euler's Methods

2.2.3. Predictor-Corrector Method

2.2.4. Runge's Method

2.2.5. Runge-Kutta Method

2.3. Principles of Linearization

2.4. Linearizing Differential Equations

2.5. Describing Function Method

2.5.1. Scalar Product of Functions

2.5.2. Fourier Series

2.5.3. Describing Function in the Analysis of Nonlinear Systems

3. Linear Systems

3.1. Reachability and Controllability

3.2. Observability and Constructability

3.3. Companion Forms

3.3.1. Controller Form

3.3.2. Observer Form

3.4. Linear State-Feedback Control

3.5. State Estimators

3.5.1. Full-Order Estimator

3.5.2. Reduced-Order Estimator

3.6. Combined Controller-Estimator Compensator

4. Stability

4.1. Informal Introduction to Stability

4.2. Basic Definitions of Stability

4.3. Stability of Linear Systems

4.4. Evaluating Quadratic Indices

4.5. Discrete-Time Lyapunov Equation

4.6. Constructing Robust Linear Controllers

4.7. Hurwitz and Routh Stability Criteria

4.8. Stability of Nonlinear Systems

4.9. Lyapunov's Indirect Method

4.10. Discontinuous Robust Controllers

4.11. Uniform Ultimate Boundedness

4.12. Lyapunov-Like Analysis

4.13. LaSalle's Invariance Principle

5. Optimal Control

5.1. Performance Indices

5.2. A Glimpse at the Calculus of Variations

5.2.1. Variation and Its Properties

5.2.2. Euler-Lagrange Equation

5.3. Linear Quadratic Regulator

5.3.1. Algebraic Riccati Equation (ARE)

5.3.2. Solving the ARE Using the Eigenvector Method

5.3.3. Optimal Systems with Prescribed Poles

5.3.4. Optimal Saturating Controllers

5.3.5. Linear Quadratic Regulator for Discrete Systems on an Infinite Time Interval

5.4. Dynamic Programming

5.4.1. Discrete-Time Systems

5.4.2. Discrete Linear Quadratic Regulator Problem

5.4.3. Continuous Minimum Time Regulator Problem

5.4.4. The Hamilton-Jacobi-Bellman Equation

5.5. Pontryagin's Minimum Principle

5.5.1. Optimal Control With Constraints on Inputs

5.5.2. A Two-Point Boundary-Value Problem

6. Sliding Modes

6.1. Simple Variable Structure Systems

6.2. Sliding Mode

6.3. A Simple Sliding Mode Controller

6.4. Sliding in Multi-Input Systems

6.5. Sliding Mode and System Zeros

6.6. Nonideal Sliding Mode

6.7. Sliding Surface Design

6.8. State Estimation of Uncertain Systems

6.8.1. Discontinuous Estimators

6.8.2. Boundary Layer Estimators

6.9. Sliding Modes in Solving Optimization Problems

6.9.1. Optimization Problem Statement

6.9.2. Penalty Function Method

6.9.3. Dynamical Gradient Circuit Analysis

7. Vector Field Methods

7.1. A Nonlinear Plant Model

7.2. Controller Form

7.3. Linearizing State-Feedback Control

7.4. Observer Form

7.5. Asymptotic State Estimator

7.6. Combined Controller-Estimator Compensator

8. Fuzzy Systems

8.1. Motivation and Basic Definitions

8.2. Fuzzy Arithmetic and Fuzzy Relations

8.2.1. Interval Arithmetic

8.2.2. Manipulating Fuzzy Numbers

8.2.3. Fuzzy Relations

8.2.4. Composition of Fuzzy Relations

8.3. Standard Additive Model

8.4. Fuzzy Logic Control

8.5. Stabilization Using Fuzzy Models

8.5.1. Fuzzy Modeling

8.5.2. Constructing a Fuzzy Design Model Using a Nonlinear Model

8.5.3. Stabilizability of Fuzzy Models

8.5.4. A Lyapunov-Based Stabilizer

8.6. Stability of Discrete Fuzzy Models

8.7. Fuzzy Estimator

8.7.1. The Comparison Method for Linear Systems

8.7.2. Stability Analysis of the Closed-Loop System

8.8. Adaptive Fuzzy Control

8.8.1. Plant Model and Control Objective

8.8.2. Background Results

8.8.3. Controllers

8.8.4. Examples

9. Neural Networks

9.1. Threshold Logic Unit

9.2. Identification Using Adaptive Linear Element

9.3. Backpropagation

9.4. Neural Fuzzy Identifier

9.5. Radial-Basis Function (RBF) Networks

9.5.1. Interpolation Using RBF Networks

9.5.2. Identification of a Single-Input, Single-State System

9.5.3. Learning Algorithm for the RBF Identifier

9.5.4. Growing RBF Network

9.5.5. Identification of Multivariable Systems

9.6. A Self-Organizing Network

9.7. Hopfield Neural Network

9.7.1. Hopfield Neural Network Modeling and Analysis

9.7.2. Analog-to-Digital Converter

9.8. Hopfield Network Stability Analysis

9.8.1. Hopfield Network Model Analysis

9.8.2. Single Neuron Stability Analysis

9.8.3. Stability Analysis of the Network

9.9. Brain-State-in-a-Box (BSB) Models

9.9.1. Associative Memories

9.9.2. Analysis of BSB Models

9.9.3. Synthesis of Neural Associative Memory

9.9.4. Learning

9.9.5. Forgetting

10. Genetic and Evolutionary Algorithms

10.1. Genetics as an Inspiration for an Optimization Approach

10.2. Implementing a Canonical Genetic Algorithm

10.3. Analysis of the Canonical Genetic Algorithm

10.4. Simple Evolutionary Algorithm (EA)

10.5. Evolutionary Fuzzy Logic Controllers

10.5.1. Vehicle Model and Control Objective

10.5.2. Case 1: EA Tunes Fuzzy Rules

10.5.3. Case 2: EA Tunes Fuzzy Membership Functions

10.5.4. Case 3: EA Tunes Fuzzy Rules and Membership Functions

11. Chaotic Systems and Fractals

11.1. Chaotic Systems Are Dynamical Systems with Wild Behavior

11.2. Chaotic Behavior of the Logistic Equation

11.2.1. The Logistic Equation---An Example From Ecology

11.2.2. Stability Analysis of the Logistic Map

11.2.3. Period Doubling to Chaos

11.2.4. The Feigenbaum Numbers

11.3. Fractals

11.3.1. The Mandelbrot Set

11.3.2. Julia Sets

11.3.3. Iterated Function Systems

11.4. Lyapunov Exponents

11.5. Discretization Chaos

11.6. Controlling Chaotic Systems

11.6.1. Ingredients of Chaotic Control Algorithm

11.6.2. Chaotic Control Algorithm

Appendix: Math Review

A.1. Notation and Methods of Proof

A.2. Vectors

A.3. Matrices and Determinants

A.4. Quadratic Forms

A.5. The Kronecker Product

A.6. Upper and Lower Bounds

A.7. Sequences

A.8. Functions

A.9. Linear Operators

A.10. Vector Spaces

A.11. Least Squares

A.12. Contraction Maps

A.13. First-Order Differential Equation

A.14. Integral and Differential Inequalities

A.14.1. The Bellman-Gronwall Lemma

A.14.2. A Comparison Theorem

A.15. Solving the State Equations

A.15.1. Solution of Uncontrolled System

A.15.2. Solution of Controlled System

A.16. Curves and Surfaces

A.17. Vector Fields and Curve Integrals

A.18. Matrix Calculus Formulas





Product Details

Author: Zak, Stanislaw H.
Publisher: Oxford University Press
Location: New York
Subjects: Engineering - Electrical & Electronic; Engineering - Mechanical; Computer Engineering; Linear control systems; Engineering / Electrical; Technology | Electrical; Engineering and Technology | Electrical and Computer Engineering; Electronics - General; Engineering & Technology | Electrical & Computer Engineering; Artificial Intelligence - Robotics; Engineering & Technology
Edition Description: Includes bibliographical references and index.
Series Volume: no. 13
Grade Level: College/higher education
Illustrations: 314 line illus.
Size and Weight: 7.7 x 9.2 x 1.5 in; 3.056 lb


Related Subjects

Engineering » Mechanical Engineering » General
Science and Mathematics » Biology » Genetics

Systems and Control (03 Edition), Used Hardcover: $134.00, In Stock
720 pages. Oxford University Press, USA. English. ISBN 9780195150117.


Powell's City of Books is an independent bookstore in Portland, Oregon, that fills a whole city block with more than a million new, used, and out of print books. Shop those shelves — plus literally millions more books, DVDs, and gifts — here at Powells.com.