Optimal Control and Estimation (Dover Books on Advanced Mathematics) by Robert F. Stengel
Synopses & Reviews

Publisher Comments: "An excellent introduction to optimal control and estimation theory and its relationship with LQG design. . . . invaluable as a reference for those already familiar with the subject." —Automatica

This highly regarded graduate-level text provides a comprehensive introduction to optimal control theory for stochastic systems, emphasizing application of its basic concepts to real problems. The first two chapters introduce optimal control and review the mathematics of control and estimation. Chapter 3 addresses optimal control of systems that may be nonlinear and time-varying, but whose inputs and parameters are known without error. Chapter 4 presents methods for estimating the dynamic states of a system that is driven by uncertain forces and is observed with random measurement error. Chapter 5 discusses the general problem of stochastic optimal control, and the concluding chapter covers linear time-invariant systems.

Robert F. Stengel is Professor of Mechanical and Aerospace Engineering at Princeton University, where he directs the Topical Program on Robotics and Intelligent Systems and the Laboratory for Control and Automation. He was a principal designer of the Project Apollo Lunar Module control system.

"An excellent teaching book with many examples and worked problems which would be ideal for self-study or for use in the classroom. . . . The book also has a practical orientation and would be of considerable use to people applying these techniques in practice." —Short Book Reviews, Publication of the International Statistical Institute

"An excellent book which guides the reader through most of the important concepts and techniques. . . . A useful book for students (and their teachers) and for those practicing engineers who require a comprehensive reference to the subject." —Library Reviews, The Royal Aeronautical Society

Book News Annotation: Reprint of the respected Wiley edition originally published in 1986.
Annotation © Book News, Inc., Portland, OR (booknews.com)

Synopsis: Graduate-level text provides an introduction to optimal control theory for stochastic systems, emphasizing the application of basic concepts to real problems. "An excellent introduction to optimal control and estimation theory and its relationship with LQG design. . . . invaluable as a reference for those already familiar with the subject." —Automatica

Table of Contents

1. INTRODUCTION
1.1 Framework for Optimal Control
1.2 Modeling Dynamic Systems
1.3 Optimal Control Objectives
1.4 Overview of the Book
Problems
References

2. THE MATHEMATICS OF CONTROL AND ESTIMATION
2.1 Scalars, Vectors, and Matrices
    Scalars; Vectors; Matrices; Inner and Outer Products; Vector Lengths, Norms, and Weighted Norms; Stationary, Minimum, and Maximum Points of a Scalar Variable (Ordinary Maxima and Minima); Constrained Minima and Lagrange Multipliers
2.2 Matrix Properties and Operations
    Inverse Vector Relationship; Matrix Determinant; Adjoint Matrix; Matrix Inverse; Generalized Inverses; Transformations; Differentiation and Integration; Some Matrix Identities; Eigenvalues and Eigenvectors; Singular Value Decomposition; Some Determinant Identities
2.3 Dynamic System Models and Solutions
    Nonlinear System Equations; Local Linearization; Numerical Integration of Nonlinear Equations; Numerical Integration of Linear Equations; Representation of Data
2.4 Random Variables, Sequences, and Processes
    Scalar Random Variables; Groups of Random Variables; Scalar Random Sequences and Processes; Correlation and Covariance Functions; Fourier Series and Integrals; Special Density Functions of Random Processes; Spectral Functions of Random Sequences; Multivariate Statistics
2.5 Properties of Dynamic Systems
    Static and Quasistatic Equilibrium; Stability; Modes of Motion for Linear, Time-Invariant Systems; Reachability, Controllability, and Stabilizability; Constructability, Observability, and Detectability; Discrete-Time Systems
2.6 Frequency Domain Modeling and Analysis
    Root Locus; Frequency-Response Function and Bode Plot; Nyquist Plot and Stability Criterion; Effects of Sampling
Problems
References

3. OPTIMAL TRAJECTORIES AND NEIGHBORING-OPTIMAL SOLUTIONS
3.1 Statement of the Problem
3.2 Cost Functions
3.3 Parametric Optimization
3.4 Conditions for Optimality
    Necessary Conditions for Optimality; Sufficient Conditions for Optimality; The Minimum Principle; The Hamilton-Jacobi-Bellman Equation
3.5 Constraints and Singular Control
    Terminal State Equality Constraints; Equality Constraints on the State and Control; Inequality Constraints on the State and Control; Singular Control
3.6 Numerical Optimization
    Penalty Function Method; Dynamic Programming; Neighboring Extremal Method; Quasilinearization Method; Gradient Methods
3.7 Neighboring-Optimal Solutions
    Continuous Neighboring-Optimal Control; Dynamic Programming Solution for Continuous Linear-Quadratic Control; Small Disturbances and Parameter Variations
Problems
References

4. OPTIMAL STATE ESTIMATION
4.1 Least-Squares Estimates of Constant Vectors
    Least-Squares Estimator; Weighted Least-Squares Estimator; Recursive Least-Squares Estimator
4.2 Propagation of the State Estimate and Its Uncertainty
    Discrete-Time Systems; Sampled-Data Representation of Continuous-Time Systems; Continuous-Time Systems; Simulating Cross-Correlated White Noise
4.3 Discrete-Time Optimal Filters and Predictors
    Kalman Filter; Linear-Optimal Predictor; Alternative Forms of the Linear-Optimal Filter
4.4 Correlated Disturbance Inputs and Measurement Noise
    Cross-Correlation of Disturbance Input and Measurement Noise; Time-Correlated Measurement Noise
4.5 Continuous-Time Optimal Filters and Predictors
    Kalman-Bucy Filter; Duality; Linear-Optimal Predictor; Alternative Forms of the Linear-Optimal Filter; Correlation in Disturbance Inputs and Measurement Noise
4.6 Optimal Nonlinear Estimation
    Neighboring-Optimal Linear Estimator; Extended Kalman-Bucy Filter; Quasilinear Filter
4.7 Adaptive Filtering
    Parameter-Adaptive Filtering; Noise-Adaptive Filtering; Multiple-Model Estimation
Problems
References

5. STOCHASTIC OPTIMAL CONTROL
5.1 Nonlinear Systems with Random Inputs and Perfect Measurements
    Stochastic Principle of Optimality for Nonlinear Systems; Stochastic Principle of Optimality for Linear-Quadratic Problems; Neighboring-Optimal Control; Evaluation of the Variational Cost Function
5.2 Nonlinear Systems with Random Inputs and Imperfect Measurements
    Stochastic Principle of Optimality; Dual Control; Neighboring-Optimal Control
5.3 The Certainty-Equivalence Property of Linear-Quadratic-Gaussian Controllers
    The Continuous-Time Case; The Discrete-Time Case; Additional Cases Exhibiting Certainty Equivalence
5.4 Linear, Time-Invariant Systems with Random Inputs and Imperfect Measurements
    Asymptotic Stability of the Linear-Quadratic Regulator; Asymptotic Stability of the Kalman-Bucy Filter; Asymptotic Stability of the Stochastic Regulator; Steady-State Performance of the Stochastic Regulator; The Discrete-Time Case
Problems
References

6. LINEAR MULTIVARIABLE CONTROL
6.1 Solution of the Algebraic Riccati Equation
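The central result of Chapter 4 is the Kalman filter, which blends a model-based prediction with a noisy measurement in proportion to their uncertainties. As an illustrative sketch only (not code from the book), here is the scalar predict/update cycle for a random-walk state; the variances `q` and `r` and all simulation values are assumptions made for this example:

```python
import random

def kalman_step(x_est, p_est, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.

    State model:   x[k+1] = x[k] + w,  Var(w) = q
    Measurement:   z[k]   = x[k] + v,  Var(v) = r
    """
    # Predict: a random-walk state carries over unchanged, and its
    # error variance grows by the process-noise variance.
    x_pred = x_est
    p_pred = p_est + q
    # Update: the Kalman gain weights the measurement residual by the
    # relative uncertainty of the prediction versus the measurement.
    gain = p_pred / (p_pred + r)
    x_new = x_pred + gain * (z - x_pred)
    p_new = (1.0 - gain) * p_pred
    return x_new, p_new

random.seed(0)
truth = 5.0
x, p = 0.0, 100.0  # poor initial guess, large initial uncertainty
for _ in range(200):
    z = truth + random.gauss(0.0, 0.5)  # noisy measurement
    x, p = kalman_step(x, p, z, q=1e-6, r=0.25)
print(round(x, 2))  # estimate settles near the true value of 5.0
```

With `q` near zero the state is nearly constant, so repeated cycles shrink the error variance `p` toward zero and the filter behaves like a recursive average of the measurements.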
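Chapter 6 opens with the algebraic Riccati equation, whose steady-state solution yields the linear-quadratic regulator gain. The scalar discrete-time case is small enough to sketch by backward iteration; the plant parameters `a`, `b` and weights `q`, `r` below are arbitrary illustrative choices, not an example drawn from the book:

```python
def dlqr_gain(a, b, q, r, iters=500):
    """Iterate the scalar discrete-time Riccati equation to steady state
    and return the optimal feedback gain and Riccati solution.

    Plant:  x[k+1] = a*x[k] + b*u[k]
    Cost:   sum over k of q*x[k]**2 + r*u[k]**2
    """
    p = q  # terminal condition of the backward recursion
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    gain = a * b * p / (r + b * b * p)  # control law: u[k] = -gain * x[k]
    return gain, p

# An unstable plant (|a| > 1); the regulator must stabilize it.
gain, p = dlqr_gain(a=1.1, b=0.5, q=1.0, r=1.0)
closed_loop = 1.1 - 0.5 * gain
print(abs(closed_loop) < 1.0)  # prints True: the closed loop is stable
```

The same backward recursion carries over to the matrix case, where `p` becomes the Riccati matrix and the division becomes a matrix inverse.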


