Mathematical theory of control systems design /
Bibliographic Details
Main Author: Afanasiev, V. N. (Valerii Nikolaevich)
Other Authors: Kolmanovskiĭ, Vladimir Borisovich, Nosov, V. R.
Format: Book
Language: English
Published: Dordrecht ; Boston : Kluwer Academic, c1996.
Series: Mathematics and its applications (Kluwer Academic Publishers); v. 341.
Table of Contents:
  • Ch. I. Continuous and Discrete Deterministic Systems
  • Ch. II. Stability of Stochastic Systems
  • Ch. III. Description of Control Problems
  • Ch. IV. The Classical Calculus of Variations and Optimal Control
  • Ch. V. The Maximum Principle
  • Ch. VI. Linear Control Systems
  • Ch. VII. Dynamic Programming Approach. Sufficient Conditions for Optimal Control
  • Ch. VIII. Some Additional Topics of Optimal Control Theory
  • Ch. IX. Control of Stochastic Systems. Problem Statements and Investigation Techniques
  • Ch. X. Optimal Control on a Time Interval of Random Duration
  • Ch. XI. Optimal Estimation of the State of the System
  • Ch. XII. Optimal Control of the Observation Process
  • Ch. XIII. Linear Time-Invariant Control Systems
  • Ch. XIV. Numerical Methods for the Investigation of Nonlinear Control Systems
  • Ch. XV. Numerical Design of Optimal Control Systems.