Nonlinear Control Systems

This series of lectures is intended to provide an introduction to the mathematical theory of nonlinear control systems. It will be accessible to anyone with knowledge of basic calculus and O.D.E. theory.

The main topics covered will be:

1. Definitions and examples of nonlinear control systems

2. Relations with differential inclusions

3. Properties of the set of trajectories

4. Optimal control problems

5. Existence of optimal controls

6. Necessary conditions for optimality: the Pontryagin Maximum Principle

7. Viscosity solutions of Hamilton-Jacobi equations

8. Bellman's Dynamic Programming Principle and sufficient conditions for optimality (see the sketch following this list)
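In connection with topics 7 and 8, the central object is the value function of an optimal control problem, which formally satisfies a Hamilton-Jacobi-Bellman equation. A minimal sketch, in generic notation not fixed by this announcement: for the problem of minimizing

\[
\int_t^T L(x(s), u(s))\, ds + g(x(T))
\]

subject to \(\dot{x}(s) = f(x(s), u(s))\), \(u(s) \in U\), \(x(t) = x\), the value function \(V(t,x)\) formally satisfies

\[
\partial_t V(t,x) + \min_{u \in U} \big\{ \nabla_x V(t,x) \cdot f(x,u) + L(x,u) \big\} = 0,
\qquad V(T,x) = g(x).
\]

Since \(V\) need not be differentiable, this equation is interpreted in the viscosity sense, which is what makes topic 7 a natural prerequisite for topic 8.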

The first lecture will start with basic definitions and examples of control systems and show the essential equivalence between control systems and differential inclusions. Various qualitative properties of the set of trajectories will then be discussed, together with the so-called "bang-bang" theorem.
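To make this equivalence concrete (again in generic notation, not fixed by the announcement): a control system

\[
\dot{x}(t) = f(x(t), u(t)), \qquad u(t) \in U,
\]

determines the differential inclusion

\[
\dot{x}(t) \in F(x(t)), \qquad F(x) = \{\, f(x,u) : u \in U \,\}.
\]

Every trajectory of the control system solves the inclusion and, conversely, under mild regularity assumptions a measurable-selection argument (Filippov's lemma) recovers a measurable control \(u(\cdot)\) from any solution of the inclusion; in this sense the two formulations are essentially equivalent. The "bang-bang" theorem is a statement of a similar flavor: for a linear system \(\dot{x} = Ax + Bu\) with controls \(|u_i| \le 1\), every state reachable by a measurable control is already reachable by a control taking only the extreme values \(u_i \in \{-1, +1\}\).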

