ENSC 883-3: Optimal Control Theory

Course Description

Review of finite-dimensional linear systems represented in state-space formulation. Bellman's Principle of Optimality and dynamic programming, with applications to the control of discrete- and continuous-time systems. Introduction to the calculus of variations, Pontryagin's Maximum Principle, the Hamilton-Jacobi-Bellman equation, and the variational treatment of control problems. Several optimal control problems, such as the optimal Linear Quadratic Regulator (LQR), optimal tracking, and suboptimal output controllers, will be discussed.
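
As an illustrative sketch of the kind of problem treated in the course (not part of the official description), the standard infinite-horizon LQR problem for a linear system can be stated as

$$
\min_{u(\cdot)} \; J = \int_0^\infty \bigl( x^\top Q x + u^\top R u \bigr)\, dt
\quad \text{subject to} \quad \dot{x} = A x + B u,
$$

with $Q \succeq 0$ and $R \succ 0$. The optimal feedback is $u^* = -R^{-1} B^\top P x$, where $P$ solves the algebraic Riccati equation

$$
A^\top P + P A - P B R^{-1} B^\top P + Q = 0.
$$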

Prerequisites

Successful completion of ENSC 483-4 and ENSC 801-3 is required for students wishing to take this course.

Additional Information for ENSC 883