Exam summary: Dynamic Programming and Optimal Control

In the autumn semester of 2018 I took the course Dynamic Programming and Optimal Control. The summary I brought to the exam is available here, both as a PDF and as LaTeX source. Here’s an overview of the topics the course covered:

  1. Introduction to Dynamic Programming
    1. Problem statement
    2. Open-loop and closed-loop control
    3. Discrete-state and finite-state problems
  2. The Dynamic Programming Algorithm (DPA)
    1. The standard problem formulation
    2. The principle of optimality
    3. The Dynamic Programming Algorithm
    4. Is expected value a good choice for the cost?
  3. The Dynamic Programming Algorithm (cont’d)
    1. Converting non-standard problems to standard form
  4. Infinite horizon problems
    1. The Stochastic Shortest Path (SSP) problem
    2. Theorem: SSP and the Bellman Equation (BE)
  5. Solving the Bellman Equation
    1. Value Iteration (VI)
    2. Policy Iteration (PI)
    3. Analogy and comparison between VI and PI
    4. Variants of VI and PI
    5. Connections to Linear Algebra
    6. Linear Programming (LP)
  6. Discounted Problems
  7. Shortest Path problems and Deterministic Finite State systems
    1. The Shortest Path (SP) problem
    2. The Deterministic Finite State (DFS) problem
    3. Equivalence of SP and DFS
    4. Hidden Markov Models (HMM) and the Viterbi algorithm
  8. Shortest Path algorithms
    1. Label-correcting algorithms (LCA)
  9. Deterministic continuous-time optimal control and the Hamilton-Jacobi-Bellman equation
    1. The Hamilton-Jacobi-Bellman (HJB) equation
  10. Pontryagin’s minimum principle
    1. Notation
    2. The minimum principle
    3. Fixed terminal state
    4. Free initial state
    5. Free terminal time
    6. Time-varying system and cost
    7. Singular problems
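
To give a flavour of topic 5 (solving the Bellman equation), here is a minimal sketch of Value Iteration on a small, made-up discounted MDP. The two-state, two-action problem data below is purely illustrative and not from the course; the iteration itself is the standard one: repeatedly apply the Bellman operator until the value vector stops changing.

```python
import numpy as np

# Hypothetical 2-state, 2-action discounted MDP, for illustration only.
# P[u] is the transition probability matrix under control u; g[u] the stage cost.
P = {
    0: np.array([[0.9, 0.1], [0.2, 0.8]]),
    1: np.array([[0.5, 0.5], [0.4, 0.6]]),
}
g = {
    0: np.array([1.0, 2.0]),
    1: np.array([1.5, 0.5]),
}
alpha = 0.9  # discount factor, 0 < alpha < 1


def value_iteration(P, g, alpha, tol=1e-8, max_iter=10_000):
    """Iterate the Bellman operator T until the value vector converges.

    (T J)(i) = min_u [ g(i, u) + alpha * sum_j P_ij(u) J(j) ]
    """
    J = np.zeros(len(g[0]))
    for _ in range(max_iter):
        TJ = np.min([g[u] + alpha * P[u] @ J for u in P], axis=0)
        if np.max(np.abs(TJ - J)) < tol:
            return TJ
        J = TJ
    return J


J_star = value_iteration(P, g, alpha)
```

For a discount factor `alpha < 1` the Bellman operator is a contraction, so the iterates converge to the unique fixed point `J*` regardless of the starting guess; Policy Iteration reaches the same `J*` by alternating policy evaluation and policy improvement instead.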