Decision and Control Systems

Control systems engineering encompasses the identification, modeling, analysis, and control of dynamic systems. Applications range from robotic arms for tele-surgery to models of economic markets, and from temperature control in homes and office buildings to the control of unmanned vehicles, rockets, and robots. Controls analysis and implementation can demand considerable mathematical sophistication: models may be infinite-dimensional (partial differential equations, differential equations with delays), stochastic (stochastic processes and stochastic differential equations), or nonlinear with perturbations, and they may be posed in continuous time or discrete time (difference equations and finite-state machines). Many well-known techniques in optimization, such as dynamic programming, trace their origins to control theory. The area offers a broad range of opportunities in theoretical work, in applications, and at their intersection.
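As a concrete illustration of the connection between dynamic programming and control mentioned above, the sketch below solves a finite-horizon linear-quadratic regulator (LQR) problem for a scalar discrete-time system via the backward Riccati recursion. The system, weights, and horizon are illustrative choices, not drawn from any of the courses listed on this page.

```python
# Finite-horizon LQR for a scalar discrete-time system
#   x[k+1] = a*x[k] + b*u[k]
# minimizing sum_k ( q*x[k]^2 + r*u[k]^2 ) plus a terminal cost q*x[N]^2.
# Dynamic programming yields a backward Riccati recursion for the
# cost-to-go weight P and the optimal feedback gains K[k].

def lqr_gains(a, b, q, r, horizon):
    """Backward Riccati recursion; returns gains K[0..horizon-1]."""
    P = q  # terminal cost-to-go weight
    gains = []
    for _ in range(horizon):
        K = a * b * P / (r + b * b * P)
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
        gains.append(K)
    gains.reverse()  # gains[k] is the gain applied at stage k
    return gains

def simulate(a, b, gains, x0):
    """Roll out the closed loop u[k] = -K[k]*x[k] from x0."""
    x = x0
    traj = [x]
    for K in gains:
        u = -K * x
        x = a * x + b * u
        traj.append(x)
    return traj

# An open-loop-unstable plant (a = 1.2 > 1) regulated toward the origin.
traj = simulate(1.2, 1.0, lqr_gains(1.2, 1.0, 1.0, 1.0, 20), x0=5.0)
print(abs(traj[-1]) < 1e-3)  # → True
```

The same recursion generalizes directly to vector states, where `P` becomes a matrix and the division becomes a matrix inverse; that multivariable case is the subject of courses such as SE 424/ECE 515.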

The suggested list of courses is a recommendation; graduate students should meet with their advisor each semester to finalize their course plans.

Required Courses

  • SE 424 State Space Design for Control or ECE 515 Control System Theory & Design
  • SE 520/ECE 528/ME 546 Analysis of Nonlinear Systems


Modeling, Identification, and Control

  • SE 521/AE 555 Multivariable Control Design
  • SE 523 Discrete Event Dynamic Systems
  • SE 524 Data-Based Systems Modeling

Robust and Optimal Control

  • SE 525 Control of Complex Systems
  • AE 556 Robust Control
  • ECE 553 Optimum Control Systems

Adaptive and Stochastic Control

  • ECE 517 Nonlinear & Adaptive Control
  • ECE 555 Control of Stochastic Systems
  • ME 562 Robust Adaptive Control

Control and Optimization

  • IE 510 Applied Nonlinear Programming
  • IE 521 Convex Optimization
  • ECE 490 Introduction to Optimization
  • ECE 580 Optimization by Vector Space Methods
  • ME 561 Convex Methods in Control