EECS 463: Adaptive Filtering and Estimation

Quarter Offered

Spring: 11-12:20 TuTh; Honig


Applications of adaptive filtering, autoregressive and moving average processes, linear prediction, lattice filters, Least Mean Square (LMS) algorithm, least squares filtering, Kalman filter, convergence analysis.

COURSE DIRECTOR: Prof. Michael Honig

REQUIRED TEXT: S. Haykin, "Adaptive Filter Theory", Prentice-Hall, 2002.

COURSE GOALS: To provide first-year graduate students with an understanding of adaptive filtering applications, structures, algorithms, and performance.

PREREQUISITES BY COURSES: EECS 359, EECS 395: Probabilistic Systems


PREREQUISITES BY TOPICS:

  1. Probability and random processes
  2. Frequency-domain (spectral) analysis
  3. Familiarity with z-transforms


DETAILED COURSE TOPICS:

  1. Applications of adaptive filters
  2. Autoregressive and Moving Average processes
  3. Linear prediction and joint process estimation
  4. Lattice filters
  5. Gradient and stochastic gradient (Least Mean Square) algorithms
  6. Least squares filtering
  7. Kalman filter
  8. Convergence analysis
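As a preview of topic 5, the LMS algorithm updates an FIR filter's weights along the negative instantaneous-gradient direction. A minimal sketch in Python (the signal names, tap count, and step size here are illustrative, not taken from the course materials):

```python
import numpy as np

def lms(x, d, num_taps, mu):
    """Least Mean Square (stochastic gradient) adaptive filter.

    x        : input signal
    d        : desired signal (same length as x)
    num_taps : number of FIR filter coefficients
    mu       : step size; for convergence in the mean, roughly
               0 < mu < 2 / (num_taps * input power)
    """
    w = np.zeros(num_taps)                       # adaptive weight vector
    e = np.zeros(len(x))                         # error sequence
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]    # tap inputs, most recent first
        y = w @ u                                # filter output
        e[n] = d[n] - y                          # estimation error
        w = w + mu * e[n] * u                    # stochastic-gradient update
    return w, e

# Illustrative use: identify an unknown FIR system from input/output data.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2])                   # hypothetical unknown system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]
w, e = lms(x, d, num_taps=3, mu=0.02)            # w converges toward h
```

In this noiseless system-identification setting the error sequence decays toward zero and the weights converge to the true impulse response; with observation noise, the step size trades convergence speed against steady-state misadjustment, which is one of the design constraints studied under topics 5 and 8.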

GRADES: A weighted combination of homework, midterm, and final.

COURSE OBJECTIVES: When a student completes this course, s/he should be able to:

  1. Compute optimal linear prediction filters from second-order input statistics.
  2. Design an LMS algorithm to meet convergence and steady-state performance constraints.
  3. Design an adaptive lattice filter, both for prediction and joint-process estimation.
  4. Design recursive Least Squares and Kalman filters for different applications.
  5. Specify convergence and steady-state performance of the preceding techniques by either analysis or simulation.
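Objective 1, for example, amounts to solving the Wiener-Hopf (normal) equations: for a stationary input with autocorrelation sequence r, the order-p one-step predictor weights satisfy R w = r_vec, where R is the Toeplitz autocorrelation matrix. A minimal Python sketch (the AR(1) example values are illustrative assumptions, not course data):

```python
import numpy as np

def predictor(r, p):
    """Order-p one-step linear predictor from autocorrelations r[0..p].

    Solves the Wiener-Hopf (normal) equations R w = r_vec, where
    R[i, j] = r[|i - j|] and r_vec = (r[1], ..., r[p]).
    """
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    r_vec = np.array(r[1 : p + 1])
    w = np.linalg.solve(R, r_vec)
    mmse = r[0] - w @ r_vec          # minimum mean-square prediction error
    return w, mmse

# Illustrative use: for an AR(1) process with r[k] = 0.8**k, the optimal
# predictor uses only the most recent sample, w = (0.8, 0, 0).
r = [0.8 ** k for k in range(4)]
w, mmse = predictor(r, 3)
```

For a genuinely autoregressive input of order 1, the higher-order predictor coefficients come out zero and the residual error equals the driving-noise variance, which connects directly to the AR modeling and linear prediction topics above. (In practice the Toeplitz structure is exploited via the Levinson-Durbin recursion, which also yields the lattice-filter reflection coefficients of objective 3.)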