EECS 395, 495: Introduction to Computational Learning Theory

Quarter Offered

Spring: TuTh 9:30-10:50; De


This course will introduce some of the central topics in computational learning theory, a field that approaches the question of whether machines can learn from the perspective of theoretical computer science. We will study well-defined, rigorous mathematical models of learning in which it is possible to give precise analyses of learning problems and algorithms. A major focus of the course will be the computational efficiency of learning in these models: we will develop provably efficient algorithms for some models and explain why such algorithms are unlikely to exist for others.

We will only cover topics that permit a mathematically rigorous analysis, so mathematical maturity is essential. In particular, some familiarity with basic probability (such as linearity of expectation) and basic linear algebra is required. The emphasis of the course is on proofs, so if you take this course, you should enjoy proofs and mathematics.

  • This course satisfies the Theory Depth & AI Depth Requirement.

INSTRUCTOR: Prof. Anindya De

PREREQUISITES: EECS 335/EECS 336 AND EECS 349 (or email the instructor if you have taken upper-level math courses).


TOPICS:

  1. Introduction to computational learning theory; basic models (concept classes, ...)
  2. Online mistake-bound learning model. Online learning algorithms (elimination, perceptron, Winnow); discussion of SVMs
  3. Uniform convergence + VC dimension
  4. PAC model
  5. Weak vs Strong learning: Boosting
  6. Cryptographic hardness of learning.
  7. Learning with membership queries: Decision trees and DNFs
  8. PAC learning with noisy data; statistical query (SQ) learning.
  9. Discrete Fourier methods in learning.
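As a small taste of topic 2, here is an illustrative sketch (not official course material) of the perceptron algorithm run in the online mistake-bound model. The classic theorem states that on a stream of linearly separable examples with margin gamma and norm at most R, the perceptron makes at most (R/gamma)^2 mistakes; the toy data below is hypothetical.

```python
def perceptron(stream):
    """Online perceptron: predict sign(w . x); on a mistake, update w += y*x.

    `stream` yields (x, y) pairs, where x is a list of floats and the
    label y is in {-1, +1}. Returns (final weights, number of mistakes).
    """
    w, mistakes = None, 0
    for x, y in stream:
        if w is None:
            w = [0.0] * len(x)          # start from the zero hypothesis
        dot = sum(wi * xi for wi, xi in zip(w, x))
        pred = 1 if dot > 0 else -1
        if pred != y:                   # mistake: move w toward y*x
            w = [wi + y * xi for wi, xi in zip(w, x)]
            mistakes += 1
    return w, mistakes

# Toy separable stream (hypothetical data): the label is the sign of
# the first coordinate. We make a few passes to simulate a stream.
data = [([1.0, 0.2], 1), ([-1.0, 0.5], -1),
        ([0.8, -0.3], 1), ([-0.9, -0.1], -1)]
w, m = perceptron(data * 3)
```

The mistake bound is a statement about the algorithm on any separable order of examples, which is exactly the kind of worst-case guarantee the online model asks for.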