ELEC_ENG 433: Statistical Pattern Recognition



Prerequisites

ELEC_ENG 302 or equivalent.

Description

CATALOG DESCRIPTION: Fundamental and advanced topics in statistical pattern recognition, including Bayesian decision theory, maximum-likelihood and Bayesian estimation, nonparametric density estimation, component analysis and discriminants, kernel machines, feature selection, dimension reduction and embedding, boosting, minimum description length, mixture models and clustering, spectral clustering, Bayesian networks and hidden Markov models, with applications to image and video pattern recognition.

REQUIRED TEXT: None

REFERENCE TEXT: R. Duda, P. Hart, and D. Stork, Pattern Classification, 2nd Edition, Wiley-Interscience, 2001.

COURSE DIRECTOR: Prof. Ying Wu

COURSE GOALS: To gain a deep understanding of the theories, algorithms, and mathematical approaches of state-of-the-art statistical pattern recognition, and of their applications to image and video pattern analysis and recognition. This is a research-oriented course.

PREREQUISITES BY COURSES: ELEC_ENG 302 or equivalent.

PREREQUISITES BY TOPIC:

  1. Linear algebra
  2. Probability theory
  3. Signals and systems
  4. C/C++ or MATLAB

DETAILED COURSE TOPICS:

  1. Introduction to pattern recognition systems and problems.
  2. Bayesian decision theory, Minimum-error-rate classification, Chernoff bound and Bhattacharyya bound, Missing features. (A minimal sketch of the Bayes decision rule follows this list.)
  3. Maximum-likelihood estimation, Bayesian estimation, Sufficient statistics and exponential family, Overfitting, and Expectation-Maximization (EM).
  4. Principal component analysis (PCA), Linear discriminant analysis (LDA), and Independent Component analysis (ICA).
  5. Nonparametric density estimation, Parzen Windows, Mean-shift, Nearest-neighbor classification, Metric learning.
  6. Support vector machines (SVM), Kernel machines, Maximum margin classification, Generalizability, and VC dimension.
  7. Feature selection, Dimension reduction and embedding, ISOMAP, Local linear embedding (LLE), Multidimensional scaling (MDS), Manifold learning.
  8. Boosting, Bagging, Bootstrapping, Cross validation, and Component classifiers.
  9. Bayesian networks, dynamic Bayesian networks, Hidden Markov models, and Markov random fields.
  10. Mixture models and clustering, Spectral clustering, Hierarchical clustering.
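
As an illustration of topic 2 above, the following is a minimal Python/NumPy sketch of minimum-error-rate (Bayes) classification with Gaussian class-conditional densities. It is not part of the official course materials; the data, priors, and parameter estimates are synthetic and purely illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic two-class training data; in practice these are labeled samples.
    X0 = rng.normal([0.0, 0.0], 1.0, (200, 2))
    X1 = rng.normal([2.0, 2.0], 1.0, (200, 2))
    priors = np.array([0.5, 0.5])

    # Maximum-likelihood estimates of each class mean and covariance.
    params = [(X.mean(0), np.cov(X.T)) for X in (X0, X1)]

    def log_gaussian(x, m, S):
        # Log of the multivariate normal density N(x; m, S).
        d = x - m
        return -0.5 * (d @ np.linalg.solve(S, d)
                       + np.log(np.linalg.det(S)) + len(m) * np.log(2 * np.pi))

    def classify(x):
        # Bayes decision rule: choose the class maximizing
        # log p(x|w) + log P(w), which minimizes the probability of error.
        scores = [log_gaussian(x, m, S) + np.log(p)
                  for (m, S), p in zip(params, priors)]
        return int(np.argmax(scores))

    print(classify(np.array([0.2, -0.1])))  # expected: 0
    print(classify(np.array([2.1, 1.8])))   # expected: 1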

GRADES:

  1. Homework and labs: 30%
  2. Final project and presentations: 70%

COURSE OBJECTIVES: 

Upon completing this course, students should be able to:

  1. Understand the core theories and algorithms of statistical pattern recognition;
  2. Understand the state of the art in statistical pattern recognition;
  3. Perform parametric classifier design;
  4. Perform nonparametric classifier design;
  5. Perform feature selection and dimension reduction;
  6. Perform unsupervised data clustering;
  7. Understand applications such as face recognition, face detection, object detection, gesture recognition, and speech recognition.

COURSE SCHEDULE (TENTATIVE):

Week 1: Intro and Bayesian classification

            L1: Intro (classification, regression, and density estimation; features and feature selection in pattern recognition; applications such as OCR, speech, objects, actions, and gestures; generative vs. discriminative models; Bayesian vs. non-Bayesian approaches)

            L2: Bayesian decision, Minimum-error-rate classification, Chernoff bound, Bhattacharyya bound, ML/Bayesian estimation

Week 2: PCA/LDA/ICA

            L3: PCA/LDA

            L4: ICA
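
A minimal NumPy sketch of PCA by eigendecomposition of the sample covariance, for orientation only (the data and the target dimension k are illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))     # 500 samples, 5 features (synthetic)
    Xc = X - X.mean(axis=0)           # center the data

    C = Xc.T @ Xc / (len(Xc) - 1)     # sample covariance matrix (5 x 5)
    evals, evecs = np.linalg.eigh(C)  # eigh returns ascending eigenvalues
    order = np.argsort(evals)[::-1]   # sort directions by explained variance

    k = 2
    W = evecs[:, order[:k]]           # top-k principal directions
    Z = Xc @ W                        # projected (embedded) data, 500 x 2
    print(Z.shape, evals[order[:k]])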

Week 3: Nearest Neighbor Classifier

            L5: K-NN classification

            L6: Advanced NN
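
A minimal sketch of k-nearest-neighbor classification with Euclidean distance (synthetic data; the choice k = 5 is illustrative):

    import numpy as np

    def knn_predict(Xtr, ytr, x, k=5):
        # Vote among the k training points closest to x.
        d = np.linalg.norm(Xtr - x, axis=1)
        nearest = np.argsort(d)[:k]
        return np.bincount(ytr[nearest]).argmax()

    rng = np.random.default_rng(2)
    Xtr = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
    ytr = np.array([0] * 100 + [1] * 100)
    print(knn_predict(Xtr, ytr, np.array([2.5, 3.0])))  # expected: 1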

Week 4: Nonparametric density estimation and clustering

            L7: Parzen window, Mean-shift

            L8: Mixture models, EM, spectral clustering
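
A minimal sketch of EM for a two-component one-dimensional Gaussian mixture (initialization and iteration count are illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(2, 1, 300)])

    mu = np.array([-1.0, 1.0])   # initial component means
    var = np.array([1.0, 1.0])   # initial component variances
    w = np.array([0.5, 0.5])     # initial mixing weights

    def gauss(x, m, v):
        return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

    for _ in range(50):
        # E-step: responsibility of each component for each data point.
        r = w[:, None] * gauss(x[None, :], mu[:, None], var[:, None])
        r /= r.sum(axis=0)
        # M-step: re-estimate parameters from responsibility-weighted data.
        Nk = r.sum(axis=1)
        mu = (r * x).sum(axis=1) / Nk
        var = (r * (x - mu[:, None]) ** 2).sum(axis=1) / Nk
        w = Nk / len(x)

    print(mu, w)  # means should approach -2 and +2 (order may swap)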

Week 5: Kernel machines

            L9: SVM

            L10: Kernel machines
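
To illustrate the kernel idea without a full quadratic-programming solver, here is a sketch of regularized least squares with an RBF kernel (kernel ridge regression on +/-1 labels); a true SVM would instead solve the maximum-margin dual problem. The data and the kernel width gamma are illustrative.

    import numpy as np

    def rbf(A, B, gamma=0.5):
        # K[i, j] = exp(-gamma * ||a_i - b_j||^2)
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(4)
    X = rng.uniform(-2, 2, (200, 2))
    y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 < 2, 1.0, -1.0)  # circular boundary

    lam = 1e-3  # ridge regularization strength
    alpha = np.linalg.solve(rbf(X, X) + lam * np.eye(len(X)), y)

    def predict(Xnew):
        # The decision function is a kernel expansion over training points.
        return np.sign(rbf(Xnew, X) @ alpha)

    print(predict(np.array([[0.0, 0.0], [2.0, 2.0]])))  # expected: [ 1. -1.]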

Week 6: Dimension reduction and embedding

            L11: ISOMAP

            L12: LLE and MDS
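
A minimal sketch of classical MDS, which recovers a low-dimensional embedding from pairwise distances by double-centering; ISOMAP applies the same step to graph (geodesic) distances rather than Euclidean ones. The hidden points here are synthetic.

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(100, 3))                  # hidden 3-D points
    D2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)  # squared pairwise distances

    n = len(D2)
    J = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    B = -0.5 * J @ D2 @ J                          # double-centered Gram matrix

    evals, evecs = np.linalg.eigh(B)
    order = np.argsort(evals)[::-1][:2]            # keep the top-2 components
    Y = evecs[:, order] * np.sqrt(np.maximum(evals[order], 0))
    print(Y.shape)                                 # (100, 2) embedding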

Week 7: Feature selection and Boosting

            L13: Boosting

            L14: Advanced boosting
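
A minimal sketch of AdaBoost with decision stumps on one-dimensional data (the number of rounds, candidate thresholds, and noise level are illustrative):

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.uniform(0, 1, 200)
    y = np.where(x > 0.5, 1, -1)      # true rule: threshold at 0.5
    y[rng.random(200) < 0.1] *= -1    # add 10% label noise

    w = np.ones(200) / 200            # uniform initial sample weights
    stumps = []
    for _ in range(20):
        # Pick the stump (threshold, sign) with the smallest weighted error.
        t, s = min(((t, s) for t in np.linspace(0, 1, 101) for s in (1, -1)),
                   key=lambda ts: w[np.where(x > ts[0], ts[1], -ts[1]) != y].sum())
        pred = np.where(x > t, s, -s)
        err = w[pred != y].sum()
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified points
        w /= w.sum()
        stumps.append((t, s, alpha))

    F = sum(a * np.where(x > t, s, -s) for t, s, a in stumps)
    print("training accuracy:", (np.sign(F) == y).mean())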

Week 8: Generative models: BN and HMM

            L15: Bayesian networks

            L16: HMM
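
A minimal sketch of the HMM forward algorithm, which computes the probability of a discrete observation sequence by the standard recursion (all parameters below are illustrative, not from the course):

    import numpy as np

    A = np.array([[0.7, 0.3],    # state-transition matrix
                  [0.4, 0.6]])
    B = np.array([[0.9, 0.1],    # emission probabilities P(obs | state)
                  [0.2, 0.8]])
    p0 = np.array([0.5, 0.5])    # initial state distribution

    obs = [0, 0, 1, 0]           # observed symbol sequence

    alpha = p0 * B[:, obs[0]]    # forward variable at t = 0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # forward recursion
    print("P(observations) =", alpha.sum())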

Week 9: Generative models: MRF and CRF

            L17: MRF

            L18: CRF
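
A minimal sketch of MAP inference in a binary (Ising) MRF by iterated conditional modes (ICM), applied to toy image denoising; the coupling constants beta and eta and the noise level are illustrative.

    import numpy as np

    rng = np.random.default_rng(7)
    clean = np.ones((20, 20))
    clean[:, :10] = -1                       # two-region +/-1 image
    noisy = np.where(rng.random(clean.shape) < 0.15, -clean, clean)

    beta, eta = 1.0, 2.0                     # smoothness vs. data fidelity
    x = noisy.copy()
    for _ in range(5):                       # ICM sweeps
        for i in range(20):
            for j in range(20):
                nb = sum(x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < 20 and 0 <= b < 20)
                # Set each pixel to the label minimizing its local energy
                # -beta * x_ij * (neighbor sum) - eta * x_ij * y_ij.
                x[i, j] = 1.0 if beta * nb + eta * noisy[i, j] > 0 else -1.0
    print("fraction of pixels recovered:", (x == clean).mean())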

Week 10: (optional) PAC learnability and VC dimension

Week 11: (optional) Learning Gibbs distributions