EECS 396, 496: Machine Learning: Foundations, Applications, and Algorithms

Quarter Offered

Spring: MWF 10-10:50; Katsaggelos

Prerequisites

A thorough understanding of linear algebra and vector calculus (e.g., students should be able to easily compute gradients and Hessians of a multivariate function), as well as a basic understanding of the Python or MATLAB/Octave programming environments.
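As a rough calibration of this expected level (an illustrative example of ours, not taken from the course materials): for a quadratic function with a symmetric matrix A, students should be able to derive by hand that

```latex
f(\mathbf{x}) = \mathbf{x}^{\top} A \mathbf{x} + \mathbf{b}^{\top}\mathbf{x},
\qquad
\nabla f(\mathbf{x}) = 2A\mathbf{x} + \mathbf{b},
\qquad
\nabla^{2} f(\mathbf{x}) = 2A.
```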

Description

From robotics, speech recognition, and analytics to finance and social network analysis, machine learning has become one of the most useful sets of scientific tools of our age. This course aims to bring interested students and researchers from a wide array of disciplines up to speed on the power and wide applicability of machine learning. The ultimate aim of the course is to equip you with all the modeling and optimization tools you will need to formulate and solve problems of interest in a machine learning framework. We hope to build these skills through lectures and reading materials that introduce machine learning in the context of its many applications, and that describe in a detailed but user-friendly manner the modern techniques from nonlinear optimization used to solve them. In addition to a well-curated collection of reference materials, registered students will receive a draft of a forthcoming manuscript on machine learning, authored by the instructors, to use as class notes.

  • This course counts towards the AI breadth requirement for both undergraduate and graduate students. Students may receive credit for both this course and EECS 349.
  • This course is cross-listed with Data_Sci 423.

REQUIRED TEXT: J. Watt, R. Borhani, and A. K. Katsaggelos, Machine Learning Refined: Foundations, Algorithms, and Applications, Cambridge University Press, 2016.

COURSE INSTRUCTOR: Prof. Aggelos Katsaggelos

COURSE OUTLINE:

  1. Introduction
    1. What kinds of things can you build with machine learning tools?
    2. How does machine learning work? The 5-minute elevator pitch edition
    3. Predictive models – our basic building blocks
    4. Feature design and learning – what makes things distinct?
    5. Numerical optimization – the workhorse of machine learning
  2. Fundamentals of numerical optimization
    1. Calculus-defined optimality
    2. Using calculus to build useful algorithms
    3. Gradient descent
    4. Newton’s method
  3. Regression
    1. Linear regression - applications in climate science, feature selection, compression, neuroscience, and marketing
    2. Knowledge-driven feature design for regression
    3. Nonlinear regression
    4. The L2 regularizer
  4. Classification
    1. The perceptron
    2. Logistic regression/Support Vector Machines
    3. Multiclass classification
    4. Knowledge-driven feature design for classification - examples from computer vision (object/face detection and recognition), text mining, and speech recognition
  5. Feature learning
    1. Function approximation and bases of features
    2. Feed-forward neural network bases, deep learning, and kernels
    3. Cross-validation
  6. Special topics
    1. Step length determination for gradient methods
    2. Advanced gradient descent schemes: stochastic gradient descent and momentum
    3. Dimension reduction: K-means clustering and Principal Component Analysis
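To give a flavor of how the optimization topics (gradient descent, outline item 2) connect to the modeling topics (linear regression, outline item 3), here is a minimal sketch of ours in Python; it is an illustrative example, not code taken from the course or the text:

```python
import numpy as np

def gradient_descent_least_squares(X, y, step=0.5, iters=2000):
    """Minimize the least-squares cost g(w) = (1/N) * ||Xw - y||^2
    by plain gradient descent with a fixed step length."""
    N, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        grad = (2.0 / N) * X.T @ (X @ w - y)  # gradient of g at w
        w = w - step * grad                   # descent step
    return w

# Tiny example: recover the line y = 1 + 3x from noiseless data.
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])  # bias column plus input
y = 1.0 + 3.0 * x
w = gradient_descent_least_squares(X, y)
print(np.round(w, 3))  # close to [1., 3.]
```

For this quadratic cost the fixed step must be smaller than 2 divided by the largest eigenvalue of the Hessian to converge; systematic step-length determination is covered under special topics in the outline.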

PROBLEM SETS:

Weekly problem sets will be assigned and graded. Homework will be assigned on Fridays and will be due the following Friday. Late homework will not be accepted.

EXAMS:

There are no exams in this course.

COURSE GRADE:

Final grades for the course will be based entirely on homework assignment grades. 

EXTRA CREDIT OPPORTUNITIES:

Up to 1 percentage point of extra credit can be earned by the first student to report a given error found in the class text. Additional extra credit will be considered for constructive suggestions for improving the text.