COMP_SCI 496: Topics in Theoretical Machine Learning



Prerequisites

You must have a solid background in multivariate calculus, linear algebra, basic probability, and algorithms, as well as general mathematical maturity and comfort with mathematical writing (e.g., mathematical arguments, derivations, and proofs). This is a theory course targeted at PhD students, so you will be expected to understand and produce mathematical arguments and proofs. It is highly recommended that you have taken Graduate Algorithms (ELEC_ENG/COMP_ENG 495/496), Computational Learning Theory (ELEC_ENG/COMP_ENG 395), or a similar course. Undergraduate and master's students should obtain permission from the course instructor before registering for the course.

Description

This course is an advanced graduate-level seminar on selected topics in theoretical machine learning. It is ideal for graduate students and senior undergraduates who are theoretically inclined and want to learn more about research challenges in the field of machine learning. The goal of the course is to learn tools and techniques for designing algorithms with provable guarantees for basic tasks in unsupervised learning. Topics include learning probabilistic models, high-dimensional data analysis, clustering, mixture models, matrix completion, and dictionary learning; tools include spectral methods, tensor decompositions, convex relaxations, and non-convex optimization.
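
To give a flavor of the spectral methods and mixture models mentioned above, here is a minimal illustrative sketch (not part of the course materials, and all parameter choices are hypothetical): it separates a two-component Gaussian mixture in high dimension by projecting the data onto its top singular direction and clustering the one-dimensional projection.

# Illustrative sketch: spectral projection for a two-component Gaussian mixture.
# All names and parameter values here are illustrative, not course material.
import numpy as np

rng = np.random.default_rng(0)
d, n = 100, 500                       # ambient dimension, samples per component
mu = np.zeros(d)
mu[0] = 4.0                           # the two component means differ along one direction

# Draw n samples each from N(+mu, I) and N(-mu, I) and stack them
X = np.vstack([rng.normal(size=(n, d)) + mu,
               rng.normal(size=(n, d)) - mu])

# Spectral step: project the centered data onto its top singular direction
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[0]                     # 1-D projection along the top direction

# In this projection the components are well separated; cluster by sign
labels = (proj > 0).astype(int)
true = np.array([0] * n + [1] * n)
acc = max((labels == true).mean(), (labels != true).mean())   # handle sign flip
print(f"clustering accuracy after spectral projection: {acc:.3f}")

The point of the sketch is only that projecting onto a low-dimensional spectral subspace can reveal cluster structure that is hard to see coordinate-by-coordinate in high dimensions; the course studies when and why such guarantees hold.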

INSTRUCTOR: Prof. Aravindan Vijayaraghavan

READINGS AND WORK:

Students in this seminar will read and present research papers on topics covered in this course. Readings will be assigned from notes, books, and research papers available on the web.

SIMILAR COURSES:

Ankur Moitra's course at MIT: http://people.csail.mit.edu/moitra/408.html
Daniel Hsu's course at Columbia: http://www.cs.columbia.edu/~djhsu/coms4772-f16/about.html