EECS 496: Seminar in Statistical Language Modeling

Quarter Offered

Winter: 1-3:30 Tu; Downey

Prerequisites

EECS 349 or EECS 474 or permission of the instructor.

Description

Statistical language models assign probabilities to sequences of words, and are used in systems that perform speech recognition, machine translation, and many other tasks. In recent years, language models based on deep neural networks have dramatically improved the state of the art. This course will cover the fundamental techniques that underlie statistical language models, such as word embedding methods and recurrent neural networks, as well as applications of these techniques to important tasks in artificial intelligence. Students will be required to read and present research papers, and to complete a substantial course project.
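To make the core idea concrete, here is a minimal sketch of a count-based bigram language model, the simplest instance of "assigning probabilities to sequences of words." The toy corpus and token names are invented for illustration; course material will focus on neural models, which replace these counts with learned parameters.

```python
from collections import defaultdict

# Toy corpus (invented for this sketch).
corpus = [
    ["the", "cat", "sat"],
    ["the", "cat", "ran"],
    ["the", "dog", "sat"],
]

# Count bigrams (prev, word) and how often each context word appears,
# using "<s>" as a sentence-start symbol.
bigram_counts = defaultdict(int)
context_counts = defaultdict(int)
for sentence in corpus:
    tokens = ["<s>"] + sentence
    for prev, word in zip(tokens, tokens[1:]):
        bigram_counts[(prev, word)] += 1
        context_counts[prev] += 1

def sentence_probability(sentence):
    """Score a sentence as the product of maximum-likelihood
    bigram probabilities P(word | prev) = count(prev, word) / count(prev)."""
    prob = 1.0
    tokens = ["<s>"] + sentence
    for prev, word in zip(tokens, tokens[1:]):
        if context_counts[prev] == 0:
            return 0.0  # unseen context: no probability mass without smoothing
        prob *= bigram_counts[(prev, word)] / context_counts[prev]
    return prob

print(sentence_probability(["the", "cat", "sat"]))
```

Maximum-likelihood counts assign zero probability to any unseen bigram; handling that sparsity (via smoothing, and ultimately via the neural models this course covers) is one motivation for the techniques listed below.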

TEXTBOOK: None (the course material will consist of research papers in the field).

COURSE COORDINATOR: Prof. Doug Downey

COURSE GOALS: The goal of this course is to familiarize graduate students and advanced undergraduates with the current state of the art in statistical language modeling. Students will read recently published papers in the field.

DETAILED COURSE TOPICS:

Recurrent neural networks, long short-term memory (LSTM) networks, word embeddings, language generation, image captioning.

ASSIGNMENTS:

  • Leading a paper discussion (30%)
  • Class participation (20%)
  • Project (50%)

COURSE OBJECTIVES: When students complete this course, they should:

  • Have a general understanding of the current state of the art in statistical language models.
  • Understand how at least one statistical language model is implemented and can be applied (via the course project).
  • Be able to understand, and think critically about, recent research papers in the field of statistical language modeling.