Undergraduate / Computer Science Major (BS/BA) / CS Core, Breadth, and Depth Requirements
CS Depth: Theory
The courses below fulfill the Depth: Theory requirement in computer science.
This course gives an introduction to the mathematical foundations of computation. The course will look at Turing machines, universal computation, the Church-Turing thesis, the halting problem and general undecidability, Rice’s theorem, the recursion theorem, efficient computation models, time and space (memory) bounds, deterministic and nondeterministic computation and their relationships, the P versus NP problem, and hard problems for NP and beyond. Note: This course will replace Math 374 (Theory of Computability and Turing Machines), which is listed as a recommended way to fulfill the undergraduate theory breadth requirement in CS but hasn’t been taught in several years. The Math department is happy to give it up.
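As a concrete illustration of the basic model of computation named above, here is a minimal sketch of a single-tape Turing machine simulator. The binary-increment machine, its state names, and the encoding of transitions are all illustrative choices, not taken from any course material.

```python
# A minimal Turing-machine simulator. The machine encoding (a dict of
# transitions) and the example machine below are illustrative.

def run_tm(transitions, accept, tape, state="q0", blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, write, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    Returns the final tape contents as a string, blanks stripped.
    """
    tape = dict(enumerate(tape))  # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        sym = tape.get(head, blank)
        state, write, move = transitions[(state, sym)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example machine: binary increment. Scan right to the end of the input,
# then carry leftward, turning trailing 1s into 0s until a 0 (or a blank,
# on overflow) can be flipped to 1.
INC = {
    ("q0", "0"): ("q0", "0", +1),   # scan right over the input
    ("q0", "1"): ("q0", "1", +1),
    ("q0", "_"): ("q1", "_", -1),   # hit the blank: start the carry
    ("q1", "1"): ("q1", "0", -1),   # 1 + carry = 0, carry continues
    ("q1", "0"): ("qa", "1", 0),    # 0 + carry = 1, halt
    ("q1", "_"): ("qa", "1", 0),    # overflow: write a new leading 1
}

print(run_tm(INC, "qa", "1011"))  # → 1100  (11 + 1 = 12 in binary)
```

The same simulator runs any machine given as a transition table, which is the intuition behind universal computation: a fixed program that simulates an arbitrary encoded machine.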
Algorithm design and analysis is fundamental to all areas of computer science and gives a rigorous framework for the study of optimization. This course provides an introduction to algorithm design through a survey of the common algorithm design paradigms of greedy optimization, divide and conquer, dynamic programming, network flows, reductions, and randomized algorithms. Important themes that will be developed in the course include the algorithmic abstraction-design-analysis process and computational tractability (e.g., NP-completeness).
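As a small taste of one of these paradigms, here is a sketch of dynamic programming applied to the classic rod-cutting problem; the problem choice and the price table are illustrative, not taken from the course syllabus.

```python
# Dynamic programming sketch: rod cutting. Given a price for each piece
# length, find the maximum revenue obtainable by cutting a rod of length n.
# Subproblem: best[k] = maximum revenue for a rod of length k, built up
# from smaller lengths (the hallmark of dynamic programming).

def rod_cut(prices, n):
    """prices[c] is the price of a piece of length c (prices[0] unused)."""
    best = [0] * (n + 1)
    for k in range(1, n + 1):
        # Try every length c for the first cut; the rest is a solved subproblem.
        best[k] = max(prices[c] + best[k - c]
                      for c in range(1, min(k, len(prices) - 1) + 1))
    return best[n]

# Illustrative prices: length 1 -> 1, 2 -> 5, 3 -> 8, 4 -> 9.
print(rod_cut([0, 1, 5, 8, 9], 4))  # → 10  (cut into two pieces of length 2)
```

The table-filling loop runs in O(n²) time, versus the exponential blowup of naively enumerating all ways to cut the rod.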
After a brief introduction to numerical computation issues, the course will continue with a sequence of canonical problem settings (e.g., Intersections; Arrangements/Duality), mostly focusing on the combinatorial aspects of the algorithms and the impact of the data structures. Each part will be cast in a respective application setting (GIS; Motion Planning; etc.). The last part of the course will present several potpourri-like topics, e.g., Skeletons/Medial Axis; Davenport-Schinzel sequences.
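A building block underlying many of these intersection algorithms is the orientation predicate. The sketch below, with illustrative names, uses it for a segment-intersection test under a general-position assumption (no collinear triples); it is a simplification, not the course's treatment.

```python
# Orientation predicate and a proper segment-intersection test, the
# combinatorial core of many computational-geometry algorithms. Assumes
# general position (no three of the four endpoints are collinear).

def orient(p, q, r):
    """Sign of the cross product (q-p) x (r-p):
    +1 for a left turn, -1 for a right turn, 0 if collinear."""
    val = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (val > 0) - (val < 0)

def segments_intersect(a, b, c, d):
    """Do segments ab and cd properly cross? They do exactly when each
    segment's endpoints lie on opposite sides of the other segment."""
    return (orient(a, b, c) != orient(a, b, d) and
            orient(c, d, a) != orient(c, d, b))

print(segments_intersect((0, 0), (2, 2), (0, 2), (2, 0)))  # → True (diagonals cross)
```

With integer coordinates the predicate is exact; with floating-point inputs its sign can be wrong near degeneracy, which is precisely the kind of numerical-computation issue the course opens with.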
This course will introduce some of the central topics in computational learning theory, a field that approaches the question "whether machines can learn" from the perspective of theoretical computer science. We will study well-defined, rigorous mathematical models of learning in which it is possible to give precise analyses of learning problems and algorithms. A major focus of the course will be the computational efficiency of learning in these models. We will develop some provably efficient algorithms and explain why such provable guarantees are unlikely for other models. We will cover only topics that permit a mathematically rigorous analysis, and hence mathematical maturity is absolutely necessary. In particular, some familiarity with basic probability (such as linearity of expectation) and basic linear algebra will be necessary. Also, the emphasis of the course will be on proofs, so if you are in this course, you should enjoy proofs and math.
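To give a flavor of a provably efficient learner, here is a sketch of the textbook tightest-fit algorithm for learning an interval on the real line from labeled examples; the target interval, sample size, and all names are illustrative, not from the course.

```python
import random

# Tightest-fit learner for the concept class of intervals [a, b] on the
# line: output the smallest interval containing all positive examples.
# This toy learner is efficient and, in the PAC model, provably needs
# only a modest number of samples to achieve low error.

def learn_interval(samples):
    """samples: list of (x, label) pairs. Return the hypothesis (lo, hi)."""
    pos = [x for x, label in samples if label]
    if not pos:
        return (0.0, 0.0)  # no positives seen: an (arbitrary) empty hypothesis
    return (min(pos), max(pos))

random.seed(0)                         # fixed seed so the sketch is repeatable
a, b = 0.3, 0.7                        # the "unknown" target concept
draws = [random.random() for _ in range(500)]
samples = [(x, a <= x <= b) for x in draws]
lo, hi = learn_interval(samples)
print((round(lo, 3), round(hi, 3)))    # close to (0.3, 0.7), never outside it
```

The hypothesis only shrinks the target, so its error is the probability mass of the two uncovered fringes, and that mass drops quickly as the sample grows; making that statement precise and quantitative is exactly what a PAC-style analysis does.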
Information measures and their properties: entropy, divergence, mutual information, channel capacity. Shannon's fundamental theorems for data compression and coding for noisy channels. Applications in communications, statistical inference, probability, and physics. Prerequisites by course: EECS 302 (Probabilistic Systems and Random Signals). Prerequisites by topic: a good understanding of basic probability. (A review of probability theory will be given in Week 1.) This course fulfills the Theory Depth requirement.
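The first two quantities above can be sketched in a few lines; the function names and the binary-symmetric-channel example are illustrative choices, not course material.

```python
import math

# Shannon entropy and the capacity of a binary symmetric channel (BSC),
# two of the information measures listed above.

def entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), in bits. Terms with p(x)=0 contribute 0."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def bsc_capacity(eps):
    """Capacity of a BSC with crossover probability eps: C = 1 - H(eps) bits/use."""
    return 1 - entropy([eps, 1 - eps])

print(entropy([0.5, 0.5]))   # → 1.0: a fair coin carries exactly one bit
print(bsc_capacity(0.5))     # → 0.0: a channel that flips bits at random is useless
```

The channel-coding theorem says this capacity is the sharp threshold: rates below C are achievable with vanishing error probability, and rates above C are not.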
Introduction to advanced topics in the synthesis and modeling of complex VLSI systems at the behavioral and logic levels. Topics include resource allocation, resource binding, scheduling, and controller design in high-level synthesis; C-to-hardware compilation flows; logic synthesis; and a survey of the state of the art in high-level and system-level design methods and tools.
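As a toy illustration of the scheduling step in high-level synthesis, here is a sketch of ASAP (as-soon-as-possible) scheduling of a dataflow graph, assuming unlimited resources and unit-latency operations; the graph encoding and names are illustrative, not from the course.

```python
# ASAP scheduling sketch: assign each operation the earliest control step
# at which all of its predecessors have completed. Assumes every operation
# takes one control step and resources are unconstrained.

def asap_schedule(deps):
    """deps: op -> list of predecessor ops (a DAG). Returns op -> control step."""
    step = {}
    def cstep(op):
        if op not in step:
            # An op with no predecessors runs at step 0; otherwise one step
            # after its latest predecessor finishes.
            step[op] = 0 if not deps[op] else 1 + max(cstep(p) for p in deps[op])
        return step[op]
    for op in deps:
        cstep(op)
    return step

# y = (a*b) + (c*d): both multiplies can run in step 0, the add in step 1.
print(asap_schedule({"m1": [], "m2": [], "add": ["m1", "m2"]}))
```

An actual high-level synthesis flow layers resource constraints and binding on top of this: with only one multiplier available, for example, m1 and m2 could no longer share step 0, which is what list scheduling and its relatives address.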