Deep Learning Theory (CS 540).

Essential info.

Schedule.

The schedule will be continuously updated.

Lecture notes (html, pdf) are also continuously updated: all material we have not yet covered is subject to major revision!

Date Topic Assignments


8/24
Course introduction.
Approximation: intro.
Notes section 1, tablet notes 1.


8/26
Folklore multivariate approximation;
classical universal approximation.
Notes section 2, tablet notes 2.


8/31
Classical universal approximation;
infinite-width approximation and Barron norms.
Notes sections 2-3, tablet notes 3.


(8/31) hw1 tex, pdf.

9/2
Infinite-width approximation and Barron norms.
Notes section 3, tablet notes 4.


9/7
9/9
9/14
Approximation near initialization and the NTK.
Notes section 4,
tablet notes 5,
tablet notes 6,
tablet notes 7.


9/14
9/16
9/21
Benefits of depth:
approximation lower bounds; Sobolev ball approximation.
Notes section 5,
tablet notes 7,
tablet notes 8,
tablet notes 9.


(9/22) hw1 due.

9/23
9/28
9/30
Optimization: intro;
semi-classical convex optimization.
Notes sections 6-7,
tablet notes 10,
tablet notes 11,
tablet notes 12.
(9/22) hw2 tex, pdf.


9/30
10/5
10/7
Strong-convexity-style NTK optimization proof.
Notes section 8.1,
tablet notes 12,
tablet notes 13,
tablet notes 14.


10/12
10/14
10/19
10/21
Nonsmoothness, positive homogeneity, and margin maximization.
Notes sections 9-10,
tablet notes 15,
tablet notes 16.


(10/20) hw2 due / hw3 out.

10/26
10/28
11/2
11/4
11/9
11/11
11/16
11/18
Generalization: VC, Rademacher, and covering number bounds for deep networks. (Interpolation?)
(?/?) project info released?


(11/17) hw3 due / hw4 out.

11/30
12/2
12/7
Extra topics? Interpolation? Kolmogorov-Arnold? Smoothness-based NTK analysis? Requests?
(?/?) project due?


(12/15) hw4 due.

Homework policies.

Project information.

This will be a paper-reading project, due near the end of the semester and also worth 20% of the total course grade. You'll probably have two hand-ins: something preliminary so we can discuss which paper you'll cover, and then a succinct (2-3 page) writeup at the end. Details will be posted before the end of October.

Further readings and resources.

(These are out of date; please feel free to ping me to update them.)

Learning theory classes (not specifically deep learning).

Textbooks and surveys. Again, there are many others, but here are a few key ones.