PENN CIS 6250, FALL 2024: THEORY OF MACHINE LEARNING
Prof. Michael Kearns
mkearns@cis.upenn.edu
Time:
Tuesdays and Thursdays 10:15-11:45
Location: MacNeil 286-7
Attendance at lectures is a course requirement.
MK Office Hours: Right after class on Thursdays or by appointment. Weather permitting, I'll hold OHs
in the outdoor seating area near MacNeil, otherwise in one of my offices. Please let me know in advance
if you intend to come to OHs.
Teaching Assistants:
Dominik Kau:
dominikk@seas.upenn.edu
Office Hours:
Fridays, 1-2:30 pm in Levine 501 bump space
James Wang:
jwang541@seas.upenn.edu
Office Hours:
Tuesdays, 2-4 pm in Levine 612
URL for this page:
www.cis.upenn.edu/~mkearns/teaching/CIS625
Previous incarnations of this course:
www.cis.upenn.edu/~mkearns/teaching/CIS625/cis6250-23.html
www.cis.upenn.edu/~mkearns/teaching/CIS625/cis6250-22.html
www.cis.upenn.edu/~mkearns/teaching/CIS625/cis625-21.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt18.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt17.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt16.html
www.cis.upenn.edu/~mkearns/teaching/COLT/colt15.html (with Grigory Yaroslavtsev)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt12.html (with Jake Abernethy)
www.cis.upenn.edu/~mkearns/teaching/COLT/colt08.html (with Koby Crammer)
COURSE DESCRIPTION
This course is an introduction to the theory of machine learning, and provides mathematical, algorithmic, complexity-theoretic, and probabilistic/statistical foundations for modern machine learning and related topics.
The first part of the course will follow portions of An Introduction to Computational Learning Theory, by M. Kearns and U. Vazirani (MIT Press). We will cover perhaps 6 or 7 of the chapters in K&V over (approximately) the first half of the course, often supplementing with additional readings and materials. I will provide electronic versions of the relevant chapters of K&V.
The second portion of the course will focus on a number of models and topics in learning theory and related areas not covered in K&V.
The course will give a broad overview of the kinds of theoretical problems and techniques typically studied and used in machine learning, and provide a basic arsenal of powerful mathematical tools for analyzing machine learning problems.
Topics likely to be covered are listed in the meeting/topic schedule below.
COURSE FORMAT, REQUIREMENTS, AND PREREQUISITES
Lectures will be a mixture of "chalk talk" format and markup of lecture notes, with ample opportunity for discussion, participation, and critique. The course will meet Tuesdays and Thursdays from 10:15 to 11:45, with the first meeting on Tuesday August 27.
The course will be taught at the doctoral level and will involve advanced mathematical material, including detailed formal proofs throughout. While there are no specific formal prerequisites, background or courses in algorithms, complexity theory, discrete math, combinatorics, convex optimization, probability theory, and statistics will prove helpful, as will "mathematical maturity" in general. If you have questions about the desired background, please ask. Auditors and occasional participants are welcome.
The course requirements for registered students will be a mixture of active in-class participation, problem sets, possibly leading a class discussion, and a final project. Final projects can range from original research, to a literature survey, to solving additional problems.
MEETING/TOPIC SCHEDULE
The exact timing and set of topics below will depend on our progress and will be updated
as we proceed.
Module 1: Course overview, topics, and mechanics. Analysis of the rectangle learning problem. (A small illustrative code sketch of the rectangle learner follows this schedule.)
Module 2: Introduction of the PAC learning model. PAC learning conjunctions. Intractability of PAC learning 3-term DNF. PAC learning 3-term DNF by 3CNF. (A small illustrative sketch of the conjunction learner also follows this schedule.)
Module 3: Consistency and PAC learning in the finite H case. Consistency and compression. Consistency and PAC learning in the infinite H case. The VC dimension. Uniform convergence.
Reading: "Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications", D. Haussler, 1992.
Module 4: Boosting.
Module 5: PAC learning with Classification Noise (CN); the Statistical Query (SQ) model.
Module 6: No-Regret Learning and Game Theory.
Module 7: Fairness in Machine Learning.
Module 8: Differential Privacy and Machine Learning.
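To give a concrete feel for the first analysis in the course, here is a minimal illustrative sketch (in Python, not part of the course materials) of the "tightest-fit" learner for axis-aligned rectangles studied in Module 1; the target rectangle, sample sizes, and all identifiers below are hypothetical choices made only for this example.

    import random

    def tightest_fit_rectangle(sample):
        # Smallest axis-aligned rectangle containing every positive point;
        # None represents the empty hypothesis (predict negative everywhere).
        positives = [p for p, label in sample if label]
        if not positives:
            return None
        xs = [x for x, _ in positives]
        ys = [y for _, y in positives]
        return (min(xs), max(xs), min(ys), max(ys))

    def predict(rect, point):
        if rect is None:
            return False
        x, y = point
        xlo, xhi, ylo, yhi = rect
        return xlo <= x <= xhi and ylo <= y <= yhi

    # Hypothetical target rectangle; examples are drawn uniformly from the unit square.
    target = (0.2, 0.7, 0.3, 0.9)
    train = []
    for _ in range(200):
        p = (random.random(), random.random())
        train.append((p, predict(target, p)))
    h = tightest_fit_rectangle(train)

    # Estimate the error of h. Since the tightest fit is always contained in the
    # target, h can only err by predicting negative on points the target labels positive.
    test = [(random.random(), random.random()) for _ in range(1000)]
    err = sum(predict(h, p) != predict(target, p) for p in test) / len(test)
    print(f"estimated error of the tightest-fit hypothesis: {err:.3f}")

The point of Module 1 is the analysis behind such a learner: on the order of (1/epsilon)*log(1/delta) random examples suffice for the tightest fit to have error at most epsilon with probability at least 1 - delta.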
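In the same spirit, here is a minimal hypothetical sketch of the standard elimination algorithm for PAC learning conjunctions from Module 2; the target conjunction and all names are invented for illustration only.

    def learn_conjunction(n, positive_examples):
        # Start with all 2n literals (i, v), read as "x_i == v", and delete every
        # literal falsified by some positive example; the survivors are the hypothesis.
        literals = {(i, v) for i in range(n) for v in (0, 1)}
        for x in positive_examples:
            literals = {(i, v) for (i, v) in literals if x[i] == v}
        return literals

    def evaluate(literals, x):
        return all(x[i] == v for (i, v) in literals)

    # Hypothetical target conjunction x0 AND (NOT x2) over n = 4 Boolean variables.
    positives = [(1, 0, 0, 1), (1, 1, 0, 0), (1, 0, 0, 0)]  # examples the target labels 1
    h = learn_conjunction(4, positives)
    assert all(evaluate(h, x) for x in positives)  # consistent with every positive seen

Because the target's own literals are never deleted, the hypothesis is at least as restrictive as the target, so its only possible mistakes are false negatives; bounding the probability of those mistakes is part of the analysis covered in Module 2.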