PENN CIS 620, FALL 2008: COMPUTATIONAL LEARNING THEORY
Prof. Michael Kearns and Dr. Koby Crammer
mkearns@cis.upenn.edu, crammer@cis.upenn.edu
Tuesdays 12-3 PM (first meeting Sep 9)
315 Levine Hall (note change of room)
URL for this page:
www.cis.upenn.edu/~mkearns/teaching/COLT
COURSE DESCRIPTION
This course is an introduction to Computational Learning Theory, a field that seeks to provide algorithmic, complexity-theoretic, and statistical foundations for modern machine learning.
The first part of the course will closely follow portions of An Introduction to Computational Learning Theory, by M. Kearns and U. Vazirani (MIT Press). We will cover perhaps 6 or 7 of the chapters in K&V over (approximately) the first half of the course, often supplementing with additional readings and materials. Copies of K&V will be available at the Penn bookstore. The second portion of the course will focus on a number of more recent but still fundamental topics in learning theory not covered in K&V.
The course will give a broad overview of the kinds of problems and techniques typically studied in Computational Learning Theory, and provide a basic arsenal of powerful mathematical tools for analyzing machine learning problems.
Topics to be covered include: the Probably Approximately Correct (PAC) model; Occam's Razor and sample complexity; the VC dimension and Structural Risk Minimization; Rademacher complexity and PAC-Bayes bounds; learning with noise and statistical query (SQ) learning; boosting; cryptographic hardness results and learning with membership queries; online and no-regret learning; and learning-theoretic methods in reinforcement learning.
COURSE FORMAT, REQUIREMENTS, AND PREREQUISITES
Much of the course will be in fairly traditional "chalk talk" lecture format, but with ample opportunity for discussion, participation, and critique. The course will meet once a week on Tuesdays from 12 to 3. Lunch will be served.
There are no official prerequisites, but participants should be comfortable with the basics of the design and analysis of algorithms, computational complexity, statistics and probability theory. The level of the material is such that the typical graduate student in computer science or statistics would have an appropriate background; strong undergraduates are also encouraged to join. Auditors and occasional participants are welcome.
The course requirements for registered students are active in-class participation, an occasional problem set, possibly leading a class discussion, and a final project.
DETAILED MEETING/TOPIC SCHEDULE
The exact timing and set of topics below will depend on our progress and will be updated as we proceed. "[MK]" indicates a lecture by Prof. Kearns; "[KC]" a lecture by Dr. Crammer.

Tue Sep 9
[MK] In the first meeting, we will go over course mechanics and present a course overview, then immediately begin investigating the Probably Approximately Correct (PAC) model of learning.
READING: K&V Chapter 1.
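For reference (following the notation of K&V Chapter 1), the PAC criterion can be stated compactly: an algorithm PAC-learns a concept class $C$ if, for every target $c \in C$, every distribution $D$, and every $\epsilon, \delta > 0$, after seeing polynomially many labeled examples drawn i.i.d. from $D$ it outputs a hypothesis $h$ such that

$$\Pr\big[\,\mathrm{err}_D(h) \le \epsilon\,\big] \ge 1 - \delta, \qquad \mathrm{err}_D(h) = \Pr_{x \sim D}\big[h(x) \ne c(x)\big].$$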
Tue Sep 16
[MK] Introduction to PAC learning, continued; Occam's Razor, sample complexity, and learning curves.
READING: K&V Chapters 1 and 2.
HOMEWORK #1, DUE AS HARDCOPY IN CLASS SEP 23: Solve the following problems from K&V: 1.1, 1.3, 2.3, 2.4. You are free to collaborate on the homework assignments, but please turn in separate write-ups and acknowledge your collaborations. Please do not attempt to look up solutions in the literature.
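A worked bound connecting the Chapter 2 reading to the homework: for a finite hypothesis class $H$, any algorithm that outputs a hypothesis consistent with the sample is a PAC learner once

$$m \ge \frac{1}{\epsilon}\left(\ln |H| + \ln \frac{1}{\delta}\right),$$

since each fixed hypothesis with error greater than $\epsilon$ survives $m$ i.i.d. examples with probability at most $(1-\epsilon)^m \le e^{-\epsilon m}$, and a union bound over $H$ drives the total failure probability below $\delta$.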
Tue Sep 23
[MK] Occam's Razor, continued; VC dimension.
READING: K&V Chapters 2 and 3.
Tue Sep 30
[MK] VC dimension, continued; Structural Risk Minimization; generalizations.
READING: K&V Chapter 3.
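For infinite classes the $\ln|H|$ term above no longer makes sense; Chapter 3 replaces it with the VC dimension $d$ of the class, yielding a sample complexity for consistent learning of

$$m = O\!\left(\frac{1}{\epsilon}\left(d \ln \frac{1}{\epsilon} + \ln \frac{1}{\delta}\right)\right).$$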
Tue Oct 7
[KC] Better generalization bounds: Rademacher complexity, PAC-Bayes.
READING: Supplementary readings.
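As a preview of this lecture's first tool: for a class $F$ of functions into $[0,1]$ and a sample $S = (x_1, \ldots, x_m)$, the empirical Rademacher complexity is

$$\hat{R}_S(F) = \mathbb{E}_{\sigma}\left[\sup_{f \in F} \frac{1}{m} \sum_{i=1}^{m} \sigma_i f(x_i)\right],$$

where the $\sigma_i$ are independent uniform $\pm 1$ signs; the associated uniform-convergence bound says that with probability at least $1-\delta$, every $f \in F$ has true mean at most its empirical mean plus $2\hat{R}_S(F) + O(\sqrt{\ln(1/\delta)/m})$.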
Tue Oct 14
Fall break, no class.
Tue Oct 21
[MK] Learning with noise and errors; SQ learning.
READING: K&V Chapter 5; supplementary readings.
HOMEWORK #2, DUE AS HARDCOPY IN CLASS NOV 7: Solve the following problems from K&V: 3.1, 3.5, 5.1, 5.3. You are free to collaborate on the homework assignments, but please turn in separate write-ups and acknowledge your collaborations. Please do not attempt to look up solutions in the literature.
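To make this week's SQ model concrete, here is a minimal Python sketch (my own illustration; the function names and the conjunction target are hypothetical, not from K&V) of how the statistical query oracle STAT(c, D) can be simulated from random examples: the learner submits a predicate chi(x, label) and a tolerance tau, and receives the expectation of chi to within tau.

    import math
    import random

    def simulate_stat_oracle(draw_example, chi, tau, delta=0.01):
        # Simulate STAT(c, D): estimate E[chi(x, c(x))] to additive accuracy tau.
        # By Hoeffding's inequality, ln(2/delta) / (2 tau^2) i.i.d. samples
        # suffice with probability >= 1 - delta.
        m = math.ceil(math.log(2.0 / delta) / (2.0 * tau ** 2))
        return sum(chi(*draw_example()) for _ in range(m)) / m

    # Hypothetical target: the conjunction x1 AND x2 over uniform 3-bit inputs.
    def draw_example():
        x = [random.randint(0, 1) for _ in range(3)]
        return x, x[0] and x[1]

    # Query: probability that x1 = 1 and the label is positive (true value 1/4).
    print(simulate_stat_oracle(draw_example,
                               lambda x, y: float(x[0] == 1 and y == 1),
                               tau=0.05))

This simulation is what makes SQ algorithms automatically tolerant of classification noise: the oracle's answers depend only on expectations, which can still be estimated from noisy samples.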
Tue Oct 28
[KC] Boosting.
READING: K&V Chapter 4; supplementary readings.
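Since boosting is the topic, a minimal sketch of AdaBoost with decision stumps may be useful orientation (illustrative code of my own, not the course's reference implementation). The exhaustive stump search plays the role of the weak learner; the exponential reweighting concentrates the distribution on the examples the current ensemble gets wrong.

    import numpy as np

    def adaboost_stumps(X, y, rounds=20):
        """AdaBoost with exhaustively searched decision stumps; labels y in {-1, +1}."""
        n, d = X.shape
        weights = np.full(n, 1.0 / n)   # distribution D_t over training examples
        ensemble = []                   # list of (alpha, feature, threshold, sign)
        for _ in range(rounds):
            best = None
            for j in range(d):
                for thr in np.unique(X[:, j]):
                    for sign in (1, -1):
                        pred = sign * np.where(X[:, j] > thr, 1, -1)
                        err = weights[pred != y].sum()
                        if best is None or err < best[0]:
                            best = (err, j, thr, sign)
            err, j, thr, sign = best
            err = min(max(err, 1e-12), 1 - 1e-12)   # numerical guard
            alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak hypothesis
            pred = sign * np.where(X[:, j] > thr, 1, -1)
            weights *= np.exp(-alpha * y * pred)    # upweight the mistakes
            weights /= weights.sum()
            ensemble.append((alpha, j, thr, sign))
        return ensemble

    def predict(ensemble, X):
        votes = sum(a * s * np.where(X[:, j] > t, 1, -1) for a, j, t, s in ensemble)
        return np.sign(votes)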
CHANGE OF DATE: Tue Nov 4 ===> Fri Nov 7
[MK] In honor of Election Day, this week's class has been MOVED from Tuesday to Friday, Nov 7, from 12-2 PM in Room 337 Moore (we only have the room for two hours). We will cover a complementary pair of results, first showing that finite state automata cannot be learned in the PAC model (under standard cryptographic assumptions), then showing that they can be learned if the model is augmented with membership queries. HW2 will be due in class Nov 7.
READING: K&V Chapters 6, 7, and 8.
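To illustrate the two access models contrasted in this lecture: in the PAC model the learner only sees random labeled examples, while a membership query lets it ask the label of any string of its own choosing. A minimal Python sketch of the query interface (the parity automaton below is a hypothetical example of my own):

    # A deterministic finite automaton over the alphabet {'0', '1'}.
    class DFA:
        def __init__(self, transitions, start, accepting):
            self.transitions = transitions  # dict: (state, symbol) -> state
            self.start = start
            self.accepting = accepting      # set of accepting states

        def member(self, string):
            # Membership query: does the target automaton accept this string?
            state = self.start
            for symbol in string:
                state = self.transitions[(state, symbol)]
            return state in self.accepting

    # Hypothetical 2-state target accepting strings with an odd number of '1's.
    parity = DFA(
        transitions={('even', '0'): 'even', ('even', '1'): 'odd',
                     ('odd', '0'): 'odd', ('odd', '1'): 'even'},
        start='even',
        accepting={'odd'},
    )
    print(parity.member('10110'))  # True: three 1's

In Angluin's query-learning framework (the basis of the K&V Chapter 8 material), the learner combines such membership queries with equivalence queries to reconstruct the target automaton.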
Tue Nov 11
[KC] Online and no-regret learning.
READING: Supplementary readings.
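A minimal sketch of the multiplicative-weights (Hedge) algorithm, the canonical no-regret method (illustrative code of my own): it keeps one weight per expert and exponentially downweights each expert in proportion to its losses.

    import math

    def hedge(loss_rounds, eta):
        """Hedge / multiplicative weights. loss_rounds: T lists of N per-expert
        losses in [0, 1]. Returns the algorithm's cumulative expected loss."""
        n = len(loss_rounds[0])
        weights = [1.0] * n
        total = 0.0
        for losses in loss_rounds:
            z = sum(weights)
            probs = [w / z for w in weights]   # play expert i with probability probs[i]
            total += sum(p * loss for p, loss in zip(probs, losses))
            weights = [w * math.exp(-eta * loss) for w, loss in zip(weights, losses)]
        return total

    # With eta on the order of sqrt(ln(N) / T), the cumulative loss exceeds
    # that of the best single expert by at most O(sqrt(T ln N)).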
INFORMATION ON COURSE PROJECTS: For those of you registered for credit in the course, your final assignment will be a course project, for which you have three choices. The deadline for final projects is December 20, via email to Prof. Kearns.
Tue Nov 18
[KC] Online and no-regret learning, continued.
READING: Supplementary readings.
Tue Nov 25
[MK] COLT methods in reinforcement learning.
READING: Supplementary readings.
Tue Dec 2
[MK] Topic TBD.
READING: Supplementary readings.