CS 3750  Advanced Topics in Machine Learning (ISSP 3535)


Time:  Monday, Wednesday 4:00-5:20pm 
Location: Sennott Square, Room 5313


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos@cs.pitt.edu
office hours: Tuesday 2:30-4:00pm, Wednesday 11:00am-12:00pm


Announcements



Links

Course description
Lectures
Readings
Paper presentations
Projects



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, and the design of adaptive web agents.

The objective of the Advanced Machine Learning course is to expand on the material covered in the introductory Machine Learning course (CS2750) and to focus on the most recent advances in the ML field, such as kernel and variational methods. The course will consist of a mix of lectures, paper presentations, and discussions. Students will be evaluated based on their participation in discussions, paper presentations, and projects.

Prerequisites

CS 2750 Machine Learning, or the permission of the instructor.

Syllabus



Readings:

Readings for different topics

The primary book for the course is:

Other books you may find useful:

Lectures
 
 
Lectures  Topic(s) 
August 25 Administrivia and Course Overview.
Syllabus of the course

Readings:

  • Lecture notes from CS2750
August 27 Review of density estimation

Readings:

  • Lecture notes from CS2750
September 3 Review of density estimation and exponential family of distributions

Readings:

  • Lecture notes from CS2750
  • M. Jordan. Exponential family. In Graphical models. Chapter 7.
September 8 Multivariate normal distribution.

Readings:

  • Lecture notes from CS2750.
  • M. Jordan. The Multivariate Gaussian. In Graphical models. Chapter 12.
September 10 Subset Selection and Regularization. (Brano Kveton)

Readings:

  • Hastie, Tibshirani and Friedman. Elements of statistical learning. Sections 3.4, 4.3.1, 5.6, and 5.8.
September 15 PCA and SVD. (Mark Fenner)

Readings:

  • Lecture notes from CS2750
  • Hastie, Tibshirani and Friedman. Elements of statistical learning. Section 14.5.

Applications:
September 17 Overview of Bayesian belief networks (BBNs). (Milos)

Readings:

  • Lecture notes from CS2750
  • TBA
September 22 BBN Inference: Junction tree algorithm. (Mihai Rotaru)

Readings:

September 24 BBN Inference: Pearl algorithm. (Tomas Singliar)

Readings:

  • Distributed during the class.
September 29 BBN Inference: Introduction to Monte Carlo methods. (Changhe Yuan)

Readings (general intro to Monte Carlo):

October 1 BBN Inference: Monte Carlo methods (cont.).

Readings (general intro to Monte Carlo):

Monte Carlo for BBNs:

  • A brief summary of existing Importance Sampling Algorithms for Bayesian Networks
October 6 Learning probabilistic networks from data (Milos)

Readings:

October 8 Expectation-maximization (EM) (Milos)

Readings:

Midterm project

October 13 Latent variable models: Probabilistic PCA (Vahan)

Readings:

October 15 Latent variable models: Variational approximations and learning. (Milos)

Readings (introduction to variational approximations):

October 20 Latent variable models: Variational learning. (Milos)

Readings:

October 22 Latent variable models: Variational Bayesian learning. (Milos)

Readings:

October 27 Latent variable models: Variational Bayesian learning. (Milos)

Readings:

October 29 Structural EM. (Richard)

Readings:

November 3 Midterm projects
November 5 Support Vector Machines (Milos)

Readings:

November 10 Support Vector Machines for Regression (Min)

Readings:

November 12 Kernel trick for distances. (Brano)

Readings:

November 17 Kernel PCA, ICA (Oleg)

Readings:

November 19 Kernels for Structures (Tomas)

Readings:

November 24 Ensemble methods: Bagging and Boosting (Milos)

Readings:

December 1 Ensemble methods: Boosting the margin (Oleg and Min)

Readings:

December 3 Multinomial PCA (Changhe and Mihai)

Readings:

December 5 Final projects

Readings:


Course webpage for CS2750, the introductory Machine Learning course from Spring 2003. It is the prerequisite of CS3750.



Readings

Readings will be assigned before the class at which the discussion of the topic covered by the paper takes place. Most of the readings will be electronic; however, some readings will be available only in paper form or from books. See the list of Readings for different topics



Paper discussions

Each student is expected to present 1-2 papers during the semester and to lead the discussion of each paper. The papers will be distributed to students in the same way as other readings. The assignment of papers will be discussed in the first week of the course.



Projects

There are no homework assignments or exams in this course. Instead, students will be asked to prepare, submit, and present two projects. The first project will be assigned and due in the middle of the semester. The final project (due at the end of the semester) is more flexible: students can choose their own topic to investigate. You will need to submit a short (one-page) proposal for the final project for approval and feedback. The final project must have a distinctive and non-trivial learning or adaptive component.



Last updated by milos on 08/13/2003