CS 3750  Advanced Topics in Machine Learning (ISSP 3535)


Time:  Tuesday, Thursday 9:30 am-10:50 am 
Location: Sennott Square, Room 5313


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos_at_cs_pitt_edu
office hours: by appointment


Announcements !!!!!



Links

Course description
Lectures
Readings by topic
Paper presentations
Projects



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, and the design of adaptive web agents.

The objective of this Advanced Machine Learning course is to expand on the material covered in the introductory Machine Learning course (CS2750) and to focus on the most recent advances in the ML field, such as component analysis, kernel methods, and variational methods. The course will consist of a mix of lectures, paper presentations, and discussions. Students will be evaluated based on their participation in discussions, paper presentations, and projects.

Prerequisites

CS 2750 Machine Learning, or the permission of the instructor.

Syllabus



Readings:

We will use readings from three books during the course:

In addition, we will use conference and journal papers that will be distributed electronically or in hardcopy form. Here is a list of papers sorted by the topics we plan to cover during the course. Not all readings on this list will be used in class; they are provided in case you want to explore some topics in greater depth.

Other (useful) books



Lectures
 
 
Date  Topic(s)
August 28 Course Administrivia
Syllabus of the course

Readings:

  • Lecture notes from CS2750
August 30 Density estimation

Readings:

  • Lecture notes from CS2750
September 4 Density estimation II

Readings:

  • M. Jordan. Exponential family of distributions (chapter 8)
  • M. Jordan. Multivariate Gaussian (chapter 13)
September 6 Bayesian belief networks

Readings:

  • M. Jordan. Conditional independence and factorization (chapter 2)
  • M. Jordan. Elimination algorithms (chapter 3)
  • C. Bishop. Pattern recognition and machine learning. (chapter 8)
September 11 Graphical models. Inference.

Readings:

  • M. Jordan. Conditional independence and factorization (chapter 2)
  • M. Jordan. Elimination algorithms (chapter 3)
  • C. Bishop. Pattern recognition and machine learning. (chapter 8)
September 13 Graphical models. Inference.

Readings:

  • M. Jordan. Conditional independence and factorization (chapter 2)
  • M. Jordan. Elimination algorithms (chapter 3)
  • C. Bishop. Pattern recognition and machine learning. (chapter 8)
September 18 Graphical models. Inference in clique trees.

Readings:

  • D. Koller and N. Friedman. Exact inference: Clique trees. (chapter 8)
  • C. Bishop. Pattern recognition and machine learning. (chapter 8)
September 20 Belief propagation.

Readings:

  • D. Koller and N. Friedman. Exact inference: Clique trees. (chapter 8)
September 25 Approximate inference. Monte Carlo.

Readings:

  • D. Koller and N. Friedman. Particle-based approximate inference. (chapter 9)
September 27 Approximate inference. Monte Carlo.

Readings:

  • D. Koller and N. Friedman. Particle-based approximate inference. (chapter 9)
October 2 Markov chain Monte Carlo.

Readings:

  • D. Koller and N. Friedman. Particle-based approximate inference. (chapter 9)
October 4 Learning Bayesian belief networks.

Readings:

October 9 Learning the structure of Bayesian belief networks.

Readings:

October 11 Expectation-maximization

Readings:

October 16 PCA and SVD (Iyad Batal)

Readings:

October 18 Applications of PCA (Cem Akaya)

Readings:

October 23 Probabilistic PCA (The Minh Luong)

Readings:

October 25 Probabilistic latent semantic analysis. Multinomial PCA. (Shuguang Wang)

Readings:

October 30 Component analysis using factorial models. Variational learning. (Milos)

Readings:

In the following paper, please focus on the main idea rather than on the details of the math behind it:
November 1 Component analysis using factorial models. (Milos)

Readings:

  • same as above
November 6 Exponential family PCA. (Richard Pelikan)

Readings:

November 6 Support vector machines. Overview. (Milos)

Readings:

November 20 Kernels. (Dave Krebs)

Readings:

November 20 Kernels for structured data. (Joshua Albrecht)

Readings:

November 27 Clustering. (Milos)

Readings:

  • Lecture notes for CS 2750.
  • C. Bishop. Pattern recognition and machine learning. (chapters 9.1-9.3)
November 29 Spectral clustering. (Michal Valko)

Readings:


Course webpage for CS2750, the introductory Machine Learning course from Spring 2007. It is the prerequisite of CS3750.



Readings

Readings will be assigned before the class at which the discussion of the topic covered by the paper takes place. Most of the readings will be electronic; however, some will be in paper form or from the books. See the list of Readings for the different topics.



Paper discussions

Every student is expected to present 1-2 papers over the course of the semester and to lead the discussion of each paper. The papers will be distributed to students the same way as other readings. The assignment of papers will be discussed during the first week of the course.



Projects

There are no homework assignments or exams in this course. However, students will be asked to prepare, submit, and present two projects. The first project will be assigned and due in the middle of the semester. The final project (due at the end of the semester) is more flexible: students can choose their own topic to investigate. You will need to submit a short (one-page) proposal for the final project so that it can be approved and you can receive feedback. The final project must have a distinctive and non-trivial learning or adaptive component.



Last updated by milos on 07/03/2007