CS2750  Machine Learning (ISSP 2170)


Time:  Monday, Wednesday 1:00-2:15pm, 
Location: Sennott Square, Room 5313


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos at cs pitt edu
office hours: Monday 2:30-4:00pm


TA:  Michael Moeng
Computer Science Department
5802 Sennott Square
phone: (510) 684-7416
e-mail: moeng at cs pitt edu
office hours: Tuesday 11:00am-12:00pm and 2:00-4:00pm


Announcements !!!!!



Links

Course description
Lectures
Homeworks
Term projects
Matlab



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including, for example, text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, the design of adaptive web agents, and others.

This introductory machine learning course will give an overview of many models and algorithms used in modern machine learning, including linear models, multi-layer neural networks, support vector machines, density estimation methods, Bayesian belief networks, mixture models, clustering, ensemble methods, and reinforcement learning. The course will give the student the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Students will have an opportunity to experiment with machine learning techniques and apply them to a selected problem in the context of a term project.
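To give a flavor of what experimenting with these techniques looks like, here is a minimal Matlab sketch (not course code; the synthetic data and parameter values are made up for illustration) that fits a one-dimensional Gaussian density to data by maximum likelihood, one of the first topics in the schedule below:

    % Minimal density estimation sketch: maximum-likelihood fit of a 1-D Gaussian.
    % The data below are synthetic; in the homeworks the data will be provided.
    data = 5 + 2*randn(1000, 1);            % 1000 samples drawn from N(5, 4)
    mu_hat = mean(data);                    % ML estimate of the mean
    sigma2_hat = mean((data - mu_hat).^2);  % ML estimate of the variance
    fprintf('estimated mean %.2f, variance %.2f\n', mu_hat, sigma2_hat);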

Course syllabus

Prerequisites

Knowledge of matrices and linear algebra (CS 0280), probability (CS 1151), statistics (CS 1000), programming (CS 1501) or equivalent, or the permission of the instructor.



Textbook:
Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.

Other very useful books:

Lectures
 
 
Lectures  Topic(s)  Assignments
January 3 Introduction.

Readings: Bishop: Chapter 1

January 11 Designing a learning system

Readings: Lecture notes, Bishop: Chapter 1

January 13 Matlab tutorial Homework 1 (Data for the assignment)
January 25 Density estimation

Readings: Bishop: Chapter 2

January 27 Density estimation II

Readings: Bishop: Chapter 2

Homework 2 (Data for the assignment)
February 1 Density estimation III, Linear regression

Readings: Bishop: Section 2.4, Section 3.1. (see also Chapter 1)

February 3 Linear regression

Readings: Bishop: Section 3.1. (see also Chapter 1)

Homework 3 (Data for the assignment)
February 15 Classification. Logistic regression. Generative classification model.

Readings: Bishop: Sections 4.1-4.3.

Solutions to homework 1
Solutions to homework 2
February 17 Classification. Generative classification model. GLIMs.

Readings: Bishop: Sections 4.1-4.3.

Homework assignment 4
Data for the assignment
February 22 Evaluation of classifiers. Multilayer neural networks.

Readings: Bishop: Section 5.1, 5.3

February 24 Learning linear classification models. Support vector machines I.

Readings: Bishop: Sections 4.1, 7.1.

Homework assignment 5
Data for the assignment
March 1 Support vector machines II.

Readings: Bishop: Section 7.1.

March 3 Support vector machines for regression.

Readings: Bishop: Section 7.1.

Homework assignment 6
Data for the assignment
March 15 Multiway classification. Decision trees.

Readings: Bishop: 4.3., 14.4.

March 22 Bayesian belief networks.

Readings: Bishop: Sections 8.1-8.2.

March 24 BBN: inference and parameter learning.

Readings: Bishop: Section 8.4.

Homework assignment 7
Data for the assignment
March 29 BBN: structure learning.

Readings: Bishop: Section 8.4.

March 31 BBN: learning with hidden variables and missing values. Expectation maximization.

Readings: Bishop: Chapter 9.

Homework assignment 8
Data for the assignment
April 5 Clustering

Readings: Bishop: Chapter 9.

April 7 Clustering

Readings: lecture notes

April 12 Dimensionality reduction

Readings: lecture notes, Bishop: Section 12.1 (PCA)

April 14 Ensemble methods

Readings: Bishop: 14.5., 14.2.

April 19 Boosting

Readings: Bishop: 14.2-3.

April 21 Term project guidelines

Readings:



Homeworks

The homework assignments will mostly have the character of small projects and will require you to implement some of the learning algorithms covered in the lectures. Programming assignments will be implemented in Matlab. See the rules for the submission of programs.

The assignments (both written and programming parts) are due at the beginning of the class on the day specified on the assignment. In general, no extensions will be granted.

Collaborations: No collaboration on homework assignments, programs, or exams is permitted unless you are specifically instructed to work in groups.
 



Term projects

The term project is due at the end of the semester and accounts for a significant portion of your grade. You can choose your own problem topic. You will be asked to write a short proposal for the purpose of approval and feedback. The project must have a distinctive and non-trivial learning or adaptive component. In general, a project may consist of a replication of previously published results, the design and testing of new learning methods, or an application of machine learning to a domain or problem of your interest.



Matlab

Matlab is a mathematical tool for numerical computation and manipulation, with excellent graphing capabilities. It provides a great deal of support for the things you will need to run machine learning experiments. CSSD at Pitt offers $5 student licenses for Matlab. To obtain the license, please check the following link to the Matlab CSSD page. In addition, Pitt has a number of Matlab licenses running on both Unix and Windows platforms. See the following web page for the details.
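For a sense of how a small learning experiment looks in Matlab, here is a minimal sketch (using only built-in functions, no toolboxes; the data are synthetic and made up for illustration) that fits a linear regression model by least squares and plots the fit:

    % Minimal linear regression sketch: least-squares fit to synthetic 1-D data.
    n = 50;                          % number of training examples
    x = linspace(0, 1, n)';          % inputs
    y = 2*x + 1 + 0.1*randn(n, 1);   % targets: a noisy linear function of x
    X = [ones(n, 1) x];              % design matrix with a column of ones (bias)
    w = X \ y;                       % least-squares weight vector [bias; slope]
    yhat = X * w;                    % predictions on the training inputs
    plot(x, y, 'o', x, yhat, '-');   % training data and the fitted line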

Matlab tutorial file.

Other Matlab resources on the web:

Online MATLAB  documentation
Online Mathworks documentation including MATLAB toolboxes


Cheating policy: Cheating and any other anti-intellectual behavior, including giving your work to someone else, will be dealt with severely and will result in a Fail (F) grade. If you feel you may have violated the rules, speak to us as soon as possible. Please make sure you read, understand, and abide by the Academic Integrity Code for the Faculty and College of Arts and Sciences.

Students With Disabilities:
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services, 216 William Pitt Union, (412) 648-7890/(412) 383-7355 (TTY), as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.


Course webpages from Spring 2007, Spring 2004 and Spring 2003



Last updated by Milos on 12/31/2009