Time: Tuesday, Thursday
1:00pm-2:15pm
Location: Sennott Square, Room 5313
Instructor: Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos at cs pitt edu
office hours: Tuesday 11:00am-12:30pm, Thursday 3:00-4:30pm
TA: Yanbing Xue
Computer Science Department
5324 Sennott Square
phone:
e-mail: yax14 at pitt edu
office hours: Tuesday 3:30-5:00pm, Wednesday 10:00am-11:30am
Course description
The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, the design of adaptive web agents, and others.
This introductory machine learning course will give an overview of the models and algorithms used in machine learning, including linear regression and classification models, multi-layer neural networks, support vector machines, Bayesian belief networks, mixture models, clustering, ensemble methods, and reinforcement learning. The course will give the student the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Students will have an opportunity to experiment with machine learning solutions on various datasets in homework assignments and in the context of a term project.
Prerequisites
Knowledge of calculus, linear algebra, probability (CS 1151), statistics (CS 1000), and programming (CS 1501) or equivalent, or the permission of the instructor.
| Lectures | Topic(s) | Assignments |
|---|---|---|
| January 8 | Introduction to Machine Learning. Readings: Bishop: Chapter 1 | |
| January 10 | Designing a learning system. Readings: Bishop: Chapter 1; Daume: Chapters 1 and 2 | |
| January 15 | Matlab tutorial. Readings: Matlab tutorial files | |
| January 17 | Designing a learning system II. Readings: Bishop: Chapter 1; Daume: Chapters 1 and 2 | Homework assignment 1 (Data for assignment 1) |
| January 22 | Density estimation I. Readings: Bishop: Chapter 2 | |
| January 24 | Density estimation II. Readings: Bishop: Chapter 2 | Homework assignment 2 (Data for assignment 2) |
| January 29 | Density estimation III. Readings: Bishop: Chapter 2 | |
| January 31 | Linear regression. Readings: Bishop: Chapter 3 | Homework assignment 3 (Data/Programs for assignment 3) |
| February 5 | Linear models for classification. Readings: Bishop: Chapter 4 | |
| February 7 | Classification models II. Readings: Bishop: Chapter 4 | Homework assignment 4 (Data/Programs for assignment 4) |
| February 12 | Evaluation of classifiers. Support vector machines I. Readings: Bishop: Chapter 4, Chapter 7.1 | |
| February 14 | Support vector machines II. Readings: Bishop: Chapter 7.1 | Homework assignment 5 (Data/Programs for assignment 5) |
| February 19 | Multilayer neural networks. Readings: Bishop: Chapters 5.1-3, 5.5 | |
| February 21 | Multiclass classification + decision trees. Readings: Bishop: Chapter 14.4 | Homework assignment 6 (Data/Programs for assignment 6) |
| February 26 | Bayesian belief networks I (basics). Readings: Bishop: Chapters 8.1-2 | |
| February 28 | Bayesian belief networks II (independences). Readings: Bishop: Chapters 8.1-2 | |
| March 5, 2019 | Bayesian belief networks III (learning, inference). Readings: Bishop: Chapters 8.1-4 | Homework assignment 7 (Data/Programs for assignment 7) |
| March 7, 2019 | Midterm exam. Readings: Bishop, lecture notes | |
| March 19, 2019 | Bayesian belief networks IV. Readings: Bishop: Chapters 8.1-4 | |
| March 21, 2019 | Expectation-Maximization. Mixture of Gaussians. Readings: Bishop: Chapter 9.2 | Homework assignment 8 (Data/Programs for assignment 8) |
| March 26, 2019 | Clustering. Readings: Bishop: Chapter 9.1 | |
| March 28, 2019 | Clustering. Readings: Bishop: Chapter 9.1 | Homework assignment 9 (Data/Programs for assignment 9) |
| April 2 | Feature selection, dimensionality reduction. Readings: Bishop: Chapter 12.1 | |
| April 4 | Learning with multiple models: mixture of experts, bagging, and boosting. Readings: Bishop: Chapter 14 | Homework assignment 10 (Data/Programs for assignment 10) |
| April 9 | Reinforcement learning I | |
| April 11 | Reinforcement learning II | |
| April 16 | Exam II | |
| April 23 | Project presentations I | |
| April 25 | Project presentations II | |
Homeworks
The homework assignments will consist of a mix of theoretical and programming problems. Programming assignments will require you to implement in Matlab some of the learning algorithms covered in the lectures and to experiment with them on various real-world datasets. See the rules for the submission of programs.
The assignment reports and programs should be submitted electronically via Course web. Assignments are due at the beginning of class on the day specified on the assignment. In general, no extensions will be granted.
Collaborations:
No collaboration on homework assignments, programs, and exams is permitted unless you are specifically instructed to work in groups.
Term projects
The term project is due at the end of the semester and accounts for a significant portion of your grade.
Matlab
Matlab is a mathematical tool for numerical computation and data manipulation, with excellent graphing capabilities. It provides a great deal of support for the tasks you will need to carry out in machine learning experiments. CSSD at Pitt offers free student Matlab licenses; to obtain a license, please check the following link to the Matlab CSSD page. In addition, Pitt has a number of Matlab licenses running on both Unix and Windows platforms.
Other Matlab resources on the web:
Online MATLAB documentation
Online Mathworks documentation, including MATLAB toolboxes
Cheating policy: Cheating and any other anti-intellectual behavior, including giving your work to someone else, will be dealt with severely and will result in a Fail (F) grade. If you feel you may have violated the rules, speak to us as soon as possible. Please make sure you read, understand, and abide by the Academic Integrity Code for the School of Computing and Information (SCI).
Students With Disabilities:
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services (DRS), 216 William Pitt Union, (412) 648-7890, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.
Course webpage from Spring 2018