CS1675  Introduction to Machine Learning


Time: Tuesday, Thursday 9:30am-10:45am
Location: Sennott Square (SENSQ), Room 5129


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos at cs pitt edu
office hours: Wednesday: 9:30am-11:00am, Thursday: 3:00pm-4:30pm

TA:  Jeongmin Lee
Computer Science Department
5324 Sennott Square
phone: x4-8455
e-mail: jlee at cs pitt edu
office hours: Tuesday: 2:00pm-5:00pm, Wednesday: 4:00pm-5:00pm, Thursday: TBA

Recitations


Announcements



Links

Course description
Lectures
Homeworks
Matlab



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, the design of adaptive web agents, and others.

This introductory machine learning course will give an overview of the models and algorithms used in modern machine learning, including linear models, multi-layer neural networks, support vector machines, density estimation methods, Bayesian belief networks, clustering, ensemble methods, and reinforcement learning. The course will give students the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Through homework assignments, students will have an opportunity to experiment with many machine learning techniques and apply them to various real-world datasets.

Course syllabus

Prerequisites

STAT 1000, 1100, or 1151 (or equivalent), and CS 1501, or the permission of the instructor.



Textbook: Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.

Additional text: Hal Daumé III. A Course in Machine Learning.

Other ML readings (optional):

Lectures
 
 
Lectures  Topic(s)  Assignments
January 8 Introduction to Machine Learning.

Readings: Bishop: Chapter 1

January 10 Designing a learning system

Readings: Bishop: Chapter 1, Daume: Chapters 1 and 2

January 15 Math for ML: Review

Readings: Bishop: Appendix C

January 17 Designing a learning system II

Readings: Bishop: Chapter 1, Daume: Chapters 1 and 2

Homework assignment 1 ( Data for assignment 1)
January 22 Probabilities: review, Density estimation

Readings:

January 24 Density estimation I

Readings: Bishop: Chapter 2

Homework assignment 2 ( Data for assignment 2)
January 29 Density estimation II

Readings: Bishop: Chapter 2

January 31 Density estimation III

Readings: Bishop: Chapter 2

Homework assignment 3 ( Data for assignment 3)
February 5 Linear regression

Readings: Bishop: Chapter 3

February 7 Linear regression II

Readings: Bishop: Chapter 3

Homework assignment 4 ( Data for assignment 4)
February 12 Linear models for classification

Readings: Bishop: Chapter 4

February 14 Generative models for classification

Readings: Bishop: Chapter 4

Homework assignment 5 ( Data for assignment 5)
February 19 Support vector machines

Readings: Bishop: Chapter 7.1

February 21 Multilayer Neural Networks

Readings: Bishop: Chapters 5.1-3, 5.5.

Homework assignment 6 ( Data for assignment 6)
February 26 Multilayer Neural Networks
Multiclass classification

Readings: Bishop: Chapters 5.1-3, 5.5.

February 28 Multiclass classification + Decision Trees

Readings: Bishop: Chapter 14.4.

March 5 Bayesian Belief Networks

Readings: Bishop: Chapter 8.1-4

Homework assignment 7 ( Data/Programs for assignment 7)
March 7, 2019 Midterm exam

Readings: Bishop, Lecture notes, Recitation notes

March 19, 2019 Bayesian Belief Networks II (Independences)

Readings: Bishop: Chapter 8.1-4

March 21 Bayesian Belief Networks III (Learning and Inference)

Readings: Bishop: Chapter 8.1-4

Homework assignment 8 ( Data/Programs for assignment 8)
March 26 Bayesian Belief Networks (Monte Carlo Inference)

Readings: Bishop: Chapter 8.1-4

March 28 Clustering

Readings: Bishop: Chapter 9.1

Homework assignment 9 ( Data/Programs for assignment 9)
April 2 Feature selection, Dimensionality reduction

Readings: Bishop Chapter 12.1.

April 4 Learning with multiple models: mixture of experts, and bagging.

Readings: Bishop Chapter 14

Homework assignment 10 ( Data/Programs for assignment 10)
April 9 Learning with multiple classifiers: boosting

Readings: Bishop Chapter 14

April 11 Reinforcement learning I

Readings: Kaelbling, Littman, Moore. Reinforcement Learning: a survey

April 16 Reinforcement learning II

Readings: Kaelbling, Littman, Moore. Reinforcement Learning: a survey

April 18 Reinforcement learning III

Readings: Kaelbling, Littman, Moore. Reinforcement Learning: a survey




Homeworks

The homework assignments will consist of a mix of theoretical and programming problems. The programming part will require you to implement in Matlab some of the learning algorithms covered in the lectures. See the rules for the submission of programs.

The assignments (both theoretical and programming parts) are due at the beginning of the class on the day specified on the assignment. In general, no extensions will be granted.

Collaborations: No collaboration on homework assignments, programs, and exams is permitted unless you are specifically instructed to work in groups.



Matlab

Matlab is a mathematical tool for numerical computation and manipulation, with excellent graphing capabilities. It provides a great deal of support for the kinds of computations you will need to run machine learning experiments. CSSD at the University of Pittsburgh offers free student licenses for Matlab; for information on how to obtain a license, please check the following link to the Matlab CSSD page. In addition, Matlab is available for use in the university computing labs. See the CSSD web page for details.
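
To give a feel for what a typical experiment looks like in Matlab, here is a minimal sketch (not taken from any course assignment; the synthetic data and variable names are made up for illustration, and only standard Matlab functions are used). It fits a least-squares linear regression model to noisy data and plots the result:

  % Toy example: least-squares linear regression on synthetic data
  rng(0);                            % fix the random seed for reproducibility
  x = linspace(0, 1, 50)';           % 50 input points in [0, 1]
  y = 2*x + 0.5 + 0.1*randn(50, 1);  % noisy targets from a made-up linear model
  X = [ones(50, 1), x];              % design matrix with a bias (intercept) column
  w = X \ y;                         % least-squares weights via the backslash operator
  plot(x, y, 'o'); hold on;          % scatter the training data
  plot(x, X*w, '-');                 % overlay the fitted line
  xlabel('x'); ylabel('y'); title('Least-squares linear fit');

The same pattern (build a design matrix, estimate the parameters, evaluate and plot the fit) carries over to most of the models listed in the lecture schedule.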

Other Matlab resources on the web:

Online MATLAB  documentation
Online Mathworks documentation including MATLAB toolboxes


Grading: Your grade for the course will be determined as follows:


Cheating policy: Cheating and any other anti-intellectual behavior, including giving your work to someone else, will be dealt with severely and will result in a failing (F) grade. If you feel you may have violated the rules, speak to us as soon as possible. Please make sure you read, understand, and abide by the Academic Integrity Code of the University of Pittsburgh and the School of Computing and Information (SCI).

Students With Disabilities:
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services (DRS), 140 William Pitt Union, (412) 648-7890, drsrecep@pitt.edu, (412) 228-5347 for P3 ASL users, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.



Last updated by Milos on 01/08/2019