Overview
Course description: The course will cover the following topics: learning basics, unsupervised learning, supervised learning, classification, regression, clustering, dimensionality reduction, nearest neighbor classification, support vector machines, neural networks, density estimation, Bayesian belief networks, hidden Markov models, expectation maximization, and decision trees and ensembles. There will be homework assignments, two exams, and a final project.
Prerequisites: CS1501, MATH 0280, STAT 1151, STAT 1000. The expectation is that you can program and analyze the efficiency and performance of programs. You should also be able to compute derivatives of functions. Further, some experience with linear algebra (matrix and vector operations) and probability is expected.
Piazza: Sign up for it here. Note that we will use Piazza for two main purposes: (1) announcements, and (2) classmate-to-classmate discussion of homework problems, etc. The instructor will monitor Piazza infrequently; questions for the instructor or TA should be asked during office hours.
Programming languages: For homework assignments, you can use Matlab or Python. For the course project, you can use any language of your choice.
Textbooks: We will have required readings from two textbooks. In some cases, the readings will be overlapping, but it helps to read two phrasings of the same idea. In other cases, one reading is more complete or sometimes more intuitive than the other.
- Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
- Kevin P. Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012. (full text available online through the Pitt library)
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer, 2009. (available online on the second author's page)
- David Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012. (available online on the author's page)
Policies
Grading
Grading will be based on the following components:
- Homework assignments (4 assignments x 10% each = 40%)
- Course project (25%)
- Midterm and final exam (15% midterm + 15% final = 30%)
- Participation (5%)
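As a quick illustration of how the weights above combine into a final grade, here is a minimal Python sketch. The weights come from the breakdown stated in this syllabus; the component scores below are made-up placeholders, not real grades.

```python
# Grade weights as stated in the syllabus:
# 4 homeworks x 10% = 40%, project 25%, midterm 15%, final 15%, participation 5%.
WEIGHTS = {
    "homework": 0.40,
    "project": 0.25,
    "midterm": 0.15,
    "final": 0.15,
    "participation": 0.05,
}

def course_grade(scores):
    """Weighted average of component scores, each a fraction in [0, 1]."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Made-up example scores, for illustration only:
example = {"homework": 0.90, "project": 0.85, "midterm": 0.80,
           "final": 0.88, "participation": 1.00}
print(course_grade(example))
```

Note that the weights sum to 1, so a perfect score on every component yields exactly 100%.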
Homework Submission Mechanics
You will submit your homework using CourseWeb. Navigate to the CourseWeb page for CS2770, then click on "Assignments" (on the left) and the corresponding homework ID. Your written answers should be a single .pdf/.doc/.docx file. Your source code should be a single zip file (also including images/results if requested). Name the file YourFirstName_YourLastName.[extension]. Please comment your code! Homework is due at 11:59pm on the due date. Grades will be posted on CourseWeb.
Exams
There will be one in-class midterm exam, and a final exam which will focus on material from the latter part of the course. There will be no make-up exams unless you or a close relative is seriously ill!
Participation
Students are expected to regularly attend the class lectures, and should actively engage in in-class discussions. Attendance will not be taken, but keep in mind that if you don't attend, you cannot participate. You can actively participate by, for example, responding to the instructor's or others' questions, asking questions or making meaningful remarks and comments about the lecture, and answering others' questions on Piazza. You are also encouraged to bring in relevant articles you saw in the news.
Late Policy
On your programming assignments only, you get 3 "free" late days, counted in minutes; i.e., you can submit a total of 72 hours late. For example, you can submit one homework 12 hours late and another 60 hours late. Once you've used up your free late days, you will incur a penalty of 25% of the total possible assignment credit for each late day. A late day is anything from 1 minute to 24 hours. Note that this policy does not apply to components of the project.
Collaboration Policy and Academic Honesty
You will do your work (exams and homework) individually. The only exception is the project, which can be done in pairs. The work you turn in must be your own. You are allowed to discuss the assignments with your classmates, but do not look at code they might have written for the assignments, or at their written answers. You are also not allowed to search for code on the internet, to use solutions posted online unless you are explicitly allowed to, or to use Matlab's or Python's built-in implementation if you are asked to write your own code. When in doubt about what you can or cannot use, ask the instructor! Plagiarism will cause you to fail the class and receive a disciplinary penalty. Please consult the University Guidelines on Academic Integrity.
Note on Disabilities
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services (DRS), 140 William Pitt Union, (412) 648-7890, drsrecep@pitt.edu, (412) 228-5347 for P3 ASL users, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.
Note on Medical Conditions
If you have a medical condition which will prevent you from doing a certain assignment, you must inform the instructor of this before the deadline. You must then submit documentation of your condition within a week of the assignment deadline.
Statement on Classroom Recording
To ensure the free and open discussion of ideas, students may not record classroom lectures, discussion and/or activities without the advance written permission of the instructor, and any such recording properly approved in advance can be used solely for the student's own private use.
Project
A project can be:
- Type A: design of a new method for an existing problem, or an application of techniques we studied in class (or another method) to a new problem that we have not discussed in class
- Type B: experimental comparison of a number of existing techniques on a known problem and detailed discussion and analysis of the results
- Type C: an extensive literature review and analysis on one of the topics covered in class
- Proposal -- Not graded, but must be submitted. Aim for at least 2 pages. The better thought-out the proposal is, the more feedback the instructor can give. Also think about what data and/or code you will use.
- Draft (5% of final grade) -- This should be structured like your final report and include all the sections the final report will have (although some may still be incomplete), showing as much progress as you can. The expectation is that by this point you have done 1/3 to 1/2 of the required work for this project.
- Presentation (5% of final grade) -- Aim to be clear, enthusiastic and concise. You need to submit (on CourseWeb) the presentation file on the day of your presentation for the instructor's reference.
- Final report (15% of final grade) -- Use an existing conference template. The final report should resemble a conference paper and should include (as applicable) a clear problem definition and an argument for why the problem is important, an overview of related work, a detailed explanation of the approach, a well-motivated experimental evaluation (including a description of the setup), and a description of what each team member did.
- Students are encouraged to work in groups of two for their final project. The only exception is the literature review, which can only be done by students working individually.
- You are encouraged to use any external expertise you might have (e.g. biology, physics, etc.) so that your project makes the best use of areas you know well, and is as interesting as possible.
- Combining your final project for this class and another class is generally permitted, but the project proposal and final report should clearly outline what part of the work was done to get credit in this class, and the instructor should approve the proposed breakdown of work between this and another class.
- The final report should be self-contained, i.e. the instructor should not have to read any other papers to understand what you did.
- All project written items are due at 11:59pm on CourseWeb.
- The project should include some amount of novelty. For example, you cannot just re-implement an existing paper or project. You should come up with a new method, or apply an existing method for a new problem.
- Do not rely on data collection to be the novel component of your work. If you are proposing to tackle a new problem, you might need to collect data, but while this is a contribution, it will not be enough to earn a good project grade. You still have to come up with a solid method idea, i.e. your project has to have sufficient technical novelty.
- You must show that your method is in some sense quantitatively better than at least some relatively recent existing methods. For example, you can show that your method achieves superior accuracy on some prediction task compared to prior methods, or that it achieves comparable accuracy but is faster. Within the limited timespan of a course project, this outcome is not guaranteed, so whether or not you outperform the state of the art will only be a small component of your grade. Further, if you propose a sufficiently interesting method, rather than an extremely simple one, it will be less of a problem if your method does not outperform other existing approaches to the problem.
- Each of the following components will be graded: how well you introduced and motivated the problem in your presentation and final report; how well you researched and presented the relevant work in the area you are tackling; how technically solid and novel your method is; how well you experimentally tested your method, and analytically discussed your experimental findings; how well you were able to draw conclusions from your work and discuss potential future work to further improve on the problem you proposed to tackle.
- You are allowed to use existing code for known methods, but again, notice that your project is expected to be a significant amount of work and not just a straight-up run of some package.
- This type of project has the highest chance of turning into a published workshop or conference paper.
- Even for this type of project, you should present a very brief literature review during your presentation, so your classmates know the space in which you are working.
- Good sources for learning about what work has been done in your domain of interest include search engines, Google Scholar, and arxiv.org.
- What you are proposing to do should not already have been done in another published paper (including papers on arxiv.org).
- You have to properly introduce and motivate the problem you chose to study (i.e. why it is important, and why it is challenging).
- For experimental comparisons, you still need to present a detailed literature review for the topic at hand. You must review and include detailed descriptions (in your final report) of at least 10 papers. If code is not available for most of the papers you chose, you need to experimentally compare at least 3 papers. When implementing a paper without available code, you do not have to follow the paper in every detail, but your implementation should be faithful to it "in spirit". You should implement (rather than use existing code for) at least one of the methods you compare against; e.g., you might use code for 3 papers and implement 1 additional paper. Make sure to include a careful justification of why these are the papers you chose to implement. Make sure to also include a detailed analysis of the strengths and weaknesses of each paper you chose to compare, based both on the published papers and on the experimental findings you collected over the course of the project.
- For literature reviews, your final report should include at least 20 references. It should show a sensible organization of these references and contain at least one paragraph of detail about each paper, with at least five sentences describing the method in each of the referenced works. Make sure to describe both the technical details and the experimental techniques used in each of the papers you present, and to discuss some strengths and weaknesses of each paper you include in your review. Also include a synthesis/summary of what the community has accomplished on the problem you chose to study, grouped by the themes of the papers, and what future work might be.
- Literature reviews can only be done in teams of one.
- For computer vision project ideas, you can look at the list of datasets and tasks below for inspiration, or read some paper abstracts on this page.
- For NLP project ideas, see this page from Christopher Manning.
- Also look at the following list of project suggestions from Ray Mooney (but please do NOT contact any of the contacts given), this one from Carlos Guestrin, this one from Andreas Krause, and this one from Andrew Ng.
Schedule
Date | Chapter | Topic | Readings | Lecture slides | Due |
--- | --- | --- | --- | --- | --- |
1/5 | Intro | Introduction | Murphy Ch. 1, Bishop Sec. 1.3 | pptx pdf | |
1/10 | | Linear algebra and Matlab | | pptx pdf | |
1/12 | | | | | |
1/17 | Unsupervised learning | Clustering | Bishop Sec. 9.1, Murphy Sec. 11.4.2.5, 25.1, 25.5.1 | pptx pdf | |
1/19 | | | | | HW1 out |
1/24 | | Dimensionality reduction | Bishop Sec. 12.1, Murphy Sec. 12.2 | pptx pdf notes | |
1/26 | Regression | Line fitting + bias-variance | Bishop Sec. 1.1, 3.2 | pptx pdf | |
1/31 | | Linear regression | Bishop Sec. 3.1, Murphy Sec. 7.1-7.5, Sec. 13.3 | pptx pdf notes | |
2/2 | Classification: intro and linear models | Nearest neighbors | Bishop Sec. 2.5.2, 1.4 | pptx pdf | |
2/7 | | Linear models for classification | Bishop Sec. 1.5, 4.1-3, Murphy Ch. 8 | pptx pdf notes | |
2/9 | | | | | HW1 due, HW2 out |
2/14 | | | | | |
2/16 | | Support vector machines | Bishop Sec. 6.1-2, 7.1, Murphy Sec. 14.5 | pptx pdf notes | |
2/21 | | | | | |
2/23 | | Midterm exam | | | |
2/28 | Classification: non-linear models | Neural networks | Bishop Sec. 5.1, 5.2 (skip 5.2.2-3), 5.3.1-3, 5.5.2,3,6, Murphy Sec. 16.5 | pptx pdf | |
3/2 | | Convolutional neural networks | Karpathy's notes, Module 2 | pptx pdf | HW2 due; HW3 out |
3/7 | | Spring break (no class) | | | |
3/9 | | | | | |
3/14 | | Recurrent neural networks | blog1, blog2 | pptx pdf | project proposal due |
3/16 | | Ensemble methods; decision trees | Bishop Sec. 14.2, 14.3 (skip 14.3.1-2), 14.4, Murphy Sec. 16.2 | pptx pdf | |
3/21 | | | | | |
3/23 | Classification: probabilistic models | Probability review; density estimation | Bishop Sec. 1.2, 1.6, 2.1, 2.2, 2.3.4, 2.3.9, 2.5, Murphy Sec. 2.1-2.3, 2.4.1, 2.4.5, 2.5.1, 2.6.3, 2.8, 3.2, 3.3.3, 3.4.4.1 | pptx pdf | |
3/28 | | Directed graphical models (Bayesian networks) | Bishop Sec. 8.1-2, Murphy Ch. 10 | pptx pdf | HW3 due; HW4 out |
3/30 | | | | | |
4/4 | | Markov random fields; inference in graphical models | Bishop Sec. 8.3, 8.4.1 | pptx pdf | |
4/6 | | Hidden Markov models | Bishop Sec. 13.1, Jurafsky/Martin | pptx pdf | |
4/11 | | Expectation maximization | Bishop Sec. 9.2, 9.3 (skip 9.3.3-4), Murphy Ch. 11 | pptx pdf | project draft due |
4/13 | | Other topics | | pptx pdf | HW4 due 4/16 |
4/18 | Projects and exam | Project presentations (schedule) | | | |
4/20 | | | | | |
4/25 | | | | | project final report due |
4/27 | | Final exam | | | |
Resources
This course was inspired by the following courses:
- Machine Learning by Milos Hauskrecht, University of Pittsburgh, Spring 2015
- Introduction to Machine Learning by Dhruv Batra, Virginia Tech, Spring 2015
- Machine Learning by Tommi Jaakkola, MIT
- Machine Learning by Subhransu Maji, UMass Amherst, Spring 2015
- Machine Learning by Erik Sudderth, Brown University, Fall 2015
- Computer Vision by Kristen Grauman, UT Austin, Spring 2011
- Computer Vision by Derek Hoiem, UIUC, Spring 2015
- Natural Language Processing by Ray Mooney, UT Austin
- Matlab tutorial
- Linear algebra review by Fei-Fei Li
- Brief machine learning intro by Aditya Khosla and Joseph Lim
- Resources list (including code and data, tutorials, and other related courses) compiled by Devi Parikh
- Microsoft COCO (Common Objects in Context) (object recognition, segmentation, image description)
- ImageNet (object recognition)
- SUN Database (scenes)
- Caltech-UCSD Birds 200 (fine-grained object recognition)
- MSRC Annotations (active learning)
- Animals with Attributes (attribute-based recognition)
- a-Pascal + a-Yahoo (attribute-based recognition)
- Shoes (attribute-based search)
- INRIA Movie Actions (action recognition)
- ADL (ego-centric action recognition)
- Action Quality (evaluating action quality)
- CarDb Historical Cars (style classification of cars)
- Recognizing Image Style (photographic style classification)
- Judd gaze (visual saliency prediction)
- Visual Persuasion (predicting subtle messages in images)
- VQA (visual question-answering)
- Recognition datasets list compiled by Kristen Grauman
- Human activity datasets list compiled by Chao-Yeh Chen
- LIBSVM (by Chih-Chung Chang and Chih-Jen Lin)
- SVM Light (by Thorsten Joachims)
- VLFeat (feature extraction, tutorials and more, by Andrea Vedaldi)
- Caffe (deep learning code by Yangqing Jia et al.)