Announcements
The project grading rubric has been posted.
The presentation schedule has been posted.
HW4 is out and is due 4/18.
Overview
Course description: The course will cover the following topics: learning basics, unsupervised learning, supervised learning, classification, regression, clustering, dimensionality reduction, nearest neighbor classification, support vector machines, density estimation, Bayesian belief networks, Hidden Markov models, expectation maximization, decision trees, ensembles, deep learning, active and transfer learning, and information retrieval. The course will include many examples of how machine learning is used in computer vision. The homework assignments will have some bias towards applying machine learning techniques to computer vision problems and datasets. There will be two exams and a final project.
Prerequisites: Knowledge of matrices and linear algebra (CS 0280), probability (STAT 1151), statistics (STAT 1000), programming and algorithm development and analysis (CS 1501) or equivalent, or the permission of the instructor.
Programming: Homework assignments will be written in Matlab. The final project can be written in any language.
Textbook: Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006. book resources
Grading
Grading will be based on the following components:
- Homework (40%)
- Project (20%), comprising:
  - Status presentation and report (5%)
  - Final presentation and report (15%)
- Midterm exam (15%)
- Final exam (20%)
- Participation (5%)
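For concreteness, here is a minimal sketch of how a weighted final grade could be computed from these components, assuming the status (5%) and final (15%) presentation/report components together make up the Project's 20%, so the weights sum to 100%; all scores below are hypothetical:

```python
# Hedged sketch of the course grade computation. Assumes the two presentation
# components constitute the Project's 20% (weights then sum to 1.0).
weights = {
    "homework": 0.40,
    "status_presentation": 0.05,
    "final_presentation": 0.15,
    "midterm": 0.15,
    "final_exam": 0.20,
    "participation": 0.05,
}
# Hypothetical scores out of 100 for one student.
scores = {
    "homework": 90, "status_presentation": 95, "final_presentation": 85,
    "midterm": 80, "final_exam": 88, "participation": 100,
}
# Weighted sum over all components.
grade = sum(weights[k] * scores[k] for k in weights)
print(round(grade, 1))  # 88.1
```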
Homework
There will be four homework assignments. You will submit your homework using CourseWeb: navigate to the CourseWeb page for CS2750, then click on "Assignments" (on the left) and the corresponding homework number. Attach your written responses and code in a single zip file named YourFirstName_YourLastName.zip or YourFirstName_YourLastName.tar. Homework is due at 11:59pm on the due date.
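If it helps, the single-archive submission can be produced programmatically; here is a minimal sketch in Python (the folder and file names are hypothetical — substitute your own files, and your own name for Jane_Doe):

```python
# Hedged sketch: bundle written responses and code into a single archive
# named FirstName_LastName.zip, as the submission instructions ask.
import os
import shutil

# Set up a toy submission folder (in practice, this is your real homework folder).
os.makedirs("hw1_submission/code", exist_ok=True)
with open("hw1_submission/writeup.pdf", "w") as f:
    f.write("answers go here")

# Produce Jane_Doe.zip containing everything under hw1_submission/.
shutil.make_archive("Jane_Doe", "zip", root_dir="hw1_submission")
```

`shutil.make_archive` also supports the `"tar"` format if you prefer a .tar file.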
Project
Students are encouraged to work in groups of two (see exceptions below) for their final project. A project can be:
- design of a new method for some problem (which may or may not extend existing methods; you don't have to invent a new SVM, but can for example show an algorithm for solving some problem that incorporates or modifies SVMs or other classifiers organized and interleaved in some fashion)
- an implementation of some method discussed in class (or another method, with the instructor's approval; note that you should also discuss algorithmic choices and trade-offs)
- an application of techniques we studied in class (or another method, with instructor's approval) to a new problem that we have not discussed in class (note that (1) you should also describe why this problem is important and challenging, (2) I expect this to be a significant amount of work and not just a straight-up run of some package on existing data, and (3) you are allowed to use existing code for known methods)
- an implementation of a real working system (e.g. an app) that can solve some machine learning task (note you should also discuss design challenges and evaluate your system quantitatively and qualitatively with real users)
- experimental comparison of a number of existing techniques on a known problem and detailed discussion and analysis of the results (this one can only be done by students working individually)
- an extensive literature review and analysis on one of the topics covered in class (this one can only be done by students working individually)
- other (speak with the instructor)
Ideas: For computer vision project ideas, you can look at the list of datasets and tasks below for inspiration, or read some paper abstracts on this page. For NLP project ideas, see this page from Christopher Manning. Also look at the following list of project suggestions from Ray Mooney (but please do NOT contact any of the contacts given), this one from Carlos Guestrin, this one from Andreas Krause, and this one from Andrew Ng.
Timeline and deliverables: You will submit a 2-page project proposal in February and receive feedback from the instructor. In the proposal, describe what techniques and data you plan to use, and what existing work there is on the subject. In late March, you will present your progress to your classmates for feedback; describe your progress on the project and any problems encountered along the way. At the end of the semester, you will present your final project and submit a final project report using the CVPR latex template. The final report should resemble a conference paper and should include a clear problem definition and an argument for why the problem is important, an overview of related work, a detailed explanation of the approach, a well-motivated experimental evaluation (including a description of the setup), and a description of what each team member did. In the final presentation, describe your approach and experimental findings in a clear and engaging fashion. Please look at this project grading rubric.
All project written items are due at 11:59pm on CourseWeb. Status report presentations will be 4 minutes long and final presentations will be 8 minutes long. If you described something in the status report presentation, don't repeat it (except perhaps in one sentence) in the final presentation.
Exams
There will be both a midterm exam and a final exam (the latter of which will mostly focus on material from the second part of the class, but will be cumulative).
Participation
Students are expected to regularly attend the class lectures, and should actively engage in in-class discussions. Your participation grade will be based on how actively you participated in class. You can actively participate by, for example, responding to the instructor's or others' questions, asking questions or making meaningful remarks and comments about the lecture, or posting questions or responses on Piazza. You are also encouraged to bring in relevant articles you saw in the news.
Late Policy
You get 3 "free" late days, i.e., you can submit homework a total of 3 days late. For example, you can submit one problem set 12 hours late, and another 60 hours late. Once you've used up your free late days, you will incur a penalty of 25% of the total credit possible for that assignment for each late day. A late day is anything from 1 minute to 24 hours.
Collaboration Policy and Academic Honesty
You will do your work (exams and homework) individually. The work you turn in must be your own. You are allowed to discuss the problem sets with your classmates, but do not look at code they might have written for the problem sets. You are also not allowed to search for code on the internet, use solutions posted online unless you are explicitly allowed to look at those, or use Matlab's implementation if you are asked to write your own code. When in doubt about what you can or cannot use, ask the instructor! Plagiarism will cause you to fail the class and receive a disciplinary penalty. Please consult the University Guidelines on Academic Integrity.
Note on Disabilities
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services (DRS), 140 William Pitt Union, (412) 648-7890, firstname.lastname@example.org, (412) 228-5347 for P3 ASL users, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.
Note on Medical Conditions
If you have a medical condition which will prevent you from doing a certain assignment or coming to class, you must inform the instructor of this before the deadline. You must then submit documentation of your condition within a week of the assignment deadline.
Statement on Classroom Recording
To ensure the free and open discussion of ideas, students may not record classroom lectures, discussion and/or activities without the advance written permission of the instructor, and any such recording properly approved in advance can be used solely for the student's own private use.
|1/6||Basics||Introduction and administrivia||this website||[pptx] [pdf]||HW1 out|
|1/11||Matlab; ML tasks, notation, and challenges||Bishop Ch. 1, Sec. 3.2||[pptx] [pdf]|
|1/13||More Matlab; Bias-variance trade-off||[pptx] [pdf]|
|1/18||No class (MLK Day)|
|1/20||Bias-variance trade-off (cont'd)||Szeliski Sec. 4.1, Grauman/Leibe Ch. 1-3||[pptx] [pdf]|
|1/25||Unsupervised learning||Clustering||Bishop Ch. 9.1||[pptx] [pdf]|
|1/27||Dimensionality reduction||Bishop Sec. 12.1; Daume||[pptx] [pdf]|
|2/1||Intro and linear models||Nearest neighbors||Bishop Sec. 1.4, 2.5.2||[pptx] [pdf] [notes]||HW1 due; HW2 out|
|2/3||Nearest neighbors (cont'd); Linear algebra review|
|2/8||Review (cont'd)||Bishop Sec. 3.1||[pptx] [pdf]|
|2/10||Linear regression||Bishop Sec. 3.1||[pptx] [pdf]|
|2/15||Linear models for classification||Bishop Sec. 1.5, 4.1-3||[pptx] [pdf]|
|2/17||Linear models for classification (cont'd); Support vector machines||Bishop Sec. 6.1-2, 7.1||[pptx] [pdf]|
|2/22||Support vector machines (cont'd)|
|2/24||Support vector machines (optimization solution)||[notes]|
|2/29||Classification: Probabilistic models||Probability review; Density estimation||Bishop Sec. 1.2, 1.6, 2.1, 2.2, 2.3.4, 2.3.9, 2.5||[pptx] [pdf]||HW2 due; HW3 out|
|3/7||No class (spring break)|
|3/14||Bayesian belief networks||Bishop Sec. 8.1-2||[pptx] [pdf]|
|3/16||Bayesian belief networks (cont'd)|
|3/21||Markov random fields; Hidden Markov models||Bishop Sec. 8.3.1-2, 13.1-2 (skip 13.2.1-6); extra: Jurafsky/Martin||[ppt] [pdf]|
|3/23||Hidden Markov models (cont'd)||[pptx] [pdf]||HW3 due; HW4 out|
|3/28||Hidden Markov models (cont'd)||[pptx] [pdf]|
|3/30||Expectation maximization||Bishop Sec. 9.2||[pptx] [pdf]|
|4/4||Review (ChangSheng)||[pptx] [pdf]|
|4/6||Project status report presentations||status report due|
|4/11||Ensembles; bagging and boosting||Bishop Sec. 14.2, 14.3 (skip 14.3.1-2), 14.4||[pptx] [pdf]|
|4/13||Neural networks||Bishop Sec. 5.1, 5.2 (skip 5.2.2-3), 5.3.1-3, 5.5.2,3,6||[pptx] [pdf]|
|4/18||Neural networks (cont'd)|
|4/27||final report due Friday|
Resources
This course was inspired by the following courses:
- Machine Learning by Milos Hauskrecht, University of Pittsburgh, Spring 2015
- Introduction to Machine Learning by Dhruv Batra, Virginia Tech, Spring 2015
- Machine Learning by Tommi Jaakkola, MIT
- Machine Learning by Subhransu Maji, UMass Amherst, Spring 2015
- Machine Learning by Erik Sudderth, Brown University, Fall 2015
- Computer Vision by Kristen Grauman, UT Austin, Spring 2011
- Computer Vision by Derek Hoiem, UIUC, Spring 2015
- Natural Language Processing by Ray Mooney, UT Austin
- Matlab tutorial
- Linear algebra review by Fei-Fei Li
- Brief machine learning intro by Aditya Khosla and Joseph Lim
- Resources list compiled by Devi Parikh
- Microsoft COCO (Common Objects in Context) (object recognition, segmentation, image description)
- ImageNet (object recognition)
- SUN Database (scenes)
- Caltech-UCSD Birds 200 (fine-grained object recognition)
- MSRC Annotations (active learning)
- Animals with Attributes (attribute-based recognition)
- a-Pascal + a-Yahoo (attribute-based recognition)
- Shoes (attribute-based search)
- INRIA Movie Actions (action recognition)
- ADL (ego-centric action recognition)
- Action Quality (evaluating action quality)
- CarDb Historical Cars (style classification of cars)
- Recognizing Image Style (photographic style classification)
- Judd gaze (visual saliency prediction)
- Visual Persuasion (predicting subtle messages in images)
- VQA (visual question-answering)
- Recognition datasets list compiled by Kristen Grauman
- Human activity datasets list compiled by Chao-Yeh Chen
- LIBSVM (by Chih-Chung Chang and Chih-Jen Lin)
- SVM Light (by Thorsten Joachims)
- VLFeat (feature extraction, tutorials and more, by Andrea Vedaldi)
- GIST feature extraction (by Aude Oliva and Antonio Torralba)
- Caffe (deep learning code by Yangqing Jia et al.)