Time: Tuesday, Thursday
9:00am - 10:15am
Location: Sennott Square, Room 5313
Instructor: Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos_at_cs_pitt_edu
office hours: TBA
Graphical models: Bayesian Belief Networks, Markov Random Fields: introductory text
BBNs and MRFs conversions
Exact inference algorithms
Approximate inference: Monte Carlo methods:
Approximate inference: Variational methods:
Learning graphical models from data (not covered in Fall 2020)
Learning BBNs:
Learning MRFs:
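Exact inference in the simplest case can be illustrated by enumeration: sum the joint probability over the hidden variables and normalize. A minimal sketch for a hypothetical two-node network Rain -> WetGrass (all CPT values invented for illustration):

```python
# Hypothetical CPTs for a two-node Bayesian network Rain -> WetGrass.
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {True: 0.9, False: 0.1}  # P(WetGrass=true | Rain)

def posterior_rain_given_wet():
    # Inference by enumeration: joint P(Rain=r, Wet=true), then normalize.
    joint = {r: P_rain[r] * P_wet_given_rain[r] for r in (True, False)}
    z = sum(joint.values())  # evidence probability P(Wet=true)
    return {r: p / z for r, p in joint.items()}

post = posterior_rain_given_wet()
```

Enumeration is exponential in the number of hidden variables, which motivates the structured exact algorithms (e.g., variable elimination) and the approximate methods listed above.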
Data representation, generative models of data
Dimensionality reduction: PCA, Auto-encoders
Probabilistic latent variable models: Probabilistic PCA, Factor analysis, CVQ, NOCA
Modern deep generative models: Restricted Boltzmann machines, Variational autoencoders
Generative Adversarial Networks
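PCA, the first dimensionality-reduction technique above, is commonly computed via the SVD of the centered data matrix. A minimal NumPy sketch on random data (dimensions chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # hypothetical data: 100 samples, 5 features
Xc = X - X.mean(axis=0)            # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                  # project onto the top-k principal directions
```

The rows of Vt are the principal directions ordered by singular value, so the first coordinate of Z captures the most variance.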
Models for document analysis, information retrieval and link analysis
Singular value decomposition (SVD), Applications to: LSI, link analysis
Probabilistic latent semantic analysis (pLSA), Latent Dirichlet Allocation (LDA)
Word and word similarity models: word2vec, CBOW, graph-based models
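Latent semantic indexing rests on the truncated SVD of the term-document matrix; by the Eckart-Young theorem the rank-k truncation is the best rank-k approximation in Frobenius norm. A sketch on a tiny hypothetical count matrix:

```python
import numpy as np

# Hypothetical term-document count matrix (rows: terms, columns: documents).
A = np.array([[2, 0, 1, 0],
              [1, 0, 2, 0],
              [0, 3, 0, 1],
              [0, 1, 0, 2]], dtype=float)
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] * S[:k] @ Vt[:k]           # best rank-2 approximation of A
doc_vecs = (np.diag(S[:k]) @ Vt[:k]).T    # documents embedded in the latent space
```

Queries and documents are then compared in the k-dimensional latent space rather than raw term space.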
Time series and sequence models
Discrete-state probabilistic models: Markov models, Hidden Markov models, Dynamic Belief networks
Continuous-state probabilistic models: Autoregressive models, Linear Dynamical model (LDM)
Modern autoregressive models: Recurrent neural networks (LSTMs, GRUs)
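The hidden Markov model computations above center on the forward algorithm, which evaluates the likelihood of an observation sequence in time linear in its length. A sketch for a 2-state HMM (all parameters hypothetical):

```python
import numpy as np

pi = np.array([0.6, 0.4])                 # initial state distribution
A = np.array([[0.7, 0.3],                 # transition matrix A[i, j] = P(s'=j | s=i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],                 # emission matrix B[i, o] = P(obs=o | s=i)
              [0.2, 0.8]])
obs = [0, 1, 0]                           # observed symbol sequence

# Forward recursion: alpha[j] = P(obs[0..t], state_t = j).
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
likelihood = alpha.sum()                  # P(obs sequence)
```

The same recursion, with sums replaced by maxima, yields the Viterbi decoding of the most likely state path.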
Topics in deep learning
Self-attention mechanisms, transformers and all that
Convolutional neural networks
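The core of self-attention is scaled dot-product attention. A minimal single-head sketch in NumPy, with the hypothetical simplification that queries, keys, and values all equal the input (no learned projections):

```python
import numpy as np

def self_attention(X):
    # Scaled dot-product self-attention with identity projections (Q = K = V = X).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # pairwise similarity scores
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # row-wise softmax
    return w @ X, w                                  # outputs, attention weights

X = np.arange(12, dtype=float).reshape(4, 3)         # 4 tokens, 3-dim embeddings
Y, W = self_attention(X)
```

Each output row is a convex combination of the input rows, weighted by attention; full transformers add learned Q/K/V projections, multiple heads, and position information.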