Instructor: **Milos Hauskrecht**

Computer Science Department

5329 Sennott Square

phone: x4-8845

e-mail: *milos_at_cs.pitt.edu*

- Judea Pearl. Probabilistic reasoning in intelligent systems: Networks of plausible inference. Morgan Kaufmann Publishers, Palo Alto, CA, USA, 1988.

- G. F. Cooper. The computational complexity of probabilistic inference using Bayesian belief networks. Artificial Intelligence, vol. 42, no. 2--3, pages 393--405, March 1990.
- P. Dagum and M. Luby. Approximating probabilistic inference in Bayesian belief networks is NP-hard. Artificial Intelligence, vol. 60, pages 141--153, 1993.
- D. Chickering. Learning Bayesian networks is NP-complete. Proceedings of AI and Statistics, 1995.

- Patrick Perez. Markov Random Fields and Images. CWI Quarterly, vol. 11 (4), 1998.

- R. Dechter. Bucket elimination.

- Lauritzen, S. and D. Spiegelhalter. Local computations with probabilities on graphical structures and their application to expert systems. J. Royal Statistical Society B, 50, pp. 157--224, 1988.

- G. Cooper. Bayesian Belief-Network Inference Using Recursive Decomposition. Technical Report SMI-90-0291, Stanford University, 1992.
- Adnan Darwiche. Recursive conditioning. Artificial Intelligence Journal, 2000.
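The exact-inference readings above all organize computation around the same two factor operations: multiplying factors and summing a variable out, bucket by bucket. A minimal sketch of that idea on a toy chain A -> B -> C (binary variables, invented CPT numbers chosen purely for illustration):

```python
from itertools import product

# A factor is (scope, table): scope is a tuple of variable names; table maps
# assignments (tuples of 0/1 values, in scope order) to floats.
def multiply(f1, f2):
    (s1, t1), (s2, t2) = f1, f2
    scope = s1 + tuple(v for v in s2 if v not in s1)
    table = {}
    for assign in product((0, 1), repeat=len(scope)):
        val = dict(zip(scope, assign))
        table[assign] = (t1[tuple(val[v] for v in s1)]
                         * t2[tuple(val[v] for v in s2)])
    return scope, table

def sum_out(f, var):
    scope, table = f
    idx = scope.index(var)
    new_scope = scope[:idx] + scope[idx + 1:]
    new_table = {}
    for assign, p in table.items():
        key = assign[:idx] + assign[idx + 1:]
        new_table[key] = new_table.get(key, 0.0) + p
    return new_scope, new_table

# Chain A -> B -> C with made-up CPTs.
pA = (('A',), {(0,): 0.6, (1,): 0.4})
pBA = (('A', 'B'), {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8})
pCB = (('B', 'C'), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6})

# Bucket order A then B: P(C) = sum_B P(C|B) sum_A P(A) P(B|A).
fB = sum_out(multiply(pA, pBA), 'A')   # message over B
pC = sum_out(multiply(fB, pCB), 'B')   # marginal over C: P(C=0)=0.65, P(C=1)=0.35
print(pC)
```

The elimination order (here A before B) is exactly the "bucket" ordering; on tree-structured networks any topological order keeps the intermediate factors small.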

- N. Friedman and M. Goldszmidt. Learning Bayesian networks with local structure. Uncertainty in AI, 1996.
- D. Heckerman. Similarity networks. Networks, vol. 20, 1990.
- D. Geiger and D. Heckerman. Beyond Bayesian networks: Similarity networks and Bayesian multinets. Artificial Intelligence, 82:45--74, 1996.
- D. Heckerman. A tractable inference algorithm for diagnosing multiple diseases. Technical Report KSL-89-36, 1989.
- Adnan Darwiche. A differential approach to inference in Bayesian networks.

- Rubinstein. Simulation and the Monte Carlo Method. 1981.
- David MacKay. Introduction to Monte Carlo methods.
- Andrieu et al. An introduction to MCMC for Machine Learning. Machine Learning, vol. 50, pp. 5--43, 2003.
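The simplest Monte Carlo inference that the readings above build on is forward (ancestral) sampling: draw each variable from its CPT given its already-sampled parents, and average. A toy sketch on an invented chain A -> B -> C with made-up CPT numbers:

```python
import random

random.seed(0)

def bernoulli(p):
    """Draw 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

def sample_chain():
    # Sample in topological order: parents before children.
    a = bernoulli(0.4)                 # P(A=1) = 0.4
    b = bernoulli(0.8 if a else 0.3)   # P(B=1 | A)
    c = bernoulli(0.6 if b else 0.1)   # P(C=1 | B)
    return c

n = 100_000
estimate = sum(sample_chain() for _ in range(n)) / n
print(estimate)  # the exact marginal P(C=1) works out to 0.35
```

With no evidence, plain forward sampling suffices; conditioning on observed variables is what motivates the weighting and Markov chain schemes (likelihood weighting, Gibbs sampling, MCMC) surveyed in the references.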

- Jonathan Yedidia, William Freeman, Yair Weiss. Constructing Free Energy Approximations and Generalized Belief Propagation Algorithms. Technical Report, MERL, 2004.

- Jordan, M., Ghahramani, Z., Jaakkola, T., and Saul, L. An introduction to variational methods for graphical models. In M. I. Jordan (Ed.), Learning in Graphical Models, 1998.
- Wim Wiegerinck. Variational Approximations between Mean Field Theory and the Junction Tree Algorithm. Proceedings of Uncertainty in AI, pages 626--633. Morgan Kaufmann, 2000.
- Hilbert J. Kappen, Wim J. Wiegerinck. Mean field theory for graphical models. In M. Opper and D. Saad, editors, Advanced Mean Field Methods -- Theory and Practice, chapter 4, pages 37--49. MIT Press, 2001.
- Z. Ghahramani and M. J. Beal. Graphical Models and Variational Methods. In M. Opper and D. Saad (eds.), Advanced Mean Field Methods --- Theory and Practice, MIT Press, 2001.
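As a deliberately tiny companion to the mean-field readings above, the sketch below runs naive mean-field fixed-point updates m_i = tanh(h_i + J m_j) on a two-spin Ising-style model p(x) proportional to exp(J x1 x2 + h1 x1 + h2 x2), x_i in {-1, +1}, and compares against the exact marginal by enumeration. The coupling J and fields h_i are invented for illustration.

```python
import math

J, h1, h2 = 0.5, 0.2, -0.1

# Naive mean-field: factorized q(x) = q1(x1) q2(x2), parameterized by the
# mean "magnetizations" m_i = E_q[x_i]; iterate the coupled fixed-point updates.
m1 = m2 = 0.0
for _ in range(100):
    m1 = math.tanh(h1 + J * m2)
    m2 = math.tanh(h2 + J * m1)

# Exact E[x1] by brute-force enumeration of the four joint states.
states = [(s1, s2) for s1 in (-1, 1) for s2 in (-1, 1)]
w = {s: math.exp(J * s[0] * s[1] + h1 * s[0] + h2 * s[1]) for s in states}
Z = sum(w.values())
exact_m1 = sum(s[0] * w[s] for s in states) / Z

print(m1, exact_m1)  # mean field overestimates the exact mean here
```

The gap between the two printed numbers is the point: mean field replaces the coupled distribution with the closest fully factorized one, which is exact only when the coupling vanishes.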

- Wray Buntine. A Guide to the Literature on Learning Probabilistic Networks From Data. 1996.
- Wray Buntine. Operations for learning with graphical models. Journal of AI Research, 1994.
- David Heckerman. A Tutorial on Learning with Bayesian Networks. 1996.

- A.P. Dempster, N.M. Laird, D.B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, vol. 39, issue 1, pp. 1--28, 1977.
- Jeff Bilmes. A gentle tutorial of the EM algorithm and its application to parameter estimation of Gaussian mixture and Hidden Markov models.
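The Bilmes tutorial above works EM through the Gaussian-mixture case; a minimal one-dimensional sketch of the E- and M-steps for a two-component mixture, with made-up data and starting values, might look like:

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_step(data, params):
    pi, mu1, var1, mu2, var2 = params
    # E-step: posterior responsibility of component 1 for each point.
    r = []
    for x in data:
        p1 = pi * normal_pdf(x, mu1, var1)
        p2 = (1 - pi) * normal_pdf(x, mu2, var2)
        r.append(p1 / (p1 + p2))
    # M-step: re-estimate mixing weight, means, and variances from the
    # responsibility-weighted data (small floor keeps variances positive).
    n1 = sum(r)
    n2 = len(data) - n1
    mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
    var1 = sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1 + 1e-6
    var2 = sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2 + 1e-6
    return (n1 / len(data), mu1, var1, mu2, var2)

# Two well-separated clusters of invented points.
data = [-2.1, -1.9, -2.3, -1.8, 2.0, 2.2, 1.9, 2.1]
params = (0.5, -1.0, 1.0, 1.0, 1.0)   # deliberately poor initial guess
for _ in range(50):
    params = em_step(data, params)
print(params)  # the two means converge near -2 and +2
```

Each iteration provably does not decrease the data log-likelihood, which is the key property the Dempster, Laird, and Rubin paper establishes in general.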

- Nir Friedman. The Bayesian structural EM algorithm. In Fourteenth Conf. on Uncertainty in Artificial Intelligence (UAI), 1998.
- Moninder Singh. Learning Bayesian networks from incomplete data. AAAI, 1997.
## Learning MRFs

- R. Jirousek and S. Preucil. On the effective implementation of the iterative proportional fitting procedure. Computational Statistics & Data Analysis, 19:177--189, 1995.
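The procedure the Jirousek and Preucil paper studies can be shown in a few lines: IPF alternately rescales a table so its row marginals, then its column marginals, match given targets, cycling until it converges. A minimal sketch on a 2x2 table with invented target marginals:

```python
# Start from a uniform 2x2 table and fit it to the target marginals below.
table = [[1.0, 1.0], [1.0, 1.0]]
row_targets = [0.3, 0.7]
col_targets = [0.6, 0.4]

for _ in range(20):
    # Fit row marginals: scale each row to its target sum.
    for i in range(2):
        s = sum(table[i])
        table[i] = [v * row_targets[i] / s for v in table[i]]
    # Fit column marginals: scale each column to its target sum.
    for j in range(2):
        s = table[0][j] + table[1][j]
        for i in range(2):
            table[i][j] *= col_targets[j] / s

print(table)  # both row sums and column sums now match the targets
```

Starting from the uniform table, the fixed point is the independent product of the target marginals; in MRF learning the same updates fit clique potentials to empirical clique marginals, one clique per step.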

Last updated by Milos on 09/16/2005