Applied Bayesian methods

Semester
Second semester: UAM, Madrid
Credit
6
Class Type
Practical, Theory
Type of exam
50% Lab assignments, 50% theory exam
Prerequisites (if any)
Lecturer
Gonzalo Martínez Muñoz, Ph.D., Associate Professor
Hours per week
2+1

Content

The objective of this subject is to address machine-learning problems from a Bayesian perspective. Graphical models (GMs) will be introduced as probabilistic models in which dependence and independence relations between random variables are described in terms of a graph. Bayesian networks, a particular case of GMs, are especially useful for modeling conditional independences. Exact inference algorithms (such as variable elimination, sum-product and the junction tree algorithm) will be studied, together with how they can be applied efficiently and how inference relates to learning. More general approximate inference methods will also be introduced, either deterministic (e.g. variational inference or expectation propagation) or based on sampling and simulation (e.g. Markov chain Monte Carlo methods).
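
As an informal illustration of the probabilistic reasoning covered in the course, the following minimal Python sketch (not part of the official course material; all numbers are hypothetical) applies Bayes' theorem to a two-hypothesis diagnostic problem, computing a posterior distribution from a prior and a likelihood:

    # Bayesian updating on a discrete hypothesis space (hypothetical numbers).
    prior = {"disease": 0.01, "healthy": 0.99}           # P(hypothesis)
    likelihood = {"disease": 0.95, "healthy": 0.05}      # P(positive test | hypothesis)

    # Bayes' theorem: the posterior is proportional to prior * likelihood,
    # normalised by the evidence P(positive test).
    evidence = sum(prior[h] * likelihood[h] for h in prior)
    posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}

    print(posterior)   # P(disease | positive test) is roughly 0.16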

Detailed program:

  1. Probabilistic Reasoning
    1. Introduction to probability theory: Bayes' theorem, marginal and conditional probabilities.
    2. Introduction to probabilistic reasoning: Prior, likelihood and posterior.
    3. Bayesian Networks fundamentals.
    4. Markov Networks fundamentals.
  2. Inference in Probabilistic Models
    1. Variable elimination.
    2. Sum-product algorithm.
    3. Junction tree algorithm.
  3. Learning in Probabilistic Models
    1. Maximum likelihood training of Bayesian Networks.
    2. Bayesian inference for Bayesian Networks.
    3. Expectation maximization, EM algorithm.
  4. Approximate Inference
    1. Loopy belief propagation.
    2. Deterministic methods: the Laplace approximation, variational inference, expectation propagation.
    3. Monte Carlo methods (see the sketch after this list).
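
As an informal illustration of the sampling-based methods in block 4, the sketch below (again not course material; the data, proposal width and chain length are hypothetical) uses a random-walk Metropolis-Hastings sampler, one of the Markov chain Monte Carlo methods mentioned above, to approximate the posterior mean of a coin's bias under a uniform prior; the exact Beta posterior provides a sanity check:

    # Random-walk Metropolis-Hastings sampler for the bias theta of a coin,
    # given hypothetical data of 7 heads and 3 tails and a uniform Beta(1, 1) prior.
    # The exact posterior is Beta(1 + heads, 1 + tails), so the sampled mean
    # can be compared against the analytic value.
    import math
    import random

    random.seed(0)
    heads, tails = 7, 3                          # hypothetical observations

    def log_posterior(theta):
        """Unnormalised log posterior: uniform prior times Bernoulli likelihood."""
        if not 0.0 < theta < 1.0:
            return -math.inf
        return heads * math.log(theta) + tails * math.log(1.0 - theta)

    samples = []
    theta = 0.5                                  # initial state of the Markov chain
    for step in range(20000):
        proposal = theta + random.gauss(0.0, 0.1)             # random-walk proposal
        log_accept = log_posterior(proposal) - log_posterior(theta)
        if random.random() < math.exp(min(0.0, log_accept)):  # accept/reject step
            theta = proposal
        if step >= 2000:                                       # discard burn-in
            samples.append(theta)

    print("MCMC posterior mean :", sum(samples) / len(samples))
    print("Exact posterior mean:", (1 + heads) / (2 + heads + tails))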


Recommended reading

David Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.
William M. Bolstad. Introduction to Bayesian Statistics. Wiley-Interscience, 2007.
Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
Daphne Koller and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
Richard E. Neapolitan. Learning Bayesian Networks. Pearson Prentice Hall, 2004.
David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
P. J. Brockwell and R. A. Davis. Introduction to Time Series and Forecasting. Springer Texts in Statistics, 1996.

Additional lecturers, if any (name, position, degree): Alejandro Sierra Urrecho, Ph.D., Associate Professor; Daniel Hernández Lobato, Ph.D., Associate Professor
