All students must validate 30 ECTS per semester. We offer two speciality tracks:

Track 1: Vision and Applications
The first track focuses on various applications of computer vision: biomedical applications, people detection, object tracking, computational photography.

Track 2: Vision and Devices
The second track focuses on devices to capture images (intelligent sensors, medical imaging systems) and to visualise and interact with them (augmented reality).




Course name: Applied Bayesian Methods
Credits: 6

  • Class type: Theory+Practice
  • Hours per week: 2+1
  • Type of exam: 50% lab assignments, 50% theory exam


The objective of this subject is to address machine-learning problems from a Bayesian perspective. Graphical models (GMs) will be introduced as probabilistic models in which dependence and independence relations between random variables are described in terms of a graph. Bayesian networks, a particular case of GMs, are especially useful for modelling conditional independences. Exact inference algorithms (such as variable elimination, sum-product and junction tree) will be addressed, together with the way they can be applied efficiently and the relation between inference and learning. More general approximate inference methods, either deterministic (e.g. variational inference or expectation propagation) or based on sampling and simulation (e.g. Markov chain Monte Carlo methods), will also be introduced in this course.
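As a minimal illustration of the prior/likelihood/posterior pipeline mentioned above (a sketch for orientation, not part of the course materials), the following computes a discrete posterior over a coin's unknown head-probability via Bayes' theorem:

```python
import numpy as np

# Hypothetical example: infer a coin's head-probability theta from data.
theta = np.linspace(0.01, 0.99, 99)       # grid of candidate theta values
prior = np.ones_like(theta) / len(theta)  # uniform prior over the grid

heads, tails = 7, 3                       # assumed observed data
likelihood = theta**heads * (1 - theta)**tails

posterior = prior * likelihood
posterior /= posterior.sum()              # normalise (Bayes' theorem denominator)

print(round(theta[np.argmax(posterior)], 2))  # MAP estimate: 0.7
```

With a uniform prior the posterior mode coincides with the maximum-likelihood estimate, heads/(heads+tails) = 0.7, which the grid search recovers.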

Detailed program:


  1. Probabilistic Reasoning
    1. Introduction to probability theory: Bayes theorem, marginals, conditional probabilities.
    2. Introduction to probabilistic reasoning: Prior, likelihood and posterior.
    3. Bayesian Networks fundamentals.
    4. Markov Networks fundamentals.
  2. Inference in Probabilistic Models
    1. Variable elimination.
    2. Sum-product algorithm.
    3. Junction tree algorithm.
  3. Learning in Probabilistic Models
    1. Maximum likelihood training of Bayesian Networks.
    2. Bayesian inference for Bayesian Networks.
    3. Expectation maximization, EM algorithm.
  4. Approximate Inference
    1. Loopy belief propagation.
    2. Deterministic methods: the Laplace approximation, variational inference, expectation propagation.
    3. Monte Carlo methods.
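As a small illustration of the exact-inference part of the programme (a sketch under assumed numbers, not course material), the following runs variable elimination on a hypothetical three-node chain A → B → C with binary variables; the conditional probability tables are invented for the example:

```python
import numpy as np

# Invented CPTs for a chain A -> B -> C (binary variables).
p_a = np.array([0.6, 0.4])           # P(A)
p_b_given_a = np.array([[0.7, 0.3],  # P(B | A=0)
                        [0.2, 0.8]]) # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],  # P(C | B=0)
                        [0.4, 0.6]]) # P(C | B=1)

# Variable elimination: sum out A to obtain P(B), then sum out B for P(C).
p_b = p_a @ p_b_given_a   # marginal over B, shape (2,)
p_c = p_b @ p_c_given_b   # marginal over C, shape (2,)

print(p_c.round(3))       # [0.65 0.35]
```

Eliminating variables one at a time in this way keeps every intermediate factor small, which is the efficiency argument behind the variable elimination and junction tree algorithms listed above.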

Recommended reading

  1. David Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press 2012.
  2. William M. Bolstad. Introduction to Bayesian Statistics. Wiley-Interscience, 2007.
  3. Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
  4. Daphne Koller and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
  5. Richard E. Neapolitan. Learning Bayesian Networks. Pearson Prentice Hall, 2004.
  6. David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
  7. P. J. Brockwell and R. A. Davis. Introduction to Time Series and Forecasting. Springer Texts in Statistics, 1996.
  • Lecturer (name, position, degree): Gonzalo Martínez Muñoz, Ph.D., Associate Professor
  • Additional lecturers, if any (name, position, degree): Alejandro Sierra Urrecho, Ph.D., Associate Professor; Daniel Hernández Lobato, Ph.D., Associate Professor

The European Credit Transfer and Accumulation System (ECTS) is a student-centred system based on the student workload required to achieve the objectives of a programme of study. Its aim is to facilitate the recognition of study periods undertaken abroad by mobile students through the transfer of credits. ECTS is based on the principle that 60 credits are equivalent to the workload of a full-time student during one academic year.