The objective of this course is to address machine learning problems from a Bayesian perspective. Graphical models (GMs) will be introduced as probabilistic models in which dependence and independence relations between random variables are described in terms of a graph. Bayesian networks, a particular case of GMs, are especially useful for modelling conditional independences. Exact inference algorithms (such as variable elimination, sum-product and junction tree) will be covered, together with how they can be applied efficiently and how inference relates to learning. More general approximate inference methods, either deterministic (e.g. variational inference or expectation propagation) or based on sampling and simulation (e.g. Markov chain Monte Carlo methods), will also be introduced.
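To give a flavour of the exact inference techniques mentioned above, the following is a minimal, hypothetical sketch of variable elimination on a toy three-node Bayesian network A → B → C (all probability values are illustrative, not course material):

```python
# Variable elimination on the chain A -> B -> C: compute P(C)
# by summing out A, then B. Factors are plain dicts mapping
# variable assignments (0/1) to probabilities.

# P(A), P(B|A), P(C|B) for binary variables.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (0, 1): 0.3,   # key (a, b): P(b|a)
               (1, 0): 0.2, (1, 1): 0.8}
p_c_given_b = {(0, 0): 0.9, (0, 1): 0.1,   # key (b, c): P(c|b)
               (1, 0): 0.5, (1, 1): 0.5}

# Eliminate A: phi(b) = sum_a P(a) * P(b|a)
phi_b = {b: sum(p_a[a] * p_b_given_a[(a, b)] for a in (0, 1))
         for b in (0, 1)}

# Eliminate B: P(c) = sum_b phi(b) * P(c|b)
p_c = {c: sum(phi_b[b] * p_c_given_b[(b, c)] for b in (0, 1))
       for c in (0, 1)}

print(p_c)  # a valid distribution: the two values sum to 1
```

The key idea, which the course develops in full generality, is that summing variables out one at a time keeps each intermediate factor small, rather than building the full joint distribution over all variables at once.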
For more information, please download the teaching guide HERE.