DAMSL-212 Probabilistic Graphical Models

Type

Elective

Course Code

DAMSL-212

Teaching Semester

B semester

ECTS Credits

10

Syllabus

  • Introduction + Probability
  • Random variables and their distributions
  • Bayesian Inference / Frequentist Inference
  • Directed Graphical Models
  • Directed Graphical Models: Naive Bayes Classifier
  • Undirected Graphical Models
  • Exact Inference
  • Exact Inference (continued)
  • Monte Carlo Sampling
  • Learning PGMs: Parameter Learning
  • Learning PGMs: Structure Learning
  • Causality
  • Expectation-Maximization

Learning Outcomes

This is a graduate-level introduction to the principles of statistical inference with probabilistic models defined using graphical representations. Probabilistic graphical modeling and inference is a powerful modern approach to representing the combined statistics of data and models, reasoning about the world in the face of uncertainty, and learning about it from data. This course will provide a solid introduction to the methodology and associated techniques. 

The objective of this course is for students to develop a solid understanding of probabilistic graphical models and learn how to apply them to diverse problems. Students are expected to become familiar with the following concepts: Bayesian methodology, conditional independence, model selection, directed graphical models (Bayes nets), undirected graphical models (Markov random fields, factor graphs), exact inference on graphs using message passing, expressing model learning as inference, approximate inference for missing-value problems using expectation maximization (EM), variational inference, and sampling probability distributions using Markov chain Monte Carlo (MCMC). Specific topics include:

  1. Creating both directed and undirected graphical models for data.
  2. Identifying conditional independencies in graphical models.
  3. Specifying distributions for parameters of model components that link the model to data.
  4. Applying exact and approximate inference methods to compute marginal probabilities and maximally probable configurations given a model (the sum-product and max-sum algorithms, respectively, and Monte Carlo sampling methods).
  5. Applying approximate inference to learn model parameters using expectation maximization (EM algorithm) and variational inference.
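To make topics 1, 2, and 4 concrete, the sketch below shows the simplest instance of marginal inference in a directed graphical model: summing out a variable in a two-node Bayes net, which is the core operation behind the sum-product algorithm. The network (Rain → WetGrass) and its probability tables are hypothetical illustrations, not taken from the course materials.

```python
# Minimal directed graphical model: Rain -> WetGrass.
# All numbers are made up for illustration.

p_rain = {True: 0.2, False: 0.8}          # prior P(Rain)

p_wet_given_rain = {                       # CPT P(WetGrass | Rain)
    True:  {True: 0.9, False: 0.1},        # it rained
    False: {True: 0.3, False: 0.7},        # it did not rain
}

def marginal_wet(wet: bool) -> float:
    """P(Wet = wet) = sum_r P(Rain = r) * P(Wet = wet | Rain = r).

    This sum over the hidden variable is exactly the 'sum' step of
    sum-product message passing, specialized to a two-node chain.
    """
    return sum(p_rain[r] * p_wet_given_rain[r][wet] for r in (True, False))

print(marginal_wet(True))   # 0.2*0.9 + 0.8*0.3 = 0.42
```

On larger graphs the same idea is applied factor by factor, and the order in which variables are summed out (the elimination order) determines the cost; the course's message-passing material generalizes this one-line sum to trees and junction trees.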