Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems.
This course is part of the Probabilistic Graphical Models Specialization
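As a concrete (and invented) illustration of what such a model encodes, here is a minimal Python sketch of a three-variable chain-structured Bayesian network; the variables and probabilities are made up and the code is not part of the course materials. The joint distribution is written as a product of small local factors, and a marginal query is answered by brute-force summation, which is the kind of computation the inference algorithms in this course are designed to carry out efficiently.

```python
# A minimal sketch (not from the course): a three-variable chain Bayesian network
# P(A, B, C) = P(A) * P(B | A) * P(C | B), with made-up probabilities for binary
# variables. It shows, in the smallest possible case, what a PGM encodes: a joint
# distribution written as a product of local factors.

from itertools import product

# Local factors (conditional probability tables), indexed by variable values 0/1.
p_a = {0: 0.6, 1: 0.4}                               # P(A)
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3,
               (0, 1): 0.2, (1, 1): 0.8}             # P(B=b | A=a), keyed by (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1,
               (0, 1): 0.5, (1, 1): 0.5}             # P(C=c | B=b), keyed by (c, b)

def joint(a, b, c):
    """Joint probability from the factorization P(A) P(B|A) P(C|B)."""
    return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

# Brute-force marginal P(C=1): sum the joint over all values of A and B.
p_c1 = sum(joint(a, b, 1) for a, b in product([0, 1], repeat=2))
print(f"P(C=1) = {p_c1:.4f}")
```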
Skills you will gain
- Inference
- Gibbs Sampling
- Markov Chain Monte Carlo (MCMC)
- Belief Propagation
Syllabus - What you will learn from this course
- Inference Overview
- Variable Elimination (a minimal sketch of this idea follows the list)
- Belief Propagation Algorithms
- MAP Algorithms
- Sampling Methods
- Inference in Temporal Models
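As a rough illustration of variable elimination, the sketch below (again invented for illustration rather than taken from the course) works on the same chain-structured network as above: instead of enumerating every joint assignment, each variable is summed out in turn, producing small intermediate factors.

```python
# A minimal sketch of variable elimination (not the course's implementation; the
# model and numbers are invented) on a chain-structured network A -> B -> C.
# Rather than summing the full joint over every assignment, variables are summed
# out one at a time, passing a small intermediate factor along.

p_a = {0: 0.6, 1: 0.4}                               # P(A)
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3,
               (0, 1): 0.2, (1, 1): 0.8}             # P(B=b | A=a), keyed by (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1,
               (0, 1): 0.5, (1, 1): 0.5}             # P(C=c | B=b), keyed by (c, b)

# Eliminate A: tau_1(b) = sum_a P(a) * P(b | a)
tau_1 = {b: sum(p_a[a] * p_b_given_a[(b, a)] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: P(c) = sum_b tau_1(b) * P(c | b)
p_c = {c: sum(tau_1[b] * p_c_given_b[(c, b)] for b in (0, 1)) for c in (0, 1)}

print(f"P(C=1) = {p_c[1]:.4f}")  # matches the brute-force sum in the earlier sketch
```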
Reviews
- 5 stars: 71.33%
- 4 stars: 21.12%
- 3 stars: 5.23%
- 2 stars: 1.04%
- 1 star: 1.25%
TOP REVIEWS FROM PROBABILISTIC GRAPHICAL MODELS 2: INFERENCE
Great introduction.
It would be great to have more examples included in the lectures and slides.
Great course! Expect to spend significant time reviewing the material.
I would have liked to complete the honors assignments; unfortunately, I'm not fluent in Matlab. Otherwise, great course!
I learned a great deal from this course. It answered questions left over from the Representation course and deepened my understanding of PGMs.
Learning Outcomes: By the end of this course, you will be able to take a given PGM and apply the inference algorithms covered here, such as variable elimination, belief propagation, MAP inference, and sampling methods, to answer probability queries over it.