This course is part of the Probabilistic Graphical Models Specialization

Offered by Stanford University

About this Course

4.7 (980 ratings • 221 reviews)

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems.
This course is the first in a sequence of three. It describes the two basic PGM representations: Bayesian Networks, which rely on a directed graph; and Markov networks, which use an undirected graph. The course discusses both the theoretical properties of these representations as well as their use in practice. The (highly recommended) honors track contains several hands-on assignments on how to represent some real-world problems. The course also presents some important extensions beyond the basic PGM representation, which allow more complex models to be encoded compactly.
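The core idea behind the Bayesian network representation described above is the chain rule: the joint distribution factorizes into a product of each variable's CPD given its parents. A minimal sketch in Python, using the classic rain/sprinkler/wet-grass network with made-up CPD numbers (all values here are hypothetical, chosen only to illustrate the factorization):

```python
# Bayesian network: Rain -> Sprinkler, Rain -> WetGrass, Sprinkler -> WetGrass.
# Each table below is one CPD; all probabilities are illustrative only.

p_rain = {True: 0.2, False: 0.8}                   # P(Rain)
p_sprinkler = {True:  {True: 0.01, False: 0.99},   # P(Sprinkler | Rain)
               False: {True: 0.40, False: 0.60}}
p_wet = {(True, True):   {True: 0.99, False: 0.01},  # P(Wet | Rain, Sprinkler)
         (True, False):  {True: 0.80, False: 0.20},
         (False, True):  {True: 0.90, False: 0.10},
         (False, False): {True: 0.00, False: 1.00}}

def joint(rain, sprinkler, wet):
    """Chain rule: P(r, s, w) = P(r) * P(s | r) * P(w | r, s)."""
    return p_rain[rain] * p_sprinkler[rain][sprinkler] * p_wet[(rain, sprinkler)][wet]

# Sanity check: the eight joint entries sum to 1.
total = sum(joint(r, s, w)
            for r in (True, False)
            for s in (True, False)
            for w in (True, False))
```

The point of the representation is compactness: three small CPDs (here 2 + 4 + 8 numbers) determine the full joint over all eight assignments.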

Start instantly and learn at your own schedule.

Reset deadlines in accordance with your schedule.

Suggested: 7 hours/week...

Subtitles: English

Bayesian Network · Graphical Model · Markov Random Field


Week 1

This module provides an overall introduction to probabilistic graphical models, and defines a few of the key concepts that will be used later in the course.

4 videos (Total 35 min), 1 quiz

Basic Definitions 8m

In this module, we define the Bayesian network representation and its semantics. We also analyze the relationship between the graph structure and the independence properties of a distribution represented over that graph. Finally, we give some practical tips on how to model a real-world situation as a Bayesian network.
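The relationship between graph structure and independence that this module analyzes is captured by d-separation: two nodes are conditionally independent given evidence Z whenever every trail between them is blocked. A sketch of the standard reachability algorithm (graph and node names here are illustrative), which finds active trails by tracking the direction from which each node is reached:

```python
from collections import deque

def d_separated(parents, x, y, z):
    """True if x and y are d-separated given evidence set z.

    `parents` maps each node to a list of its parents. This is a sketch of
    the active-trail reachability algorithm for Bayesian networks.
    """
    children = {n: [] for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].append(n)
    # Ancestors of the evidence (evidence included): observing a descendant
    # of a v-structure's collider activates that v-structure.
    anc, stack = set(), list(z)
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents[n])
    # BFS over (node, direction): 'up' = arrived from a child,
    # 'down' = arrived from a parent.
    visited, frontier = set(), deque([(x, 'up')])
    while frontier:
        n, d = frontier.popleft()
        if (n, d) in visited:
            continue
        visited.add((n, d))
        if n not in z and n == y:
            return False  # an active trail reaches y
        if d == 'up' and n not in z:
            frontier.extend((p, 'up') for p in parents[n])
            frontier.extend((c, 'down') for c in children[n])
        elif d == 'down':
            if n not in z:                 # non-collider step: keep descending
                frontier.extend((c, 'down') for c in children[n])
            if n in anc:                   # activated v-structure: go back up
                frontier.extend((p, 'up') for p in parents[n])
    return True
```

For example, in the chain X → M → Y, observing M blocks the trail; in the v-structure X → W ← Y, the trail is blocked until W is observed.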

15 videos (Total 190 min), 6 readings, 4 quizzes

Reasoning Patterns 9m

Flow of Probabilistic Influence 14m

Conditional Independence 12m

Independencies in Bayesian Networks 18m

Naive Bayes 9m

Application - Medical Diagnosis 9m

Knowledge Engineering Example - SAMIAM 14m

Basic Operations 13m

Moving Data Around 16m

Computing On Data 13m

Plotting Data 9m

Control Statements: for, while, if statements 12m

Vectorization 13m

Working on and Submitting Programming Exercises 3m

Setting Up Your Programming Assignment Environment 10m

Installing Octave/MATLAB on Windows 10m

Installing Octave/MATLAB on Mac OS X (10.10 Yosemite and 10.9 Mavericks) 10m

Installing Octave/MATLAB on Mac OS X (10.8 Mountain Lion and Earlier) 10m

Installing Octave/MATLAB on GNU/Linux 10m

More Octave/MATLAB resources 10m

Bayesian Network Fundamentals 6m

Bayesian Network Independencies 10m

Octave/Matlab installation 2m

Week 2

In many cases, we need to model distributions that have a recurring structure. In this module, we describe representations for two such situations. One is temporal scenarios, where we want to model a probabilistic structure that holds constant over time; here, we use Hidden Markov Models, or, more generally, Dynamic Bayesian Networks. The other is aimed at scenarios that involve multiple similar entities, each of whose properties is governed by a similar model; here, we use Plate Models.
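For the temporal case, the workhorse computation on an HMM is the forward algorithm, which sums out the hidden state step by step instead of enumerating all state sequences. A minimal two-state sketch (transition, emission, and prior numbers are all hypothetical):

```python
# Two hidden states, two observation symbols; all probabilities illustrative.
transition = [[0.7, 0.3],   # P(next state | current state)
              [0.4, 0.6]]
emission = [[0.9, 0.1],     # P(observation | state)
            [0.2, 0.8]]
prior = [0.5, 0.5]          # P(initial state)

def forward(observations):
    """Return P(o_1..o_T): propagate alpha[s] = P(o_1..o_t, S_t = s)."""
    alpha = [prior[s] * emission[s][observations[0]] for s in range(2)]
    for obs in observations[1:]:
        alpha = [emission[s][obs] *
                 sum(alpha[sp] * transition[sp][s] for sp in range(2))
                 for s in range(2)]
    return sum(alpha)
```

The same recurrence costs O(T · k²) for k states, versus O(kᵀ) for naive enumeration, which is the whole point of exploiting the recurring temporal structure.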

4 videos (Total 66 min), 1 quiz

Temporal Models - DBNs 23m

Temporal Models - HMMs 12m

Plate Models 20m

Template Models 20m

A table-based representation of a CPD in a Bayesian network has a size that grows exponentially in the number of parents. There are a variety of other forms of CPDs that exploit some type of structure in the dependency model to allow for a much more compact representation. Here we describe a number of the ones most commonly used in practice.
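The noisy-OR CPD is a standard example of such structure (independence of causal influence): each cause independently fails to trigger the effect with some probability, so k parents need only k + 1 parameters instead of a 2ᵏ-row table. A sketch with hypothetical leak and activation parameters:

```python
# Noisy-OR CPD with 3 binary causes; leak and activation values illustrative.
leak = 0.01                    # P(effect | no cause active)
activation = [0.8, 0.6, 0.9]   # per-cause probability of triggering the effect

def p_effect(causes):
    """P(Effect = 1 | causes), `causes` a tuple of 0/1 parent values.

    The effect stays off only if the leak and every active cause all
    fail to trigger it, independently.
    """
    p_off = 1 - leak
    for c, a in zip(causes, activation):
        if c:
            p_off *= (1 - a)
    return 1 - p_off
```

With 3 parents the saving is modest (4 numbers vs 8 rows), but with 30 parents the table form is intractable while noisy-OR still needs only 31 parameters.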

4 videos (Total 49 min), 3 quizzes

Tree-Structured CPDs 14m

Independence of Causal Influence 13m

Continuous Variables 13m

Structured CPDs 8m

BNs for Genetic Inheritance PA Quiz 22m

Week 3

In this module, we describe Markov networks (also called Markov random fields): probabilistic graphical models based on an undirected graph representation. We discuss the representation of these models and their semantics. We also analyze the independence properties of distributions encoded by these graphs, and their relationship to the graph structure. We compare these independencies to those encoded by a Bayesian network, giving us some insight on which type of model is more suitable for which scenarios.
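The semantics of a Markov network is a Gibbs distribution: the unnormalized measure is a product of factors, and dividing by the partition function Z turns it into a probability distribution. A minimal sketch over two binary variables with a single pairwise factor (factor values are hypothetical):

```python
from itertools import product

# One pairwise factor phi(A, B) over binary A, B; values illustrative.
phi = {(0, 0): 30.0, (0, 1): 5.0, (1, 0): 1.0, (1, 1): 10.0}

# Partition function: sum of the unnormalized measure over all assignments.
Z = sum(phi[assign] for assign in product((0, 1), repeat=2))

def p(a, b):
    """Gibbs distribution: P(a, b) = phi(a, b) / Z."""
    return phi[(a, b)] / Z
```

Unlike a Bayesian-network CPD, the factor entries are arbitrary non-negative affinities rather than probabilities, which is why the global normalization by Z is needed.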

7 videos (Total 106 min), 3 quizzes

General Gibbs Distribution 15m

Conditional Random Fields 22m

Independencies in Markov Networks 4m

I-maps and perfect maps 20m

Log-Linear Models 22m

Shared Features in Log-Linear Models 8m

Markov Networks 8m

Independencies Revisited 6m

Week 4

In this module, we discuss the task of decision making under uncertainty. We describe the framework of decision theory, including some aspects of utility functions. We then talk about how decision making scenarios can be encoded as a graphical model called an Influence Diagram, and how such models provide insight both into decision making and the value of information gathering.
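The core computation behind an influence diagram is maximum expected utility: for each action, sum the utility over states weighted by their probabilities, then pick the best action. A tiny sketch with hypothetical states, actions, and utilities:

```python
# Decision under uncertainty; all probabilities and utilities illustrative.
p_state = {'rain': 0.3, 'sun': 0.7}                 # P(state)
utility = {('rain', 'umbrella'): 70, ('rain', 'none'): 0,
           ('sun', 'umbrella'): 80, ('sun', 'none'): 100}

def expected_utility(action):
    """EU(a) = sum over states s of P(s) * U(s, a)."""
    return sum(p_state[s] * utility[(s, action)] for s in p_state)

# Maximum expected utility principle: choose the action with highest EU.
best = max(('umbrella', 'none'), key=expected_utility)
```

The value of information mentioned above falls out of the same computation: it is the gain in maximum expected utility from being allowed to condition the decision on an observation before acting.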

3 videos (Total 61 min), 3 quizzes

Decision Theory 8m

Decision Making PA Quiz 18m

4.7 · 221 Reviews

By ST•Jul 13th 2017

Prof. Koller did a great job communicating difficult material in an accessible manner. Thanks to her for starting Coursera and offering this advanced course so that we can all learn...Kudos!!

By CM•Oct 23rd 2017

The course was deep, and well-taught. This is not a spoon-feeding course like some others. The only downside were some "mechanical" problems (e.g. code submission didn't work for me).

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California on an 8,180-acre (3,310 ha) campus near Palo Alto, California, United States.


When will I have access to the lectures and assignments?

Once you enroll for a Certificate, you’ll have access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

Is financial aid available?

Learning Outcomes: By the end of this course, you will be able to

Apply the basic process of representing a scenario as a Bayesian network or a Markov network

Analyze the independence properties implied by a PGM, and determine whether they are a good match for your distribution

Decide which family of PGMs is more appropriate for your task

Utilize extra structure in the local distribution for a Bayesian network to allow for a more compact representation, including tree-structured CPDs, logistic CPDs, and linear Gaussian CPDs

Represent a Markov network in terms of features, via a log-linear model

Encode temporal models as a Hidden Markov Model (HMM) or as a Dynamic Bayesian Network (DBN)

Encode domains with repeating structure via a plate model

Represent a decision making problem as an influence diagram, and be able to use that model to compute optimal decision strategies and information gathering strategies

Honors track learners will be able to apply these ideas to complex, real-world problems

More questions? Visit the Learner Help Center.
