
Learner Reviews & Feedback for Bayesian Statistics: Mixture Models by University of California, Santa Cruz

About the Course

Bayesian Statistics: Mixture Models introduces you to an important class of statistical models. The course is organized into five modules, each of which contains lecture videos, short quizzes, background reading, discussion prompts, and one or more peer-reviewed assignments. Statistics is best learned by doing it, not just by watching a video, so the course is structured to help you learn through application. Some exercises require the use of R, a freely available statistical software package. A brief tutorial is provided, but we encourage you to take advantage of the many other resources online for learning R if you are interested. This is an advanced course, and it was designed to be the third in UC Santa Cruz's series on Bayesian statistics, after Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should have some knowledge of and comfort with calculus-based probability, principles of maximum-likelihood estimation, and Bayesian estimation....

1 - 1 of 1 Reviews for Bayesian Statistics: Mixture Models

By Rohit D

Jun 19, 2020

Bayesian Statistics: Mixture Models (BS3 for short)

As of June 2020, BS3 is a new class; it appears to have arrived on Coursera circa April 2020.

The class creators (Prof. Abel Rodriguez and others) have done an excellent job of pulling together the requisite theory (video lectures) and practice (assignments in R).

For most people, including those with a modest amount of training in statistics or computer science, this class will feel like an advanced class. To comprehend the material reasonably well, one needs to be familiar with Monte Carlo simulation (specifically Gibbs sampling) and a broad spectrum of probability distributions used in Bayesian statistics (Poisson, Beta, Gamma, Inverse-Gamma, Log-Normal, Dirichlet). The first two Bayesian Statistics classes cover most of these prerequisites well.

BS3 delves into two ways of estimating mixtures, namely Expectation-Maximization (EM) and Gibbs sampling, and compares results from the two approaches. BS3 does not stop at a Gaussian mixture of two univariate components. Through its assignments, the class motivates the need for other mixture models, such as the zero-inflated Poisson distribution, a mixture of exponential and log-normal distributions, and a mixture of multivariate Gaussian distributions.
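To give a flavor of the EM side, here is a minimal sketch (not taken from the course materials) of EM for a two-component univariate Gaussian mixture in R; the simulated data, starting values, and variable names are illustrative assumptions:

# Minimal EM sketch for a two-component univariate Gaussian mixture
# (illustrative only; data, initial values, and names are assumptions).
set.seed(1)
n <- 500
z <- rbinom(n, 1, 0.3)                          # latent component labels
x <- ifelse(z == 1, rnorm(n, 4, 1), rnorm(n, 0, 1))

w <- 0.5; mu <- c(-1, 1); sigma <- sd(x)        # initial guesses

for (iter in 1:100) {
  # E-step: posterior probability that each observation came from component 2
  d1 <- (1 - w) * dnorm(x, mu[1], sigma)
  d2 <- w * dnorm(x, mu[2], sigma)
  v  <- d2 / (d1 + d2)

  # M-step: update the weight, the means, and the common standard deviation
  w     <- mean(v)
  mu[1] <- sum((1 - v) * x) / sum(1 - v)
  mu[2] <- sum(v * x) / sum(v)
  sigma <- sqrt(sum((1 - v) * (x - mu[1])^2 + v * (x - mu[2])^2) / n)
}
c(w = w, mu1 = mu[1], mu2 = mu[2], sigma = sigma)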

Some assignments require manipulating hierarchical probability distributions with several techniques - maximum-likelihood estimation, identifying conjugate priors, simulation - simultaneously. Since the manipulations are coded in R and must produce a numerical result, typos and algebraic errors are unforgiving.
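For the Bayesian counterpart, a correspondingly minimal Gibbs-sampling sketch for a two-component Gaussian mixture might look as follows; the Beta and Normal priors, the known standard deviation, and all names are simplifying assumptions made here, not the course's exact model:

# Minimal Gibbs-sampling sketch with conjugate priors (illustrative only;
# the standard deviation is assumed known to keep the sketch short).
set.seed(2)
n <- 500
x <- c(rnorm(350, 0, 1), rnorm(150, 4, 1))      # simulated two-component data
sigma <- 1
w <- 0.5; mu <- c(-1, 1)
keep <- matrix(NA_real_, nrow = 2000, ncol = 3)

for (s in 1:2000) {
  # Sample the latent labels given the current parameters
  p1 <- (1 - w) * dnorm(x, mu[1], sigma)
  p2 <- w * dnorm(x, mu[2], sigma)
  z  <- rbinom(n, 1, p2 / (p1 + p2))

  # Conjugate updates: Beta posterior for the weight,
  # Normal posteriors for the component means (N(0, 10^2) priors assumed)
  w <- rbeta(1, 1 + sum(z), 1 + n - sum(z))
  for (k in 1:2) {
    xk    <- x[z == (k - 1)]
    prec  <- length(xk) / sigma^2 + 1 / 100
    mu[k] <- rnorm(1, (sum(xk) / sigma^2) / prec, sqrt(1 / prec))
  }
  keep[s, ] <- c(w, mu)
}
colMeans(keep[1001:2000, ])                     # posterior means after burn-in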

The class organizers chose to have graded assignments (six in all) peer-reviewed. The peer review requirement can feel like a constraint for a class that is relatively new and advanced, and thus has low attendance.

It took me ~60 hours to complete this class over approximately two weeks. Ideally, I would have preferred to spread the course out over the recommended five weeks, but life constraints dictated otherwise. Even so, the effort is well worth it. I am walking away with a much better appreciation of Bayesian statistics in general and mixture models in particular.