This course introduces the theoretical, philosophical, and mathematical foundations of Bayesian statistical inference. Students will learn to apply this foundational knowledge to real-world data science problems. Topics include the use and interpretation of probability in Bayesian inference; Bayes’ theorem for statistical parameters; conjugate, improper, and objective prior distributions; data science applications of Bayesian inference; and ethical implications of Bayesian statistics.



Introduction to Bayesian Statistics for Data Science

Instructor: Brian Zaharatos
What you'll learn
Implement Bayesian inference to solve real-world statistics and data science problems.
Articulate the logic of Bayesian inference and compare and contrast it with frequentist inference.
Utilize conjugate, improper, and objective priors to find posterior distributions.
Details to know

5 assignments
There are 5 modules in this course
This module introduces learners to Bayesian statistics by comparing Bayesian and frequentist methods. The introduction is motivated by an example illustrating how different assumptions about data collection, specifically stopping rules, can lead to different conclusions under frequentist methods, whereas Bayesian methods yield the same conclusion regardless of the stopping rule. This example illuminates a key philosophical difference between frequentist and Bayesian methods.
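The stopping-rule phenomenon can be checked numerically with a classic textbook example (an illustration, not necessarily the course's own example): 9 successes are observed in 12 Bernoulli trials, and we test H0: θ = 0.5 one-sided. The frequentist p-value depends on whether n = 12 was fixed in advance (binomial sampling) or sampling continued until the 3rd failure (negative binomial sampling), while under a uniform prior the Bayesian posterior is Beta(10, 4) either way. A minimal sketch in Python (the course itself works in R):

```python
from math import comb

# Data: 9 successes, 3 failures; test H0: theta = 0.5 (one-sided).

# Design 1: n = 12 trials fixed in advance (binomial sampling).
# p-value = P(X >= 9 | n = 12, theta = 0.5)
p_binomial = sum(comb(12, k) for k in range(9, 13)) / 2**12

# Design 2: sample until the 3rd failure (negative binomial sampling).
# p-value = P(#successes >= 9) = 1 - sum_{k=0}^{8} C(k+2, 2) * (1/2)^(k+3)
p_negbinomial = 1 - sum(comb(k + 2, 2) * 0.5 ** (k + 3) for k in range(9))

print(f"binomial p-value:          {p_binomial:.4f}")    # ~0.073
print(f"negative binomial p-value: {p_negbinomial:.4f}")  # ~0.033

# Bayesian analysis: under a Beta(1, 1) prior the posterior is
# Beta(1 + 9, 1 + 3) = Beta(10, 4) under BOTH designs -- the two
# likelihoods differ only by a constant, so the stopping rule drops out.
```

At the conventional 0.05 level the two designs reach opposite conclusions from the same 9-and-3 data, which is exactly the tension the module uses to motivate the Bayesian approach.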
What's included
8 videos · 4 readings · 1 assignment · 3 programming assignments · 1 discussion prompt · 2 ungraded labs
This module introduces learners to Bayesian inference through an example using discrete data. The example demonstrates how the posterior distribution is calculated and how uncertainty is quantified in Bayesian statistics. The module also describes methods for summarizing the posterior distribution and introduces learners to the posterior predictive distribution through Monte Carlo simulation. Monte Carlo simulation will also be important for more advanced computational Bayesian methods.
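Posterior predictive simulation follows a two-step recipe: draw a parameter from the posterior, then draw new data given that parameter. The Beta(10, 4) posterior and batch size of 5 below are hypothetical choices, not values from the course; a sketch in Python using only the standard library (the course itself works in R):

```python
import random

random.seed(42)

# Hypothetical posterior for a success probability: theta ~ Beta(10, 4).
A, B = 10, 4
N_NEW = 5          # size of a future batch of Bernoulli trials
N_SIMS = 100_000   # Monte Carlo sample size

draws = []
for _ in range(N_SIMS):
    theta = random.betavariate(A, B)                            # 1) draw theta
    y_new = sum(random.random() < theta for _ in range(N_NEW))  # 2) draw data
    draws.append(y_new)

# The Monte Carlo mean approximates E[y_new | data] = N_NEW * A / (A + B).
print(sum(draws) / N_SIMS)  # close to 5 * 10/14 = 3.571...
```

Because each simulated batch uses a fresh θ, the spread of `draws` reflects both sampling variability and posterior uncertainty about θ, which is the point of the posterior predictive distribution.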
What's included
6 videos · 1 assignment · 1 programming assignment · 2 ungraded labs
This module introduces learners to methods for conducting Bayesian inference when the likelihood and prior distributions come from a convenient family of distributions, called conjugate families. A conjugate family is a class of prior distributions for which the posterior distribution belongs to the same class. The module covers the beta-binomial, normal-normal, and inverse gamma-normal conjugate families and includes examples of applying them to find posterior distributions in R.
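For the beta-binomial family the conjugate update has a closed form: a Beta(a, b) prior combined with y successes in n trials yields a Beta(a + y, b + n − y) posterior. The course does these computations in R; the sketch below, in Python with hypothetical numbers, shows the same bookkeeping:

```python
def beta_binomial_update(a, b, y, n):
    """Posterior Beta parameters after observing y successes in n trials."""
    return a + y, b + n - y

# Hypothetical example: Beta(2, 2) prior, 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(2, 2, 7, 10)
post_mean = a_post / (a_post + b_post)

print(a_post, b_post)       # Beta(9, 5) posterior
print(round(post_mean, 3))  # posterior mean 9/14 = 0.643

# The posterior mean is a weighted average of the prior mean (0.5)
# and the sample proportion (0.7), with weights a + b and n.
```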
What's included
7 videos · 1 reading · 1 assignment · 1 programming assignment · 2 ungraded labs
This module motivates, defines, and utilizes improper and so-called "objective" prior distributions in Bayesian statistical inference.
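One standard example of an objective prior (offered here as an illustration, not necessarily the course's choice) is the Jeffreys prior for a binomial proportion, Beta(1/2, 1/2), which is proportional to the square root of the Fisher information. Because it is a proper Beta prior, the posterior stays in closed form; a minimal sketch in Python:

```python
def jeffreys_binomial_posterior(y, n):
    """Posterior under the Jeffreys prior Beta(1/2, 1/2) for a proportion."""
    return 0.5 + y, 0.5 + n - y

# Hypothetical data: 7 successes in 10 trials.
a, b = jeffreys_binomial_posterior(7, 10)
print(a, b)                   # Beta(7.5, 3.5) posterior
print(round(a / (a + b), 3))  # posterior mean 7.5/11 = 0.682
```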
What's included
7 videos · 1 reading · 1 assignment · 1 programming assignment · 2 ungraded labs
In this module, learners will be introduced to Bayesian inference involving more than one unknown parameter. Multiparameter problems are motivated with a simple example: a conjugate prior, two-parameter model involving normally distributed data. From there, learners solve more complex problems, including Bayesian linear regression and estimation of a variance-covariance matrix.
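In the simplest multiparameter regression setting, with a known noise variance σ² and a normal prior β ~ N(0, τ²I), the posterior for the coefficients is again normal, with precision XᵀX/σ² + I/τ² and mean obtained by solving that precision against Xᵀy/σ². A sketch in NumPy with hypothetical data (the course itself works in R):

```python
import numpy as np

# Hypothetical data lying exactly on y = 1 + 2x.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

sigma2 = 1.0   # known noise variance (assumed for this sketch)
tau2 = 1.0e6   # large prior variance -> weak (nearly flat) prior

# Posterior for beta | y is N(m, V) with
#   V^{-1} = X'X / sigma2 + I / tau2,   m = V (X'y / sigma2)
prec = X.T @ X / sigma2 + np.eye(2) / tau2
m = np.linalg.solve(prec, X.T @ y / sigma2)

print(np.round(m, 3))  # close to the least-squares fit [1, 2]
```

With a large τ² the prior is nearly flat and the posterior mean essentially reproduces least squares; shrinking τ² pulls the coefficients toward the prior mean of zero, which is the basic trade-off the module develops.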
What's included
9 videos · 1 reading · 1 assignment · 1 programming assignment · 3 ungraded labs