About this Course

193,598 recent views

Learner Career Outcomes

44% started a new career after completing these courses
43% got a tangible career benefit from this course
18% got a pay increase or promotion
Shareable Certificate
Earn a Certificate upon completion
100% online
Start instantly and learn on your own schedule.
Flexible deadlines
Reset deadlines in accordance with your schedule.
Approx. 22 hours to complete
English

Skills you will gain

Linear Regression, Ridge Regression, Lasso (Statistics), Regression Analysis


Offered by

University of Washington

Syllabus - What you will learn from this course

Content rating: 94% thumbs up (19,110 ratings)
Week 1

Welcome

1 hour to complete
5 videos (Total 20 min), 3 readings
5 videos
What is the course about? (3m)
Outlining the first half of the course (5m)
Outlining the second half of the course (5m)
Assumed background (4m)
3 readings
Important Update regarding the Machine Learning Specialization (10m)
Slides presented in this module (10m)
Reading: Software tools you'll need (10m)
4 hours to complete

Simple Linear Regression

4 hours to complete
25 videos (Total 122 min), 5 readings, 2 quizzes
25 videos
Regression fundamentals: data & model (8m)
Regression fundamentals: the task (2m)
Regression ML block diagram (4m)
The simple linear regression model (2m)
The cost of using a given line (6m)
Using the fitted line (6m)
Interpreting the fitted line (6m)
Defining our least squares optimization objective (3m)
Finding maxima or minima analytically (7m)
Maximizing a 1d function: a worked example (2m)
Finding the max via hill climbing (6m)
Finding the min via hill descent (3m)
Choosing stepsize and convergence criteria (6m)
Gradients: derivatives in multiple dimensions (5m)
Gradient descent: multidimensional hill descent (6m)
Computing the gradient of RSS (7m)
Approach 1: closed-form solution (5m)
Approach 2: gradient descent (7m)
Comparing the approaches (1m)
Influence of high leverage points: exploring the data (4m)
Influence of high leverage points: removing Center City (7m)
Influence of high leverage points: removing high-end towns (3m)
Asymmetric cost functions (3m)
A brief recap (1m)
5 readings
Slides presented in this module (10m)
Optional reading: worked-out example for closed-form solution (10m)
Optional reading: worked-out example for gradient descent (10m)
Download notebooks to follow along (10m)
Fitting a simple linear regression model on housing data (10m)
2 practice exercises
Simple Linear Regression (30m)
Fitting a simple linear regression model on housing data (30m)
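The week's two fitting approaches — the closed-form least squares solution and gradient descent on the residual sum of squares — can be sketched in NumPy. The data below is a hypothetical toy housing set (square footage in thousands vs. price in $1000s), not the course's dataset, and the step size and iteration count are tuned only for this example.

```python
import numpy as np

# Hypothetical toy data: square footage (in 1000s) vs. price (in $1000s).
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([200.0, 280.0, 370.0, 450.0, 520.0])

# Approach 1: closed-form least squares for y ~ w0 + w1 * x.
# Slope = covariance(x, y) / variance(x); intercept from the means.
w1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
w0 = y.mean() - w1 * x.mean()

# Approach 2: gradient descent on the residual sum of squares (RSS).
# Each step moves against the gradient with respect to intercept and slope.
a0, a1, eta = 0.0, 0.0, 0.05
for _ in range(10000):
    resid = y - (a0 + a1 * x)
    a0 += eta * np.sum(resid)      # negative gradient w.r.t. intercept
    a1 += eta * np.sum(resid * x)  # negative gradient w.r.t. slope
```

On well-scaled data like this, both approaches converge to the same line; with unscaled features the step size would need to be much smaller, which is part of why the course compares the two approaches.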
Week 2

Multiple Regression

4 hours to complete
19 videos (Total 87 min), 5 readings, 3 quizzes
19 videos
Polynomial regression (3m)
Modeling seasonality (8m)
Where we see seasonality (3m)
Regression with general features of 1 input (2m)
Motivating the use of multiple inputs (4m)
Defining notation (3m)
Regression with features of multiple inputs (3m)
Interpreting the multiple regression fit (7m)
Rewriting the single observation model in vector notation (6m)
Rewriting the model for all observations in matrix notation (4m)
Computing the cost of a D-dimensional curve (9m)
Computing the gradient of RSS (3m)
Approach 1: closed-form solution (3m)
Discussing the closed-form solution (4m)
Approach 2: gradient descent (2m)
Feature-by-feature update (9m)
Algorithmic summary of gradient descent approach (4m)
A brief recap (1m)
5 readings
Slides presented in this module (10m)
Optional reading: review of matrix algebra (10m)
Exploring different multiple regression models for house price prediction (10m)
Numpy tutorial (10m)
Implementing gradient descent for multiple regression (10m)
3 practice exercises
Multiple Regression (30m)
Exploring different multiple regression models for house price prediction (30m)
Implementing gradient descent for multiple regression (30m)
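In the matrix notation this week introduces, both solution approaches take only a few lines. The sketch below uses a made-up design matrix H (columns: constant, square footage in 1000s, bedrooms) and made-up responses, purely for illustration.

```python
import numpy as np

# Toy design matrix H: columns are [constant, sqft in 1000s, bedrooms].
H = np.array([[1.0, 1.0, 2.0],
              [1.0, 1.5, 3.0],
              [1.0, 2.0, 3.0],
              [1.0, 2.5, 4.0],
              [1.0, 3.0, 4.0]])
y = np.array([200.0, 285.0, 365.0, 455.0, 530.0])

# Approach 1: closed-form solution of the normal equations,
# w = (H^T H)^{-1} H^T y, solved rather than inverting explicitly.
w_closed = np.linalg.solve(H.T @ H, H.T @ y)

# Approach 2: gradient descent. The gradient of RSS(w) = ||y - Hw||^2
# is -2 H^T (y - Hw), so each step adds a multiple of H^T (y - Hw).
w = np.zeros(3)
eta = 0.01
for _ in range(100000):
    w += eta * (H.T @ (y - H @ w))
```

The closed-form route costs a matrix solve in the number of features; gradient descent avoids that solve, which is why the course discusses both and when each is preferable.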
Week 3

Assessing Performance

3 hours to complete
14 videos (Total 93 min), 2 readings, 2 quizzes
14 videos
What do we mean by "loss"? (4m)
Training error: assessing loss on the training set (7m)
Generalization error: what we really want (8m)
Test error: what we can actually compute (4m)
Defining overfitting (2m)
Training/test split (1m)
Irreducible error and bias (6m)
Variance and the bias-variance tradeoff (6m)
Error vs. amount of data (6m)
Formally defining the 3 sources of error (14m)
Formally deriving why 3 sources of error (20m)
Training/validation/test split for model selection, fitting, and assessment (7m)
A brief recap (1m)
2 readings
Slides presented in this module (10m)
Polynomial Regression (10m)
2 practice exercises
Assessing Performance (30m)
Exploring the bias-variance tradeoff (30m)
Week 4

Ridge Regression

4 hours to complete
16 videos (Total 85 min), 5 readings, 3 quizzes
16 videos
Overfitting demo (7m)
Overfitting for more general multiple regression models (3m)
Balancing fit and magnitude of coefficients (7m)
The resulting ridge objective and its extreme solutions (5m)
How ridge regression balances bias and variance (1m)
Ridge regression demo (9m)
The ridge coefficient path (4m)
Computing the gradient of the ridge objective (5m)
Approach 1: closed-form solution (6m)
Discussing the closed-form solution (5m)
Approach 2: gradient descent (9m)
Selecting tuning parameters via cross validation (3m)
K-fold cross validation (5m)
How to handle the intercept (6m)
A brief recap (1m)
5 readings
Slides presented in this module (10m)
Download the notebook and follow along (10m)
Download the notebook and follow along (10m)
Observing effects of L2 penalty in polynomial regression (10m)
Implementing ridge regression via gradient descent (10m)
3 practice exercises
Ridge Regression (30m)
Observing effects of L2 penalty in polynomial regression (30m)
Implementing ridge regression via gradient descent (30m)
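Both solution approaches for the ridge objective, RSS(w) + l2·||w||², can be sketched as below. The feature matrix and responses are hypothetical, and for simplicity there is no intercept column (the course covers handling the intercept separately).

```python
import numpy as np

# Toy feature matrix (no intercept column, for simplicity) and
# hypothetical responses.
H = np.array([[1.0, 2.0],
              [1.5, 3.0],
              [2.0, 3.0],
              [2.5, 4.0],
              [3.0, 4.0]])
y = np.array([200.0, 285.0, 365.0, 455.0, 530.0])

def ridge_closed_form(H, y, l2):
    # Approach 1: closed-form solution w = (H^T H + l2 * I)^{-1} H^T y.
    return np.linalg.solve(H.T @ H + l2 * np.eye(H.shape[1]), H.T @ y)

def ridge_gradient_descent(H, y, l2, eta=0.01, iters=20000):
    # Approach 2: step against the ridge gradient, which adds a
    # shrinkage term l2 * w to the plain RSS gradient.
    w = np.zeros(H.shape[1])
    for _ in range(iters):
        w += eta * (H.T @ (y - H @ w) - l2 * w)
    return w

w_ls = ridge_closed_form(H, y, 0.0)    # l2 = 0 recovers least squares
w_reg = ridge_closed_form(H, y, 10.0)  # a heavier L2 penalty shrinks w
```

Setting l2 = 0 recovers plain least squares, and increasing l2 traces out the coefficient path toward zero, which is the behavior the "ridge coefficient path" lecture visualizes; in practice l2 is chosen by cross validation as covered this week.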

Reviews

TOP REVIEWS FROM MACHINE LEARNING: REGRESSION


About the Machine Learning Specialization

Machine Learning
