
Learner Reviews & Feedback for Machine Learning: Classification by University of Washington

3,688 ratings

About the Course

Case Studies: Analyzing Sentiment & Loan Default Prediction

In our case study on analyzing sentiment, you will create models that predict a class (positive/negative sentiment) from input features (text of the reviews, user profile information, ...). In our second case study for this course, loan default prediction, you will tackle financial data and predict when a loan is likely to be risky or safe for the bank. These tasks are examples of classification, one of the most widely used areas of machine learning, with a broad array of applications including ad targeting, spam detection, medical diagnosis, and image classification.

In this course, you will create classifiers that provide state-of-the-art performance on a variety of tasks. You will become familiar with the most successful techniques, which are the most widely used in practice, including logistic regression, decision trees, and boosting. In addition, you will be able to design and implement the underlying algorithms that can learn these models at scale, using stochastic gradient ascent. You will implement these techniques on real-world, large-scale machine learning tasks. You will also address significant tasks you will face in real-world applications of ML, including handling missing data and measuring precision and recall to evaluate a classifier. This course is hands-on, action-packed, and full of visualizations and illustrations of how these techniques will behave on real data. We've also included optional content in every module, covering advanced topics for those who want to go even deeper!

Learning Objectives: By the end of this course, you will be able to:

-Describe the input and output of a classification model.

-Tackle both binary and multiclass classification problems.

-Implement a logistic regression model for large-scale classification.

-Create a non-linear model using decision trees.

-Improve the performance of any model using boosting.

-Scale your methods with stochastic gradient ascent.

-Describe the underlying decision boundaries.

-Build a classification model to predict sentiment in a product review dataset.

-Analyze financial data to predict loan defaults.

-Use techniques for handling missing data.

-Evaluate your models using precision-recall metrics.

-Implement these techniques in Python (or in the language of your choice, though Python is highly recommended).
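The objectives above center on two concrete skills: training a logistic regression classifier with stochastic gradient ascent, and evaluating it with precision and recall. A minimal sketch of both, assuming a made-up toy "sentiment" dataset (this is not the course's own assignment code, which uses different data and tooling):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_sga(X, y, lr=0.1, epochs=50, seed=0):
    """Logistic regression by stochastic gradient ASCENT on the
    log-likelihood: for each example, w += lr * x_i * (y_i - P(y=1|x_i))."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):  # one example at a time
            w += lr * X[i] * (y[i] - sigmoid(X[i] @ w))
    return w

def precision_recall(y_true, y_pred):
    """Precision = TP/(TP+FP); recall = TP/(TP+FN)."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Toy data: an intercept column plus one feature that mostly determines the label.
rng = np.random.default_rng(42)
X = np.hstack([np.ones((200, 1)), rng.normal(0.0, 1.0, (200, 1))])
y = (X[:, 1] + rng.normal(0.0, 0.3, 200) > 0).astype(int)

w = fit_logistic_sga(X, y)
pred = (sigmoid(X @ w) >= 0.5).astype(int)
p, r = precision_recall(y, pred)
```

The per-example update is what makes this "stochastic": rather than averaging the gradient over the whole dataset, each pass touches one example at a time, which is what lets the method scale to large datasets.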

Top reviews


Jun 14, 2020

A very deep and comprehensive course for learning some of the core fundamentals of Machine Learning. Can get a bit frustrating at times because of numerous assignments :P but a fun thing overall :)


Oct 15, 2016

Hats off to the team who put the course together! Prof Guestrin is a great teacher. The course gave me in-depth knowledge regarding classification and the math and intuition behind it. It was fun!


401 - 425 of 579 Reviews for Machine Learning: Classification


Aug 12, 2016

The good:

-Good examples to learn the concepts

-Good organization of the material

-The assignments were well explained and easy to follow

-The good humor and attitude of the professor makes the lectures very engaging

-All video lectures are short, which makes them easy to digest and follow (the optional videos were long compared with the rest of the lectures, but the material covered in those was pretty advanced and their length is justifiable)

Things that can be improved:

-In some of the videos the professor seemed to cruise through some of the concepts. I understand that it is recommended to take the series of courses in a certain order, but sometimes I felt we were rushing through the material covered

-I may be nitpicking here but I wish the professor used a different color to write on the slides (the red he used clashed horribly with some of the slides' backgrounds and made it difficult to read his observations)

Overall, a good course to take and very easy to follow if taken together with the other courses in the series.

By Hanif S

Jun 2, 2016

Highly recommended course, looking under the hood to examine how popular ML algorithms like decision trees and boosting are actually implemented. I'm surprised at how intuitive the idea of boosting really is. Also interesting that random forests are dismissed as not as powerful as boosting, but I would love to know why! Both methods appear to expose more data to the learner, and a heuristic comparison between RF and boosting would have been greatly appreciated.

One can immediately notice the difference between statistician Emily, who took us through the mathematical derivation of the derivative (ha.ha.) function for linear regression (much appreciated Emily!), and computer scientist Carlos, who skipped this bit for logistic regression but provided lots of verbose code to track the running of algorithms during assignments (helps to see what is actually happening under the hood). Excellent lecturers both, thank you!

By Amilkar A H M

Nov 27, 2017

It's a great course, but the programming assignments are a little too guided. That is good, to some extent, as it allows you to focus on the concepts, but at the same time, it leaves little room for actually practicing your coding skills. I know they said from the beginning that this course was not focused on the implementation of the algorithms; however, how are you going to be able to use what you've learned without knowing how to implement the algorithms on your own?

When it comes to coding, nothing replaces implementing the algorithms yourself. That is my only complaint. Other than that, it's great. I loved it. The concepts were well explained and they covered a lot of material. I wish they had spent more time on certain topics, but I guess this is just an introduction. Anyway, by all means take this course if you have some programming experience and have little to no machine learning knowledge.

By Daniel C

Apr 24, 2016

This series is taught by Emily and Carlos. Course 2 was Emily and this course 3 is Carlos. Carlos takes a more practical approach by showing how things are related using pictures, trial and error, what happens when we do this vs. that. Emily on the other hand dives down into the math and actual facts. I feel Emily is more difficult overall - but once I got through it, I had a better foundation and intuition as to how things work and better overall understanding. So - giving this class 4 stars as compared to Emily's class that is 5 stars. I feel if they would mix it with Emily doing the math immediately followed by Carlos explanations it would be best. Finally - I don't feel this course on classification had as much content. We could've done more.

By Jaiyam S

Apr 24, 2016

Thank you Prof. Carlos for this amazing course. You covered the topics in a very easy to understand way and the course was full of cool applications and humor! The only downside that I felt was that the programming assignments sometimes felt too easy. Even as a complete Python novice (I started learning Python with the first course), I felt the programming assignments could have been made more interesting. But in the larger scheme of things it doesn't matter because the course was really well taught and easy to understand. I'm really looking forward to the next course! :)

By Lech G

Apr 26, 2016

Not as good as the Regression Course, but still very good.

While I appreciate Prof. Guestrin's enthusiasm, I missed a little of the rigor and mathematical depth of the Regression course by Prof. Fox.

I learned a lot, but I feel that regression clicked with me a little better than classification.

But that's probably me.

In any case, the whole series is awesome so far, better, in my opinion, than Andrew Ng's ML course on Coursera.

A small suggestion would be to switch the main toolset from GraphLab to something more common, like scikit-learn and pandas.
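For readers wondering what that switch would look like: the boosted decision trees the course builds by hand map almost directly onto scikit-learn's public API. A rough sketch, using a synthetic dataset as a stand-in for the course's loan-default data (the course itself uses GraphLab Create, not this code):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the loan-default data: binary labels,
# a handful of informative features.
X, y = make_classification(n_samples=500, n_features=10, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Boosted decision trees: the same model family covered in the course.
model = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                   random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

# Precision-recall evaluation, as taught in the course's evaluation module.
precision = precision_score(y_te, pred)
recall = recall_score(y_te, pred)
```

The trade-off the reviewer is pointing at: GraphLab Create was free for the course but proprietary, whereas scikit-learn and pandas are the open-source tools learners are more likely to use afterward.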