One of the most useful areas in machine learning is discovering hidden patterns from unlabeled data. Add the fundamentals of this in-demand skill to your Data Science toolkit. In this course, we will learn selected unsupervised learning methods for dimensionality reduction, clustering, and learning latent features. We will also focus on real-world applications such as recommender systems with hands-on examples of product recommendation algorithms.


Unsupervised Algorithms in Machine Learning
This course is part of Machine Learning: Theory and Hands-on Practice with Python Specialization

Instructor: Geena Kim
1,679 already enrolled
What you'll learn
Explain what unsupervised learning is, and list methods used in unsupervised learning.
List and explain algorithms for various matrix factorization methods, and what each is used for.
Skills you'll gain
- Dimensionality Reduction
- Unsupervised Learning
- Cluster Analysis
- Recommender Systems
- Matrix Factorization
Details to know

Add to your LinkedIn profile
2 quizzes, 4 assessments
Build your subject-matter expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate

There are 4 modules in this course
Now that you have a solid foundation in Supervised Learning, we shift our attention to uncovering the hidden structure in unlabeled data. We start with an introduction to Unsupervised Learning. In this course, the models no longer have labels to learn from; they need to make sense of the data from the observations themselves. This week we dive into Principal Component Analysis (PCA), a foundational dimensionality reduction technique. There is undoubtedly some math in this section, but PCA can be grasped conceptually more readily than you might expect. In the Supervised Learning course, we struggled with the Curse of Dimensionality; this week, we will see how PCA can reduce the number of dimensions and improve classification/regression tasks. You will have reading, a quiz, and a Jupyter notebook lab/Peer Review in which you implement the PCA algorithm.
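To make the idea concrete, here is a minimal sketch of PCA using scikit-learn. The digits dataset and the two-component setting are illustrative assumptions, not the course data, and the lab has you implement the algorithm yourself rather than call a library.

```python
# Minimal PCA sketch with scikit-learn (illustrative only; the course lab
# implements PCA from scratch, and this dataset/setting is an assumption).
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, y = load_digits(return_X_y=True)   # 64-dimensional digit images
pca = PCA(n_components=2)             # project onto the top 2 principal components
X_2d = pca.fit_transform(X)           # fit the components, then transform the data

print(X.shape, "->", X_2d.shape)      # (1797, 64) -> (1797, 2)
print("explained variance ratio:", pca.explained_variance_ratio_)
```

The explained variance ratio is a quick check on how much structure survives the projection; in practice you choose the number of components by looking at how quickly it accumulates.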
What's included
3 videos, 9 readings, 2 quizzes
This week, we work with clustering, one of the most popular unsupervised learning methods. Last week, we used PCA to find a low-dimensional representation of the data; clustering, on the other hand, finds subgroups among the observations. We can use it to build meaningful intuition about the structure of the data, or in a procedure like cluster-then-predict. Clustering has many applications, ranging from customer segmentation in marketing and advertising, to identifying similar movies and music, to genomics research and the discovery of disease subtypes. We will focus mainly on K-means clustering and hierarchical clustering, weighing the benefits and disadvantages of each and the choice of metrics such as distance and linkage. We have reading, a quiz, and a Jupyter notebook lab/Peer Review this week.
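As a rough illustration (not the course lab), the sketch below runs K-means and Ward-linkage hierarchical clustering from scikit-learn on synthetic blobs; the data and the choice of three clusters are assumptions made for the example.

```python
# Minimal clustering sketch: K-means vs. hierarchical (Ward linkage).
# The synthetic blobs and k=3 are illustrative assumptions, not course data.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

kmeans_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
hier_labels = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(X)

print(kmeans_labels[:10])   # cluster assignment for the first 10 points
print(hier_labels[:10])
```

Changing the linkage (e.g. "ward" vs. "complete" vs. "average") or the distance metric can change the resulting subgroups noticeably, which is exactly the trade-off discussed this week.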
What's included
2 videos, 2 readings
This week we work with Recommender Systems. Websites like Netflix, Amazon, and YouTube surface personalized recommendations for movies, items, or videos. We explore the strategies recommendation engines use to predict what users will like, considering popularity-based, content-based, and collaborative filtering approaches and the similarity metrics they rely on. Recommender systems also come with practical challenges, such as the time complexity of the computations and sparse data. This week is relatively math dense. You will have a quiz in which you work through different similarity metric calculations. Give yourself time for this week's Jupyter notebook lab and consider performant implementations. The Peer Review section this week is short.
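For a flavor of the similarity calculations, here is a small hand-rolled sketch of user-user cosine similarity on a made-up ratings matrix; the data and the choice of metric are illustrative, not taken from the course.

```python
# Tiny neighborhood-style similarity sketch (made-up ratings, for illustration).
import numpy as np

# rows = users, columns = items; 0 means "not rated"
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 1, 0],
                    [1, 0, 5, 4]], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

# similarity of user 0 to every user (including itself)
sims = [cosine_sim(ratings[0], ratings[u]) for u in range(len(ratings))]
print(np.round(sims, 3))   # user 0 is most similar to user 1
```

On realistic data the ratings matrix is large and mostly zeros, which is why sparse representations and the cost of computing all pairwise similarities become the practical concerns mentioned above.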
What's included
4 videos, 1 reading
We are already at the last week of course material! Get ready for another math-dense week. Last week, we learned about Recommendation Systems and used a neighborhood method of collaborative filtering built on similarity measures. Latent factor models, including the popular Matrix Factorization (MF), can also be used for collaborative filtering. A 1999 publication in Nature made Non-negative Matrix Factorization (NMF) extremely popular. MF has many applications, including image analysis, text mining/topic modeling, recommender systems, audio signal separation, analytical chemistry, and gene expression analysis. This week, we focus on Singular Value Decomposition (SVD), Non-negative Matrix Factorization, and approximation methods. We have reading, a quiz, and a Kaggle mini-project that uses matrix factorization to categorize news articles.
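As a hedged sketch of the idea behind the mini-project, the snippet below factors a tiny TF-IDF matrix with scikit-learn's NMF to recover two "topics"; the toy documents and the two-topic setting are assumptions, not the course dataset.

```python
# Minimal NMF topic-modeling sketch (toy documents; the mini-project uses
# a real news-article dataset, so treat this purely as an illustration).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = ["the team won the football match",
        "the election results were announced today",
        "the striker scored two goals",
        "voters went to the polls for the election"]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)                 # documents x terms matrix

nmf = NMF(n_components=2, random_state=0)     # factor X ~ W @ H, W and H non-negative
W = nmf.fit_transform(X)                      # document-topic weights
H = nmf.components_                           # topic-term weights

terms = tfidf.get_feature_names_out()
for k, topic in enumerate(H):
    top_terms = [terms[i] for i in topic.argsort()[-3:]]
    print(f"topic {k}:", top_terms)
```

The non-negativity constraint is what makes the factors readable as additive "parts" (topics built from words, here), which is the property the 1999 Nature paper highlighted.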
What's included
5 videos, 1 reading
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV
Share it on social media and in your performance review

Get a head start on your degree
This course is part of the following online degree programs offered by the University of Colorado Boulder. If you apply and are accepted, your coursework can count toward your degree, and all of your progress will transfer with you.
Frequently asked questions
Access to lectures and assignments depends on your type of enrollment. If you take a course in audit mode, you will be able to see most course materials for free. To access graded assignments and to earn a Certificate, you will need to purchase the Certificate experience, during or after your audit. If you don't see the audit option:
- The course may not offer an audit option. You can try a Free Trial instead, or apply for Financial Aid.
- The course may offer 'Full Course, No Certificate' instead. This option lets you see all course materials, submit required assessments, and get a final grade. This also means that you will not be able to purchase a Certificate experience.
When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.
Yes. In select learning programs, you can apply for financial aid or a scholarship if you can’t afford the enrollment fee. If financial aid or a scholarship is available for your learning program selection, you’ll find a link to apply on the description page.