Logistic Regression with NumPy and Python

4.5 stars (340 ratings)
Offered by Coursera Project Network
10,108 already enrolled
In this Guided Project, you will:

Implement the gradient descent algorithm from scratch

Perform logistic regression with NumPy and Python

Create data visualizations with Matplotlib and Seaborn

1.5 hours
Beginner
No download needed
Split-screen video
English
Desktop only

Welcome to this project-based course on Logistic Regression with NumPy and Python. In this project, you will do all the machine learning without using any of the popular machine learning libraries such as scikit-learn and statsmodels. The aim of this project is to implement all of the machinery of the learning algorithm yourself, including gradient descent, the cost function, and logistic regression, so that you gain a deeper understanding of the fundamentals. By the time you complete this project, you will be able to build a logistic regression model using Python and NumPy, conduct basic exploratory data analysis, and implement gradient descent from scratch. The prerequisites for this project are prior programming experience in Python and a basic understanding of machine learning theory.

This course runs on Coursera's hands-on project platform called Rhyme. On Rhyme, you do projects in a hands-on manner in your browser. You will get instant access to pre-configured cloud desktops containing all of the software and data you need for the project. Everything is already set up directly in your internet browser, so you can just focus on learning. For this project, you'll get instant access to a cloud desktop with Python, Jupyter, NumPy, and Seaborn pre-installed.
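
As a preview of the kind of code you will write from scratch, here is a minimal NumPy sketch of the logistic sigmoid and the cross-entropy cost 𝐽(𝜃) with its gradient. The function and variable names are illustrative and not necessarily the ones used in the project notebook.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid: sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_gradient(theta, X, y):
    """Cross-entropy cost J(theta) and its gradient for logistic regression.

    X is an (m, n) design matrix (with a leading column of ones for the
    intercept), y is an (m,) vector of 0/1 labels, theta is an (n,) vector.
    """
    m = len(y)
    h = sigmoid(X @ theta)                   # predicted probabilities
    cost = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m                 # gradient of J with respect to theta
    return cost, grad
```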

Skills you will develop

Data Science, Machine Learning, Python Programming, Classification, NumPy

Learn step-by-step

In a video that plays in a split-screen with your work area, your instructor will walk you through these steps:

  1. Introduction and Project Overview

  2. Load the Data and Import Libraries

  3. Visualize the Data

  4. Define the Logistic Sigmoid Function 𝜎(𝑧)

  5. Compute the Cost Function 𝐽(𝜃) and Gradient

  6. Cost and Gradient at Initialization

  7. Implement Gradient Descent (see the sketch after this list)

  8. Plotting the Convergence of 𝐽(𝜃)

  9. Plotting the Decision Boundary

  10. Predictions Using the Optimized 𝜃 Values
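
Steps 7 and 10 come together in a short training loop. The sketch below shows one plain-NumPy way to run batch gradient descent and then make 0/1 predictions with the optimized 𝜃; the learning rate, iteration count, and synthetic dataset are illustrative only, not the project's actual data or parameter choices.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, as defined in step 4."""
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, theta, learning_rate=0.1, n_iters=1000):
    """Batch gradient descent on the logistic-regression cost J(theta)."""
    m = len(y)
    costs = []
    for _ in range(n_iters):
        h = sigmoid(X @ theta)                                # current predictions
        grad = X.T @ (h - y) / m                              # gradient of J(theta)
        theta = theta - learning_rate * grad                  # update step
        costs.append(-(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m)
    return theta, costs                                       # costs can be plotted to check convergence

def predict(theta, X, threshold=0.5):
    """Label an example 1 when sigmoid(x . theta) >= threshold."""
    return (sigmoid(X @ theta) >= threshold).astype(int)

# Illustrative synthetic data (not the project's dataset).
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(100, 2))
y = (X_raw[:, 0] + X_raw[:, 1] + rng.normal(scale=0.5, size=100) > 0).astype(int)
X = np.hstack([np.ones((100, 1)), X_raw])                     # prepend intercept column
theta, costs = gradient_descent(X, y, np.zeros(X.shape[1]))
accuracy = (predict(theta, X) == y).mean()
print(f"Training accuracy: {accuracy:.2f}")
```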

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required

In a split-screen video, your instructor guides you step-by-step


Frequently asked questions

More questions? Visit the Learner Help Center.