Analysing Unstructured Data using MongoDB and PySpark

Offered By
Coursera Project Network
In this Guided Project, you will:

Learn how to connect a MongoDB database with PySpark

Learn how to analyse an unstructured dataset stored in MongoDB

Learn how to write Spark DataFrames to CSV or MongoDB

1.5 hours
Beginner
No download needed
Split-screen video
English
Desktop only

By the end of this project, you will learn how to analyze unstructured data stored in MongoDB using PySpark. We will be using an open source dataset containing information on movies released around the world. I will teach you how to connect a MongoDB database with PySpark, how to analyze an unstructured dataset stored in MongoDB, and how to write the analysis results to a CSV file or back to MongoDB. I will also teach you how to access inner (or nested) documents and how to run SQL queries on a MongoDB collection. You will create a ready-to-use Jupyter notebook for conducting analyses on MongoDB collections using PySpark. After completing the project, you will receive a Zip file containing links to other open source datasets for additional practice!

MongoDB is one of the most commonly used databases for storing unstructured datasets. As a dataset grows, it becomes more practical to use Spark's analytical engine, whose capabilities range from basic descriptive statistics to the machine learning and deep learning tools in Spark's extensive library. This is a beginner-level course where we will cover the basics of MongoDB and PySpark.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
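To give a sense of what the connection step looks like in practice, here is a minimal PySpark sketch that loads a MongoDB collection into a Spark DataFrame. It assumes the MongoDB Spark Connector 3.x and a local MongoDB instance; the database and collection names ("movies" and "movie_collection") are placeholders, not the course's actual dataset.

# Minimal sketch: connect PySpark to MongoDB and load a collection as a DataFrame.
# Assumes the MongoDB Spark Connector 3.x and a local MongoDB instance;
# "movies" and "movie_collection" are placeholder names.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-movies")
    # Connector coordinates depend on your Spark/Scala versions; this is one common choice.
    .config("spark.jars.packages", "org.mongodb.spark:mongo-spark-connector_2.12:3.0.1")
    .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/movies.movie_collection")
    .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/movies.movie_collection")
    .getOrCreate()
)

# Read the collection; the connector infers a schema by sampling documents.
df = spark.read.format("mongo").load()
df.printSchema()
df.show(5)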

Skills you will develop

Unstructured Data, Big Data, MongoDB, PySpark

Learn step-by-step

In a video that plays in a split-screen with your work area, your instructor will walk you through these steps:

  1. Upload data to MongoDB Database

  2. Connect to MongoDB using PySpark

  3. Analyse MongoDB collection and access nested documents

  4. Write Spark DataFrame to CSV

  5. Run SQL query on MongoDB collection

  6. Write Spark DataFrame to MongoDB (steps 3 through 6 are illustrated in the sketch after this list)
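As a rough illustration of steps 3 through 6, the sketch below continues from the DataFrame loaded in the earlier example. The field names (title, year, awards.wins) are assumptions made for illustration and may not match the course dataset exactly.

from pyspark.sql import functions as F

# Access a nested (inner) document with dot notation and flatten it into columns.
flat = df.select("title", "year", F.col("awards.wins").alias("award_wins"))

# Run a SQL query on the collection by registering a temporary view.
flat.createOrReplaceTempView("movies")
top_movies = spark.sql(
    "SELECT title, year, award_wins FROM movies "
    "WHERE award_wins > 5 ORDER BY award_wins DESC"
)

# Write the results to CSV (Spark writes a folder of part files).
top_movies.write.mode("overwrite").option("header", "true").csv("top_movies_csv")

# Write the results back to MongoDB, into a separate collection.
(top_movies.write.format("mongo").mode("append")
    .option("database", "movies")
    .option("collection", "top_movies")
    .save())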

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required

In a split-screen video, your instructor guides you step-by-step

Frequently Asked Questions

More questions? Visit the Learner Help Center.