Data Analysis Using Pyspark

4.4 stars (94 ratings)
Offered by Coursera Project Network
4,301 already enrolled
In this Free Guided Project, you will:

Learn how to set up Google Colab for distributed data processing

Learn how to apply different queries to your dataset to extract useful information

Learn how to visualize this information using matplotlib

Showcase this hands-on experience in an interview

1.5 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

One of the important topics every data analyst should be familiar with is distributed data processing. As a data analyst, you should be able to apply different queries to your dataset to extract useful information from it. But what if your data is so big that working with it on your local machine is not practical? That is when distributed data processing and Spark technology come in handy. In this project, we are going to work with the pyspark module in Python, using the Google Colab environment, to apply queries to a dataset from the last.fm website, an online music service where users can listen to different songs. The dataset consists of two CSV files, listening.csv and genre.csv. We will also learn how to visualize our query results using matplotlib.
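As a rough illustration of the workflow described above, here is a minimal sketch of setting up PySpark in a Colab notebook and querying the first CSV file. Only the file name listening.csv comes from the project description; the column names (such as user_id) and the example query are assumptions made purely for illustration.

```python
# Minimal sketch, not the instructor's exact code.
# In a Colab cell you would first install PySpark, e.g.:
#   !pip install pyspark

from pyspark.sql import SparkSession

# Create a local Spark session for distributed-style processing in Colab.
spark = SparkSession.builder.appName("lastfm-analysis").getOrCreate()

# Load the first (larger) CSV file into a Spark DataFrame.
# The path is hypothetical; in the project the file is read from Google Drive.
listening = (
    spark.read
    .option("header", True)       # first row contains column names
    .option("inferSchema", True)  # let Spark guess column types
    .csv("listening.csv")
)

# An example exploratory query: assuming a "user_id" column exists,
# count how many listening events each user has.
(
    listening
    .groupBy("user_id")
    .count()
    .orderBy("count", ascending=False)
    .show(10)
)
```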

Requirements

Learners should be familiar with the Python programming language and Spark technology, and have some experience working with the Google Colab environment.

Skills you will develop

Google Colab, Data Analysis, Python Programming, PySpark SQL

Learn step-by-step

In a video that plays in a split-screen with your work area, your instructor will walk you through these steps:

  1. Prepare Google Colab for distributed data processing

  2. Mount our Google Drive into the Google Colab environment

  3. Import the first file of our dataset (1 GB) into a pySpark dataframe

  4. Apply some queries to extract useful information out of our data

  5. Import the second file of our dataset (3 MB) into a pySpark dataframe

  6. Join the two dataframes and prepare them for more advanced queries

  7. Learn to visualize our query results using matplotlib (a minimal sketch of steps 2 and 5 through 7 follows this list)
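The sketch below illustrates steps 2 and 5 through 7 under stated assumptions: it reuses the spark session and the listening DataFrame from the earlier sketch, the Drive path for genre.csv is hypothetical, and the join key "track" and the "genre" column are guesses made for illustration only.

```python
# Minimal sketch, not the instructor's exact code.
from google.colab import drive
import matplotlib.pyplot as plt

# Step 2: mount Google Drive so the CSV files are reachable from Colab.
drive.mount("/content/drive")

# Step 5: load the second, smaller CSV file into its own DataFrame.
genre = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/content/drive/MyDrive/genre.csv")   # hypothetical path
)

# Step 6: join the two DataFrames on a shared column (assumed to be "track").
joined = listening.join(genre, on="track", how="inner")

# Step 7: run a query and visualize the result with matplotlib.
top_genres = (
    joined
    .groupBy("genre")
    .count()
    .orderBy("count", ascending=False)
    .limit(10)
    .toPandas()          # collect a small result locally for plotting
)

plt.bar(top_genres["genre"], top_genres["count"])
plt.xticks(rotation=45, ha="right")
plt.xlabel("Genre")
plt.ylabel("Number of listens")
plt.title("Top 10 genres by listen count")
plt.tight_layout()
plt.show()
```

Converting only the small, aggregated query result to pandas with toPandas() keeps the heavy lifting in Spark while letting matplotlib handle the plotting.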

How Guided Projects work

Your workspace is a cloud desktop right in your browser, no download required

In a split-screen video, your instructor guides you step-by-step


Frequently Asked Questions

More questions? Visit the Learner Help Center.