Prepare for a career in the high-growth field of data engineering. In this program, you’ll learn in-demand skills like Python, SQL, and databases to get job-ready in less than 5 months.
Data engineering is the practice of building systems that gather raw data and process and organize it into usable information. Data engineers provide the foundational information that data scientists and business intelligence analysts use to make decisions.
This program will teach you the foundational data engineering skills employers are seeking for entry-level data engineering roles, including Python, one of the most widely used programming languages. You’ll also master SQL, RDBMS, ETL, data warehousing, NoSQL, big data, and Spark through hands-on labs and projects.
You’ll learn to use the Python programming language and Linux/UNIX shell scripts to extract, transform, and load (ETL) data. You’ll work with relational database management systems (RDBMS), query data using SQL statements, and use NoSQL databases to handle unstructured data. You’ll also learn how generative AI tools and techniques are used in data engineering.
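The ETL pattern described above can be sketched in a few lines of Python. This is a minimal illustration only, not course material: the `extract`, `transform`, and `load` function names, the sample CSV data, and the `staff` table are all hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
# Minimal ETL sketch using only the Python standard library.
# All names (extract/transform/load, the staff table) are illustrative.
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: clean up names and convert salary strings to integers."""
    return [(r["name"].strip().title(), int(r["salary"])) for r in rows]

def load(records, conn):
    """Load: insert the transformed records into a staff table."""
    conn.execute("CREATE TABLE IF NOT EXISTS staff (name TEXT, salary INTEGER)")
    conn.executemany("INSERT INTO staff VALUES (?, ?)", records)
    conn.commit()

raw = "name,salary\n alice ,50000\n BOB ,60000\n"
conn = sqlite3.connect(":memory:")  # stand-in for a real target database
load(transform(extract(raw)), conn)
print(conn.execute("SELECT name, salary FROM staff ORDER BY name").fetchall())
# → [('Alice', 50000), ('Bob', 60000)]
```

In a real pipeline, the extract step would read from files, APIs, or source databases, and the load step would write to a production warehouse, but the three-stage structure stays the same.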
Upon completion, you’ll have a portfolio of projects and a Professional Certificate from IBM to showcase your expertise. You’ll also earn an IBM Digital badge and will gain access to career resources to help you in your job search, including mock interviews and resume support.
This program is ACE® recommended—when you complete the program, you can earn up to 12 college credits.
Applied Learning Project
Throughout this Professional Certificate, you will complete hands-on labs and projects to help you gain practical experience with Python, SQL, relational databases, NoSQL databases, Apache Spark, building data pipelines, managing databases, and working with data warehouses.
Projects:
Design a relational database to help a coffee franchise improve operations.
Use SQL to query census, crime, and school demographic data sets.
Write a Bash shell script on Linux that backs up changed files.
Set up, test, and optimize a data platform that contains MySQL, PostgreSQL, and IBM Db2 databases.
Analyze road traffic data to perform ETL and create a pipeline using Airflow and Kafka.
Design and implement a data warehouse for a solid-waste management company.
Move, query, and analyze data in MongoDB, Cassandra, and Cloudant NoSQL databases.
Train a machine learning model by creating an Apache Spark application.
Design, deploy, and manage an end-to-end data engineering platform.