About this Course

177,413 recent views
Shareable Certificate
Earn a Certificate upon completion
100% online
Start instantly and learn on your own schedule.
Flexible deadlines
Reset deadlines in accordance with your schedule.
Intermediate Level
Approx. 25 hours to complete
English

Skills you will gain

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Offered by

deeplearning.ai

Syllabus - What you will learn from this course

Week 1

Neural Machine Translation

6 hours to complete
8 videos (Total 46 min), 5 readings, 1 quiz
8 videos
Seq2seq (4m)
Alignment (4m)
Attention (6m)
Setup for Machine Translation (3m)
Training an NMT with Attention (6m)
Evaluation for Machine Translation (8m)
Sampling and Decoding (9m)
5 readings
Connect with your mentors and fellow learners on Slack! (10m)
(Optional) The Real Meaning of Ich Bin ein Berliner (10m)
(Optional) What is Teacher Forcing? (10m)
Content Resource (10m)
How to Refresh your Workspace (10m)
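Week 1's "Sampling and Decoding" video contrasts greedy decoding with temperature-based sampling from a model's output logits. A minimal NumPy sketch of both decoding steps (function names are illustrative, not from the course materials):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Scale logits by temperature: low T sharpens, high T flattens the distribution.
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def greedy_decode_step(logits):
    # Greedy decoding always picks the single most likely token.
    return int(np.argmax(logits))

def sample_decode_step(logits, temperature=1.0, rng=None):
    # Random sampling draws a token from the temperature-scaled distribution,
    # trading determinism for diversity in the generated output.
    rng = rng or np.random.default_rng(0)
    p = softmax(logits, temperature)
    return int(rng.choice(len(p), p=p))
```

Greedy decoding is deterministic but can produce repetitive output; sampling with a moderate temperature is a common middle ground.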
Week 2

Text Summarization

6 hours to complete
7 videos (Total 43 min), 1 reading, 1 quiz
7 videos
Transformer Applications (8m)
Dot-Product Attention (7m)
Causal Attention (4m)
Multi-head Attention (6m)
Transformer Decoder (5m)
Transformer Summarizer (4m)
1 reading
Content Resource (10m)
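Week 2 covers dot-product attention and the causal variant used in decoder-only summarizers. A minimal sketch of scaled dot-product attention with an optional causal mask, written against the standard formula softmax(QKᵀ/√d_k)V (single head, no batch dimension, for illustration only):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, causal=False):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    if causal:
        # Causal mask: position i may not attend to future positions j > i,
        # so upper-triangular scores are pushed to -inf before the softmax.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    # Row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

Multi-head attention, also covered this week, runs several such attention computations in parallel over learned projections of Q, K, and V.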
Week 3

Question Answering

7 hours to complete
10 videos (Total 45 min), 1 reading, 1 quiz
10 videos
Transfer Learning in NLP (7m)
ELMo, GPT, BERT, T5 (7m)
Bidirectional Encoder Representations from Transformers (BERT) (4m)
BERT Objective (2m)
Fine tuning BERT (2m)
Transformer: T5 (3m)
Multi-Task Training Strategy (5m)
GLUE Benchmark (2m)
Question Answering (2m)
1 reading
Content Resource (10m)
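The "BERT Objective" video in Week 3 covers masked language modeling, where roughly 15% of input tokens are selected for prediction; of those, 80% are replaced by [MASK], 10% by a random token, and 10% left unchanged. A rough sketch of that selection scheme (the helper name and toy vocabulary are made up for illustration):

```python
import numpy as np

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary, illustration only

def mask_tokens(tokens, rng, p_select=0.15):
    # BERT's masked-LM objective: ~15% of tokens are selected; of those,
    # 80% become [MASK], 10% a random token, 10% stay unchanged.
    out, labels = [], []
    for tok in tokens:
        if rng.random() < p_select:
            labels.append(tok)  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                out.append(MASK)
            elif r < 0.9:
                out.append(str(rng.choice(VOCAB)))
            else:
                out.append(tok)
        else:
            labels.append(None)  # unselected positions contribute no loss
            out.append(tok)
    return out, labels
```

Keeping 10% of selected tokens unchanged forces the model to produce useful representations for every position, not just the ones visibly masked.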
Week 4

Chatbot

7 hours to complete
6 videos (Total 21 min), 5 readings, 1 quiz
6 videos
Transformer Complexity (3m)
LSH Attention (4m)
Motivation for Reversible Layers: Memory! (2m)
Reversible Residual Layers (5m)
Reformer (2m)
5 readings
(Optional) AI Storytelling (15m)
(Optional) KNN & LSH Review (20m)
(Optional) Transformers beyond NLP (20m)
Acknowledgments (10m)
References (10m)
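Week 4's "LSH Attention" video covers how the Reformer avoids full quadratic attention by hashing queries and keys into buckets, so attention is computed only within a bucket. A minimal sketch of angular locality-sensitive hashing via random hyperplane projections (the function name is illustrative; the real Reformer uses a rotation-based variant):

```python
import numpy as np

def lsh_bucket(vectors, n_hashes=4, rng=None):
    # Angular LSH: project each vector onto random hyperplanes; the pattern
    # of signs is its bucket id, so vectors with high cosine similarity
    # tend to land in the same bucket. Attention can then be restricted
    # to within-bucket pairs instead of all O(n^2) pairs.
    rng = rng or np.random.default_rng(0)
    d = vectors.shape[-1]
    planes = rng.standard_normal((d, n_hashes))
    signs = (vectors @ planes) > 0
    # Pack the sign bits into a single integer bucket id in [0, 2**n_hashes).
    return (signs * (2 ** np.arange(n_hashes))).sum(axis=-1)
```

Because the hash depends only on direction, scaling a vector does not change its bucket, which matches the cosine-similarity notion of "nearby" that attention relies on.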

About the Natural Language Processing Specialization

Frequently Asked Questions

More questions? Visit the Learner Help Center.