About this Course

182,027 recent views
Shareable Certificate
Earn a Certificate upon completion
100% online
Start instantly and learn at your own pace.
Flexible deadlines
Reset deadlines in accordance with your schedule.
Intermediate Level
Approx. 28 hours to complete
English

Skills you will gain

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Offered by

DeepLearning.AI

Syllabus - What you will learn from this course

Content rating: 80% thumbs up (1,793 ratings)
Week 1

Neural Machine Translation

7 hours to complete
9 videos (Total 81 min), 9 readings, 1 quiz
9 videos
Seq2seq (4m)
Alignment (4m)
Attention (6m)
Setup for Machine Translation (3m)
Training an NMT with Attention (6m)
Evaluation for Machine Translation (8m)
Sampling and Decoding (9m)
Andrew Ng with Oren Etzioni (34m)
9 readings
Connect with your mentors and fellow learners on Slack! (10m)
Background on seq2seq (10m)
(Optional) The Real Meaning of Ich Bin ein Berliner (10m)
Attention (10m)
Training an NMT with Attention (10m)
(Optional) What is Teacher Forcing? (10m)
Evaluation for Machine Translation (10m)
Content Resource (10m)
How to Refresh your Workspace (10m)
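The heart of Week 1 is an encoder-decoder translator that uses attention to decide which source words matter at each decoding step. As a rough illustration of that idea only (a minimal NumPy sketch with made-up shapes, not the course's Trax implementation):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative shapes (assumptions): 5 source tokens, hidden size 8.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 8))  # one hidden vector per source token
decoder_state = rng.normal(size=(8,))     # decoder state at the current step

# Alignment scores: how relevant each source token is to this step.
scores = encoder_states @ decoder_state   # shape (5,)
weights = softmax(scores)                 # attention distribution, sums to 1
context = weights @ encoder_states        # weighted sum of encoder states, (8,)

print(weights, context.shape)
```

The context vector is then combined with the decoder state to predict the next target word, which is what "Training an NMT with Attention" walks through.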
Week 2

Text Summarization

7 hours to complete
7 videos (Total 43 min), 7 readings, 1 quiz
7 videos
Transformer Applications (8m)
Dot-Product Attention (7m)
Causal Attention (4m)
Multi-head Attention (6m)
Transformer Decoder (5m)
Transformer Summarizer (4m)
7 readings
Transformers vs RNNs (10m)
Transformer Applications (10m)
Dot-Product Attention (10m)
Causal Attention (10m)
Transformer Decoder (10m)
Transformer Summarizer (10m)
Content Resource (10m)
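Week 2's summarizer is built from a Transformer decoder, whose core operation is scaled dot-product attention with a causal mask so each position can only attend to earlier ones. A minimal NumPy sketch of that operation (names and shapes are illustrative assumptions, not the course's code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_attention(Q, K, V):
    """Scaled dot-product attention where position i sees only positions <= i."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])                # (seq, seq)
    future = np.triu(np.ones(scores.shape, dtype=bool), k=1)
    scores = np.where(future, -1e9, scores)                # mask out the future
    return softmax(scores) @ V

# Illustrative self-attention over 4 tokens of dimension 8 (assumptions).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(causal_attention(x, x, x).shape)  # (4, 8)
```

Multi-head attention, covered in the third video, runs several such attentions in parallel over learned projections of Q, K, and V and concatenates the results.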
Week 3

Question Answering

7 hours to complete
10 videos (Total 45 min), 1 reading, 1 quiz
10 videos
Transfer Learning in NLP (7m)
ELMo, GPT, BERT, T5 (7m)
Bidirectional Encoder Representations from Transformers (BERT) (4m)
BERT Objective (2m)
Fine-tuning BERT (2m)
Transformer: T5 (3m)
Multi-Task Training Strategy (5m)
GLUE Benchmark (2m)
Question Answering (2m)
1 reading
Content Resource (10m)
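Week 3's T5 material rests on the text-to-text idea: every task, from translation to question answering, is cast as mapping an input string to an output string, so one model with one objective can serve many tasks at once. A toy illustration (the task prefixes follow the style used in the T5 paper; the sentences themselves are invented):

```python
# Every task becomes "input text -> output text"; the model and loss never
# change, only the strings do. All example sentences below are made up.
examples = [
    ("translate English to German: How are you?", "Wie geht es dir?"),
    ("summarize: The quick brown fox jumped over the lazy dog by the river.",
     "A fox jumped over a dog."),
    ("question: What color was the fox? context: The quick brown fox "
     "jumped over the lazy dog.", "brown"),
]

for source, target in examples:
    print(f"IN:  {source}\nOUT: {target}\n")
```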
Week 4

Chatbot

7 hours to complete
7 videos (Total 62 min), 5 readings, 1 quiz
7 videos
Transformer Complexity (3m)
LSH Attention (4m)
Motivation for Reversible Layers: Memory! (2m)
Reversible Residual Layers (5m)
Reformer (2m)
Andrew Ng with Quoc Le (40m)
5 readings
(Optional) AI Storytelling (15m)
(Optional) KNN & LSH Review (20m)
(Optional) Transformers beyond NLP (20m)
Acknowledgments (10m)
References (10m)
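Week 4's Reformer tackles the quadratic cost of full attention with locality-sensitive hashing: queries and keys are hashed into buckets, and attention is computed only among items that share a bucket. A minimal sketch of one LSH scheme, random-hyperplane bucketing (an illustrative variant chosen for brevity; the Reformer itself uses random rotations, and every name and shape here is an assumption):

```python
import numpy as np

def lsh_buckets(vectors, n_planes, seed=0):
    """Hash vectors into buckets using random hyperplanes: vectors whose
    dot products have the same signs across all planes share a bucket."""
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(vectors.shape[1], n_planes))
    signs = (vectors @ planes) > 0            # (n_vectors, n_planes) sign bits
    # Read each row of sign bits as a binary integer bucket id.
    return signs @ (1 << np.arange(n_planes))

# Illustrative setup (assumptions): 6 query/key vectors of dimension 8.
rng = np.random.default_rng(1)
qk = rng.normal(size=(6, 8))
print(lsh_buckets(qk, n_planes=4))  # bucket id per vector
```

Nearby vectors tend to land in the same bucket, so restricting attention to same-bucket items keeps the pairs most likely to matter while cutting the cost well below the full sequence-length-squared comparison.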

