About this Course

295,193 recent views
Shareable Certificate
Earn a Certificate upon completion
100% online
Start instantly and learn on your own schedule.
Flexible deadlines
Reset deadlines in accordance with your schedule.
Intermediate Level
Approx. 24 hours to complete
English

Skills you will gain

Word2vec, Parts-of-Speech Tagging, N-gram Language Models, Autocorrect

Offered by

deeplearning.ai

Syllabus - What you will learn from this course

Content Rating: 92% thumbs up (1,743 ratings)
Week 1

Autocorrect

6 hours to complete
9 videos (Total 27 min), 2 readings, 1 quiz
9 videos
Overview (1m)
Autocorrect (2m)
Building the model (3m)
Building the model II (2m)
Minimum edit distance (2m)
Minimum edit distance algorithm (5m)
Minimum edit distance algorithm II (3m)
Minimum edit distance algorithm III (2m)
2 readings
Connect with your mentors and fellow learners on Slack! (10m)
How to Refresh your Workspace (10m)
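The minimum edit distance covered this week can be sketched as a small dynamic-programming routine. This is a minimal illustration, not code from the course; the function name and the 1/1/2 insert/delete/replace costs are my assumptions, following a common autocorrect convention where a replace counts as a delete plus an insert:

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    """Minimum edit distance between two strings via dynamic programming.

    D[i][j] holds the cheapest cost of editing source[:i] into target[:j].
    """
    m, n = len(source), len(target)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):          # deleting all of source[:i]
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):          # inserting all of target[:j]
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            # Replacing is free when the characters already match.
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,      # delete
                          D[i][j - 1] + ins_cost,      # insert
                          D[i - 1][j - 1] + r)         # replace / keep
    return D[m][n]
```

With these costs, turning "play" into "stay" takes two replacements, for a distance of 4.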
Week 2

Part of Speech Tagging and Hidden Markov Models

4 hours to complete
11 videos (Total 38 min)
11 videos
Markov Chains (3m)
Markov Chains and POS Tags (4m)
Hidden Markov Models (3m)
Calculating Probabilities (3m)
Populating the Transition Matrix (4m)
Populating the Emission Matrix (2m)
The Viterbi Algorithm (3m)
Viterbi: Initialization (2m)
Viterbi: Forward Pass (2m)
Viterbi: Backward Pass (5m)
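The Viterbi decoding steps listed above (initialization, forward pass, and recovering the best path) can be sketched roughly as follows. This is an illustrative sketch, not course code; the toy tag names and probability tables in the usage note are invented, and it keeps whole paths in memory rather than separate backpointers:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence for the observations.

    Works in log probabilities to avoid underflow on long sequences.
    Each column maps a state to (best log score, best path ending there).
    """
    # Initialization: score each state on the first observation.
    best = [{s: (math.log(start_p[s]) + math.log(emit_p[s][obs[0]]), [s])
             for s in states}]
    # Forward pass: extend the best path into each state, one word at a time.
    for o in obs[1:]:
        col = {}
        for s in states:
            score, prev = max(
                (best[-1][p][0] + math.log(trans_p[p][s]), p) for p in states)
            col[s] = (score + math.log(emit_p[s][o]), best[-1][prev][1] + [s])
        best.append(col)
    # Pick the highest-scoring final state and return its stored path.
    return max(best[-1].values())[1]
```

For example, with two hypothetical tags NN and VB and hand-made transition/emission tables, `viterbi(["dog", "runs"], ...)` would recover the tag sequence `["NN", "VB"]`.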
Week 3

Autocomplete and Language Models

7 hours to complete
9 videos (Total 50 min)
9 videos
N-grams and Probabilities (7m)
Sequence Probabilities (5m)
Starting and Ending Sentences (8m)
The N-gram Language Model (6m)
Language Model Evaluation (6m)
Out of Vocabulary Words (4m)
Smoothing (6m)
Week Summary (1m)
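The n-gram topics above (sentence start/end markers and smoothing) can be sketched as a tiny bigram model with add-k smoothing. A minimal sketch under my own naming, not code from the course; the `<s>`/`</s>` padding tokens follow standard n-gram convention:

```python
from collections import Counter

def bigram_prob(corpus, k=1.0):
    """Build an add-k smoothed bigram probability function from
    a corpus given as a list of tokenized sentences."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]       # pad sentence boundaries
        vocab.update(toks)
        unigrams.update(toks[:-1])             # count bigram contexts
        bigrams.update(zip(toks, toks[1:]))    # count adjacent pairs
    V = len(vocab)

    def prob(prev, word):
        # Add-k smoothing: unseen pairs still get nonzero probability.
        return (bigrams[(prev, word)] + k) / (unigrams[prev] + k * V)

    return prob
```

On the toy corpus `[["i", "like", "cats"], ["i", "like", "dogs"]]`, the smoothed probability of "like" after "i" is (2 + 1) / (2 + 6) = 0.375, and unseen bigrams get a small but nonzero probability.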
Week 4

Word embeddings with neural networks

7 hours to complete
20 videos (Total 65 min), 1 reading, 1 quiz
20 videos
Basic Word Representations (3m)
Word Embeddings (3m)
How to Create Word Embeddings (3m)
Word Embedding Methods (3m)
Continuous Bag-of-Words Model (3m)
Cleaning and Tokenization (4m)
Sliding Window of Words in Python (3m)
Transforming Words into Vectors (2m)
Architecture of the CBOW Model (3m)
Architecture of the CBOW Model: Dimensions (3m)
Architecture of the CBOW Model: Dimensions 2 (2m)
Architecture of the CBOW Model: Activation Functions (4m)
Training a CBOW Model: Cost Function (4m)
Training a CBOW Model: Forward Propagation (3m)
Training a CBOW Model: Backpropagation and Gradient Descent (4m)
Extracting Word Embedding Vectors (2m)
Evaluating Word Embeddings: Intrinsic Evaluation (3m)
Evaluating Word Embeddings: Extrinsic Evaluation (2m)
Conclusion (2m)
1 reading
Acknowledgments (10m)
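The "Sliding Window of Words in Python" step above — extracting (context, center word) training pairs for the CBOW model — can be sketched like this. A minimal illustration with names of my choosing, not the course's own implementation:

```python
def get_windows(words, half_window):
    """Yield (context_words, center_word) pairs for CBOW training.

    Slides a window of size 2 * half_window + 1 over the token list;
    the middle token is the prediction target, the rest its context.
    """
    for center in range(half_window, len(words) - half_window):
        context = (words[center - half_window:center]
                   + words[center + 1:center + half_window + 1])
        yield context, words[center]
```

For example, with `half_window=2`, the tokens `["i", "am", "happy", "because", "i", "am", "learning"]` yield the first pair `(["i", "am", "because", "i"], "happy")`.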

Reviews

TOP REVIEWS FROM NATURAL LANGUAGE PROCESSING WITH PROBABILISTIC MODELS

About the Natural Language Processing Specialization

Frequently Asked Questions
