Week Conclusion


Skills You'll Learn

Word2vec, Parts-of-Speech Tagging, N-gram Language Models, Autocorrect

Reviews

4.7 (1,486 ratings)

  • 5 stars
    79.74%
  • 4 stars
    14.73%
  • 3 stars
    3.49%
  • 2 stars
    0.80%
  • 1 star
    1.21%

AH

Sep 28, 2020

Very good course! It helped me clearly learn about autocorrect, edit distance, Markov chains, n-grams, perplexity, backoff, interpolation, word embeddings, and CBOW. This was very helpful!

SR

Aug 4, 2021

Another great course, introducing probabilistic modelling concepts and gradually moving in the direction of neural networks. One must learn in detail how embeddings work.

From the lesson

Autocomplete and Language Models

Learn how N-gram language models work by calculating sequence probabilities, then build your own autocomplete language model using a text corpus from Twitter!
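The core idea from the lesson can be sketched in a few lines: estimate the probability of the next word from n-gram counts, then autocomplete by picking the most probable continuation. This is a minimal bigram sketch using a tiny toy corpus (the course's actual Twitter corpus and starter code are not reproduced here).

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the Twitter data used in the course.
corpus = [
    "i like natural language processing",
    "i like deep learning",
    "i love language models",
]

# Count bigrams: bigram_counts[prev][next] = how often `next` follows `prev`.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, nxt in zip(tokens, tokens[1:]):
        bigram_counts[prev][nxt] += 1

def next_word_probs(prev):
    """Estimate P(w | prev) as count(prev, w) / count(prev)."""
    counts = bigram_counts[prev]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def autocomplete(prev):
    """Suggest the most probable next word after `prev`."""
    probs = next_word_probs(prev)
    return max(probs, key=probs.get) if probs else None

# "i" is followed by "like" twice and "love" once in the corpus,
# so P(like | i) = 2/3 and P(love | i) = 1/3.
print(next_word_probs("i"))
print(autocomplete("i"))  # -> "like"
```

A full n-gram model generalizes the context from one previous word to n-1 words, and the course's later topics (backoff, interpolation, perplexity) address what this sketch ignores: unseen n-grams and model evaluation.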

Taught By

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Curriculum Architect
