Traditional Language Models


Skills You'll Learn

Word Embedding, Sentiment with Neural Nets, Siamese Networks, Natural Language Generation, Named-Entity Recognition

Reviews

4.5 (728 ratings)
  • 5 stars
    72.39%
  • 4 stars
    16.48%
  • 3 stars
    5.76%
  • 2 stars
    2.60%
  • 1 star
    2.74%
KT
Sep 24, 2020

The lectures are well planned--very short and to the point. The labs offer immense opportunity for practice, and assignment notebooks are well-written! Overall, the course is fantastic!

CR
Mar 20, 2021

I wish the neural networks would be described in greater detail.

Everything else is really nice, Younes explains very well. Assignments are very nicely prepared.

From the lesson
Recurrent Neural Networks for Language Modeling
Learn about the limitations of traditional language models and see how RNNs and GRUs use sequential data for text prediction. Then build your own next-word generator using a simple RNN on Shakespeare text data!
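The next-word generator described above can be sketched with a minimal simple-RNN forward pass. This is an illustrative NumPy sketch, not the course's actual assignment code: the tiny vocabulary, dimensions, and randomly initialized (untrained) weights are all assumptions for demonstration.

```python
import numpy as np

# Illustrative sketch of a simple (Elman) RNN for next-word prediction.
# Vocabulary, sizes, and weights are hypothetical; training is omitted.
rng = np.random.default_rng(0)

vocab = ["to", "be", "or", "not", "<unk>"]
word_to_id = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 8  # vocabulary size, hidden-state size

# Randomly initialized parameters (an untrained model).
Wxh = rng.normal(0, 0.1, (H, V))   # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden (recurrent) weights
Why = rng.normal(0, 0.1, (V, H))   # hidden-to-output weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_next(words):
    """Run the RNN over a word sequence; return a probability
    distribution over the next word."""
    h = np.zeros(H)
    for w in words:
        x = np.zeros(V)
        x[word_to_id.get(w, word_to_id["<unk>"])] = 1.0  # one-hot input
        h = np.tanh(Wxh @ x + Whh @ h)                   # recurrent update
    return softmax(Why @ h)

probs = predict_next(["to", "be", "or"])
print(vocab[int(np.argmax(probs))])  # most likely next word (untrained, so arbitrary)
```

The key idea the lesson builds on is visible in the loop: the hidden state `h` carries information forward across the sequence, which is what lets an RNN condition its prediction on earlier words, unlike a fixed-window traditional language model.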

Taught By

  • Younes Bensouda Mourri

    Instructor
  • Łukasz Kaiser

    Instructor
  • Eddy Shyu

    Senior Curriculum Developer
