Error Analysis in Beam Search


Skills You'll Learn

Natural Language Processing, Long Short Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Network, Attention Models

Reviews

4.8 (28,959 ratings)

  • 5 stars: 83.59%
  • 4 stars: 13.07%
  • 3 stars: 2.57%
  • 2 stars: 0.47%
  • 1 star: 0.28%

JY

Oct 29, 2018

The lectures cover lots of SOTA deep learning algorithms, and they are well designed and easy to understand. The programming assignments really enhance understanding of the lectures.

NM

Feb 20, 2018

I hope the course can elaborate more on backpropagation through RNNs. Backpropagation through time is a bit tricky, though we do not need to think about it when implementing with most existing deep learning frameworks.

From the lesson

Sequence Models & Attention Mechanism

Augment your sequence models using an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs. Then, explore speech recognition and how to deal with audio data.
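The core of the attention idea described above can be sketched in a few lines: score each input position against the current decoder state, normalize the scores with a softmax into attention weights, and take the weighted sum of the encoder activations as the context vector. This is a minimal illustrative sketch, not the course's assignment code; the bilinear scoring matrix `W` and all variable names are assumptions (the lectures score with a small neural network instead).

```python
import numpy as np

def attention_context(a, s, W):
    """One attention step over encoder activations.

    a : (T_x, n_a) encoder activations, one row per input time step
    s : (n_s,)     current decoder hidden state
    W : (n_a, n_s) scoring matrix (assumed bilinear scoring, for brevity)
    """
    scores = a @ W @ s                   # one scalar score per input step, shape (T_x,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()             # softmax -> attention weights alpha
    context = weights @ a                # weighted sum of activations, shape (n_a,)
    return weights, context

rng = np.random.default_rng(0)
a = rng.standard_normal((5, 4))          # 5 input steps, 4 hidden units
s = rng.standard_normal(3)               # decoder state
W = rng.standard_normal((4, 3))          # hypothetical scoring matrix
alpha, c = attention_context(a, s, W)
```

The weights `alpha` sum to 1, so the model can be inspected to see which input positions it "focused on" when producing each output step.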

Taught By

  • Andrew Ng, Instructor

  • Kian Katanforoosh, Senior Curriculum Developer

  • Younes Bensouda Mourri, Curriculum Developer
