Seq2seq

Skills You'll Learn

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (849 ratings)

  • 5 stars: 66.19%
  • 4 stars: 15.31%
  • 3 stars: 8.95%
  • 2 stars: 5.30%
  • 1 star: 4.24%

DB

Jan 24, 2023

I learned a lot from this course, and the ungraded and graded problems are directly relevant to understanding how to build a transformer or a reformer from scratch.

AM

Oct 12, 2020

Great course! I really enjoyed the extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding!

Thank you!

From the lesson

Neural Machine Translation

Discover some of the shortcomings of a traditional seq2seq model and how to address them by adding an attention mechanism, then build a Neural Machine Translation model with attention that translates English sentences into German.
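
The core of that attention step can be sketched in a few lines. The snippet below is a minimal, illustrative NumPy version (dot-product alignment scores followed by a softmax), not the course's own Trax implementation; all names and shapes are assumptions for the example.

```python
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Return a context vector: an attention-weighted sum of encoder states."""
    # Alignment scores: dot product between the decoder state and
    # each encoder hidden state; shape (source_len,).
    scores = encoder_states @ decoder_state
    # Softmax turns the scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of the encoder states.
    return weights @ encoder_states, weights

# Tiny usage example with random "hidden states".
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 8))   # 5 source positions, hidden size 8
decoder_state = rng.normal(size=(8,))      # current decoder hidden state
context, attn = attention_context(decoder_state, encoder_states)
print(attn.round(3), context.shape)        # weights over source positions, (8,)
```

The decoder uses this context vector at every output step, so it can focus on different source words while generating each target word, rather than compressing the whole sentence into a single fixed-size vector as a plain seq2seq model does.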

Taught By

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Curriculum Architect
