
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars (984 ratings)

About the Course

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model,
b) Build a Transformer model to summarize text,
c) Use T5 and BERT models to perform question-answering, and
d) Build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews

JH

Oct 4, 2020

Could the instructors make a video explaining the ungraded lab? That would be useful. Other students also find both of the LSH attention layer ungraded labs difficult to understand. Thanks

LL

Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the complete projects can be downloaded easily.


151–175 of 240 Reviews for Natural Language Processing with Attention Models

By Pragya H

Jun 8, 2021

Awesome Course

By Phylypo T

Dec 14, 2020

Great courses.

By Eduardo S

Aug 9, 2023

Great course!

By Prakash K

Apr 29, 2023

easy to study

By Mohammad B A

Feb 26, 2021

I am so happy

By Parma R R

May 21, 2023

Good course!

By John M

Apr 14, 2023

It was great

By Teng L

Feb 18, 2023

Thank you!

By yuzhuobai

Dec 12, 2022

Thank you!

By Chen

Oct 27, 2021

Thank you!

By Sohail Z

Oct 17, 2020

AWESOME!!!

By Ehsan F

Sep 24, 2023

Fantastic

By LK Z

Oct 20, 2020

very good

By अनुभव त

Sep 27, 2020

Very good

By Hoang Q T

Jul 25, 2022

Awesome!

By Justin H

Jul 12, 2023

Brutal.

By M n n

Nov 22, 2020

Awesome

By Alphin G I

Sep 14, 2023

Awesome

By Jeff D

Nov 15, 2020

Thanks

By Ayush S

May 25, 2023

good

By Md P

Apr 19, 2023

nice

By Pema W

Nov 11, 2022

good

By Rifat R

Sep 30, 2020

Best

By Thierry H

Oct 25, 2020

I was a bit disappointed that although three instructors are listed, in practice we only see Younes. Łukasz only gives a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.

There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.

Overall, the course is well made. I like that it teaches recent architectures like the Reformer. I was surprised that Trax is used, at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have some words about where we are in the landscape of frameworks: why Trax, how it compares to TensorFlow 2, and what the trend and priorities are for TF vs. Trax (feature support, flexibility, target audience, production readiness, etc.).

Some notebooks only involve filling in the blanks, with a big hint a couple of lines earlier, but I don't know how to make them more complex without leaving many people stuck, especially with a new framework like Trax. I also liked the diagrams very much, especially in the last week, with the complex transformations for LSH.

Quite a good course overall. Thanks!