
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars (968 ratings)

About the Course

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model

b) Build a Transformer model to summarize text

c) Use T5 and BERT models to perform question-answering

d) Build a chatbot using a Reformer model

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
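Since attention is the thread running through all four course topics, a minimal sketch of scaled dot-product attention (the core operation of the Transformer paper mentioned above) may help orient readers. The function name, shapes, and NumPy implementation below are illustrative, not taken from the course materials:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Sketch of attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # similarity of each query to each key
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                         # weighted average of the values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

Each output row is a convex combination of the value rows, with mixing weights set by query-key similarity.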

Top reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

SB

Nov 20, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


151 - 175 of 237 Reviews for Natural Language Processing with Attention Models

By Prakash K

Apr 29, 2023

easy to study

By Mohammad B A

Feb 26, 2021

I am so happy

By Parma R R

May 21, 2023

Good course!

By John M

Apr 14, 2023

It was great

By Teng L

Feb 18, 2023

Thank you!

By yuzhuobai

Dec 12, 2022

Thank you!

By Chen

Oct 27, 2021

Thank you!

By Sohail Z

Oct 17, 2020

AWESOME!!!

By Ehsan F

Sep 24, 2023

Fantastic

By LK Z

Oct 20, 2020

very good

By अनुभव त

Sep 27, 2020

Very good

By Hoang Q T

Jul 25, 2022

Awesome!

By Justin H

Jul 12, 2023

Brutal.

By M n n

Nov 22, 2020

Awesome

By Alphin G I

Sep 14, 2023

Awesome

By Jeff D

Nov 15, 2020

Thanks

By Ayush S

May 25, 2023

good

By Md P

Apr 19, 2023

nice

By Pema W

Nov 11, 2022

good

By Rifat R

Sep 30, 2020

Best

By Thierry H

Oct 25, 2020

I was a bit disappointed by the fact that although three instructors are mentioned, in practice we only see Younes. Lukasz just gives a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.

There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.

Overall the course is well made. I like the fact that it teaches recent architectures like the Reformer. I was surprised that Trax is used, at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have some words about where we are in the set of frameworks: why Trax, how it compares to TensorFlow 2, and what the trend and priorities are when comparing TF vs. Trax (feature support, flexibility, target audience, production readiness, etc.).

Some notebooks are only about filling in the blanks with a big hint a couple of lines before, but I don't know how to make them more complex without leaving many people stuck, especially with a new framework like Trax. I also liked the diagrams very much, especially for the last week, with the complex transformations for LSH.

Quite a good course overall. Thanks!

By Brian G

Oct 31, 2020

I really enjoyed the course; thank you Younes, Lukasz, Eddy, and all the staff of DeepLearning.AI and Coursera. I applaud your effort in trying to teach what seem to me cutting-edge NLP techniques like Transformers, even though it is a very new and complex topic. The reason I didn't give five stars is that the final course on Transformers and Reformers does not seem self-contained; it feels incomplete and a bit too haphazard for me, unlike the first three courses in the specialization. I don't feel enough foundation was covered for students to appreciate the topic being discussed or the choices that led to the current design, e.g., why Q, K, V and not just Q, V? Why not feed NER output as context instead of just the source input? I had to search for supplementary content on the internet to round out my understanding.

By David M

Apr 2, 2023

I think the labs could either be a bit less cookbook (though it was satisfying to work through them successfully) or go into greater depth about the mechanisms at work. The ungraded labs in the last course of the specialization were *great*, and I think they captured the right balance. More labs like that, please.

I come away with a decent understanding of what attention models are, but I'm not really sure how they do what they do, with the end result that they seem a bit magical. The same is true of Locality-Sensitive Hashing: *why* should similar items hash to the same bucket?

Well, I'll spend some time investigating these on my own.
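The LSH question raised in this review has a short demonstrable answer: with random-hyperplane hashing, two vectors land in different buckets only when a random hyperplane separates them, which is unlikely when the angle between them is small. The sketch below is an illustrative toy (not the course's or the Reformer's implementation, which uses a different LSH scheme):

```python
import numpy as np

# Random-hyperplane LSH: a vector's bucket is the bit pattern of which side
# of each random hyperplane it falls on. A plane separates two vectors with
# probability proportional to the angle between them, so nearby vectors
# usually (not always) share a bucket.
rng = np.random.default_rng(42)
planes = rng.normal(size=(8, 16))  # 8 random hyperplanes in 16-d space

def lsh_bucket(v):
    bits = (planes @ v) > 0                          # side of each plane
    return int("".join("1" if b else "0" for b in bits), 2)

v = rng.normal(size=16)
near = v + 0.01 * rng.normal(size=16)                # tiny perturbation of v
far = rng.normal(size=16)                            # unrelated vector

# High probability (not certainty) that v and near collide, while v and far
# agree on each of the 8 bits only about half the time.
print(lsh_bucket(v), lsh_bucket(near), lsh_bucket(far))
```

Note the guarantee is probabilistic: in practice, multiple hash tables are used so that similar items collide in at least one of them.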

By Laurenz E

Jan 19, 2023

The course explores important concepts. It felt, however, that this week was less polished than the previous weeks. I was missing the summaries after the videos, and the reasoning behind some concepts (what Query / Key / Value are, or why BERT is only the encoder and GPT only the decoder, and why not both combined).

It is of course still a great course and I learned a lot from it. Thanks for taking the time and effort to create it. It is really helpful to have this kind of material explained in the professional way you do.