
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

750 ratings
185 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper...

Top reviews


Oct 4, 2020

Could the instructors perhaps make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks


Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. Also, the entire set of projects can be downloaded easily.


101 - 125 of 186 Reviews for Natural Language Processing with Attention Models

By Pranay R

Nov 13, 2020

Excellent content

By Carlos A C G

Jan 17, 2022

Incredibly good!

By Fakhre A

May 4, 2021

Great course ..

By Jyotin P

Nov 22, 2020

Amazing course!

By Pragya H

Jun 8, 2021

Awesome Course

By Phylypo T

Dec 14, 2020

Great courses.

By Mohammad B A

Feb 26, 2021

I am so happy

By Chen

Oct 27, 2021

Thank you!

By Sohail Z

Oct 17, 2020



Oct 20, 2020

very good

By अनुभव त

Sep 27, 2020

Very good

By M n n

Nov 22, 2020


By Jeff D

Nov 15, 2020


By Rifat R

Sep 30, 2020


By Thierry H

Oct 25, 2020

I was a bit disappointed that although three instructors are mentioned, in practice we only see Younes. Lukasz just delivers a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.

There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.

Overall the course is well made. I like that it teaches recent architectures like the Reformer. I was surprised that Trax is used, at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have a few words about where we are in the set of frameworks: why Trax, how it compares to TensorFlow 2, and what the trends and priorities are when comparing TF vs. Trax (feature support, flexibility, targeted audience, production readiness, etc.).

Some notebooks are only about filling in the blanks, with a big hint a couple of lines before, but I don't know how to make them more complex without leaving many people stuck, especially with a new framework like Trax. I also liked the diagrams very much, especially for the last week, with the complex transformations for LSH.

Quite a good course overall. Thanks!

By Brian G

Oct 31, 2020

I really enjoyed the course. Thank you Younes, Lukasz, Eddy, and all the staff of DeepLearning.AI and Coursera. I applaud your effort in trying to teach what seem to me cutting-edge NLP techniques like Transformers, even though it is a very new and complex topic. The reason I didn't give five stars is that the final course on Transformers and Reformers does not seem self-contained; it feels incomplete, a bit too haphazard for me, unlike the first three courses in the specialization. I don't feel enough foundation was covered for students to appreciate the topic being discussed or what choices led to the current design, e.g., why Q, K, V and not just Q, V? Why not feed NER output as context instead of just the source input? I had to search for supplementary content on the internet to round out my understanding.

By Simon P

Dec 6, 2020

The course could have been expanded into an entire specialization. There's a little too much information, and the first two assignments are disproportionately long and hard compared with the last two. It is cutting-edge material, though, and well worth it.

A slight annoyance is the script reading, which means the videos lack a natural flow and you end up with nonsense sentences like "now we multiply double-uwe sub en superscript dee by kay sub eye superscript jay to get vee sub eye". Variables such as X_i should be referred to by what they actually represent, not their algebraic representation, because that is not how the brain processes them when they are read from a page.

By Dave J

May 3, 2021

The content is interesting and current, citing some 2020 papers. I was disappointed by the amount of lecture material - around 40-45 minutes per week in weeks 1-3 and only 20 minutes in week 4, plus two Heroes of NLP interviews. The lectures have the feel of reading from a script rather than engaging with the learner. They're not bad but there's room for improvement. Explanations are usually adequate but some areas could have been explained more clearly.

Programming assignments worked smoothly in my experience, though they were not particularly challenging: they're largely "painting by numbers".

By Woosung Y

Nov 7, 2020

Great course for understanding the basic concept of the attention module in NLP. What I learned in this course is mainly based on text data processing. (I feel that applying it to voice or sound data will be a little different.) I was able to build a solid understanding through practical examples.

One thing that I felt was lacking:

There is no theoretical background on convergence. I don't understand why such NLP models can converge to an optimal solution. They may work, but why? I need to search the literature further.

By Fan P

Nov 18, 2020

The materials in Week 3 are not sufficiently clear to explain the BERT model. The instructors sometimes repeated themselves and only explained the surface without ever going deeper. The Week 3 assignment was also designed without enough depth. One very interesting thing that wasn't mentioned is how the BERT model manages to be trained with a self-supervised method on almost any dataset. It would be good to show how BERT prepares its training dataset, and whether this can be generalized to other types of datasets.

By Luis F C d L

Apr 3, 2022

The course is one of a kind, in the sense that very few courses try to tackle the transformers/attention subject.

I really enjoyed the content, but at times I'd have liked some more depth on the implementation side of things...

I know the subject is very complex, so in general I appreciate the effort of putting this kind of content out for us. Thanks to that, I hope I can evolve the models we use in my company into these state-of-the-art transformers.

By Jerry C

Nov 3, 2020

Overall the course is a nice introduction to new and cutting-edge NLP techniques using deep learning, with good explanations and diagrams. The course is a bit too easy in terms of hand-holding: a large part of the assignments can be completed from the hints alone, without deeply understanding what is going on. Also, occasionally there are typos or incoherent wording that detract from the overall experience.

By Stephen S

Jun 21, 2021

Content-wise it's excellent as always. I am not giving 5 stars for two reasons: a) the audio, including the transcript, is sometimes not of the best quality (in English), as if it were generated by a machine; b) the readings are very brief and just quickly summarize what was taught in the video (they could go into more depth). I would give 4.5 stars if that were possible.

By Amey N

Oct 4, 2020

The course gives an encompassing overview of the latest tools and technologies driving the NLP domain. Thus, the focus gradually shifts away from implementation and towards design.

Since the models require specialized equipment, they go beyond the scope of a personal computer and create a requirement for high-performance computing.