
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.3 stars · 686 ratings · 170 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews

JH
Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both of the LSH attention layer ungraded labs difficult to understand. Thanks

LL
Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. And the entire projects can be downloaded easily.


101 - 125 of 170 Reviews for Natural Language Processing with Attention Models

By Chen

Oct 27, 2021

Thank you!

By Sohail Z

Oct 17, 2020

AWESOME!!!

By LK Z

Oct 20, 2020

Very good

By अनुभव त

Sep 27, 2020

Very good

By M n n

Nov 22, 2020

Awesome

By Jeff D

Nov 15, 2020

Thanks

By Rifat R

Sep 30, 2020

Best

By Thierry H

Oct 25, 2020

I was a bit disappointed by the fact that although three instructors are mentioned, in practice we only see Younes. Lukasz just gives a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.

There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.

Overall the course is well made. I like the fact that it teaches recent architectures like the Reformer. I was surprised that Trax is used, at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have a few words about where we are in the framework landscape: why Trax, how it compares to TensorFlow 2, and what the trend and priorities are for TF vs. Trax (feature support, flexibility, target audience, production readiness, etc.)

Some notebooks are only about filling in the blanks with a big hint a couple of lines before, but I don't know how to make them more complex without leaving many people stuck, especially with a new framework like Trax. I also liked the diagrams very much, especially in the last week, with the complex transformations for LSH.

Quite a good course overall. Thanks!

By Brian G

Oct 31, 2020

I really enjoyed the course; thank you Younes, Lukasz, Eddy, and all the staff of DeepLearning.AI and Coursera. I applaud your effort in trying to teach what seem to me cutting-edge NLP techniques like Transformers, even though it is a very new and complex topic. The reason I didn't give five stars is that the final course on Transformers and Reformers does not seem self-contained; it feels incomplete, a bit too haphazard for me, unlike the first three courses in the specialization. I don't feel enough foundation was covered for students to appreciate the topic being discussed, or the choices that led to the current design, e.g., why Q, K, V and not just Q and V? Why not feed NER output as context instead of just the source input? I had to search for supplementary content on the internet to round out my understanding.

By Simon P

Dec 6, 2020

The course could have been expanded into an entire specialization. There's a little too much information, and the first two assignments are disproportionately long and hard compared with the last two. It is cutting-edge material though, and well worth it.

A slight annoyance is the script reading, which means the videos lack a natural flow and you end up with nonsense sentences like "now we multiply double-uwe sub en superscript dee by kay sub eye superscript jay to get vee sub eye". Variables such as X_i should be referred to by what they actually represent, not by their algebraic representation, because this is not how the brain processes them when they are read from a page.

By Dave J

May 3, 2021

The content is interesting and current, citing some 2020 papers. I was disappointed by the amount of lecture material - around 40-45 minutes per week in weeks 1-3 and only 20 minutes in week 4, plus two Heroes of NLP interviews. The lectures have the feel of reading from a script rather than engaging with the learner. They're not bad but there's room for improvement. Explanations are usually adequate but some areas could have been explained more clearly.

Programming assignments worked smoothly in my experience, though they're not particularly challenging: they're largely "painting by numbers".

By Woosung Y

Nov 7, 2020

Great course for understanding the basic concept of the attention module in NLP. What I learned in this course is mainly based on text data processing. (I feel that applying it to voice or sound data will be a little different.) I was able to build a solid understanding through the practical examples.

One thing that I felt was lacking: there is no theoretical background on convergence. I don't understand why such an NLP model can converge to an optimal solution. It may work, but why? I need to search the literature further.

By Fan P

Nov 18, 2020

The materials in Week 3 are not sufficiently clear in explaining the BERT model. The instructors sometimes repeat themselves and only explain the surface without ever going deeper. The Week 3 assignment's design did not promote comprehension. One very interesting thing that wasn't mentioned is how the BERT model manages to be trained with the self-supervised method on almost any dataset. It would be good to show how BERT prepares its training dataset, and whether that can be generalized to other types of datasets.

By Jerry C

Nov 3, 2020

Overall, the course is a nice introduction to new and cutting-edge NLP techniques using deep learning, with good explanations and diagrams. The course involves a bit too much hand-holding, though; a large part of the assignments can be completed using the hints alone, without deeply understanding what is going on. Also, occasional typos or incoherent wording detract from the overall experience.

By Stephen S

Jun 21, 2021

Content-wise it's excellent as always. I am not giving 5 stars for two reasons: a) the audio, including the transcript, is sometimes not of the best quality (in English), as if it were machine-generated; b) the readings are very brief and just quickly summarize what has been taught in the videos (they could go into more depth). I would give 4.5 stars if that were possible.

By Amey N

Oct 4, 2020

The course gives an encompassing overview of the latest tools and technologies driving the NLP domain. Thus, the focus gradually shifts from implementation towards design.

Since the models require specialized hardware, they go beyond the scope of a personal computer and create a need for high-performance computing.

By Ankit K S

Nov 30, 2020

This is a really interesting specialization with lots of things to learn in the NLP domain, ranging from basic to advanced concepts. It covers the state-of-the-art Transformer architecture in great detail. The only thing I felt uncomfortable with is the use of the Trax library in the assignments.

By A V A

Nov 20, 2020

Covers the state of the art in NLP! We get an overview and a basic understanding of designing and using attention models. Each week deserves to be a course in itself; an entire specialization could have been designed around attention-based models so that we could learn and understand them better.

By Naman B

Apr 28, 2021

It would have been better if we used standard frameworks like PyTorch instead of Trax. Also, the course videos are a bit confusing at times. It would have been great if the math had been taught the way Andrew Ng taught it in the Deep Learning Specialization.

By Cees R

Nov 29, 2020

Not being new to NLP, I enjoyed this course and learned things I didn't know before. From an educational perspective, I didn't like that the two "optional" exercises were way harder than the too-easy "fill in x here" assignments.

By ZIcong M

Dec 14, 2020

Overall good quality, but it seems a bit short and the content is squeezed.

I don't like the push toward Trax either; it has not yet become mainstream, and personally I don't find it helpful for my professional career.

By Gonzalo A M

Jan 21, 2021

I think we could have gone deeper in the last course: you taught a lot of complex concepts, but I did not feel confident enough to replicate them. It would have been better to explain Transformers in more detail.

By CLAUDIA R R

Sep 7, 2021

It's a great course, more difficult than I expected, but very well structured and explained. That said, more didactic free videos from other websites can complement the lessons.

By Anand K

Oct 15, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to follow along.

By Moustafa S

Oct 3, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would have preferred it on TF maybe, but still, great job.