Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.5 stars · 309 ratings · 79 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, offered by DeepLearning.AI, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model.

This course is for students of machine learning or artificial intelligence as well as software engineers looking for a deeper understanding of how NLP models work and how to apply them. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top reviews

JH
Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

SB
Nov 20, 2020

The course is very comprehensive and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


1 - 25 of 79 Reviews for Natural Language Processing with Attention Models

By Xu O

Sep 26, 2020

The concepts are not clearly explained at all. The instructor seems to be just reading a script. He did not try to explain the math; instead, he uses graphs to try to fool us. The other instructor hardly teaches anything but just shows his face and says a few opening sentences. I took Andrew Ng's courses and was impressed, but I am very disappointed by the quality of this course. Deeplearning.ai, please have some quality control over the courses you offer, otherwise it hurts your brand name!

By Lucas F

Sep 27, 2020

The course is rather disappointing. The videos are short; they give you an intuition for why something works but don't go much into the details. When the instructor says "Now you are an expert in transformers," it sounds like mockery. The course material is split into four weeks; however, you can obtain the certificate after spending only a few days.

The homework won't teach you much. To give you an idea, the hardest exercise so far, according to the course's Slack, is to write a function that takes a model and input tokens and predicts the next token. Its body contains only 8 lines of code, some of it is already given, and the task is well explained.
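For concreteness, here is a minimal sketch (not the course's actual starter code) of the kind of function described above: given a model and the tokens so far, return the most likely next token. The `model` callable, its output shape, and the toy example at the end are assumptions made purely for illustration.

```python
import numpy as np

def predict_next_token(model, token_ids):
    """Greedy next-token prediction: a hypothetical sketch.

    Assumes `model` is a causal language model that maps an int array of
    shape (batch, seq_len) to log-probabilities of shape
    (batch, seq_len, vocab_size), where position i is conditioned on
    tokens 0..i.
    """
    batched = np.asarray(token_ids, dtype=np.int32)[None, :]  # add batch dim
    log_probs = model(batched)                                # (1, seq_len, vocab)
    last = log_probs[0, -1, :]   # distribution after the last input token
    return int(np.argmax(last))  # greedy choice: highest log-probability

# Toy usage with a fake "model" that always favors token id 3.
vocab = 10
toy_model = lambda x: np.tile(np.eye(vocab)[3], (1, x.shape[1], 1))
print(predict_next_token(toy_model, [5, 1, 4]))  # -> 3
```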

Trax, the deep learning framework used in the homework, might be a great framework, but not for learners. All you need to do is pick a layer, put it in the right place, and voila. The instructions make the situation even worse: they are so detailed that you can just copy code from the instructions, paste it into your solution, and obtain a working result. Sometimes you only need to look at the documentation to see an argument's name. You won't have to think about dimensions, and you won't have to think about the structure of the model. If you later decide to write a Transformer from scratch in PyTorch, you will struggle hard, but the payoff is a much deeper understanding.
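To illustrate the "pick a layer, put it in the right place" style described here, below is a small Trax sketch, unrelated to the graded assignments; the layer choice and hyperparameter values are made up for the example.

```python
# Illustrative only: a tiny language-model-style stack built from Trax
# combinators. Hyperparameters are hypothetical.
from trax import layers as tl

vocab_size = 33000   # hypothetical vocabulary size
d_model = 512        # hypothetical embedding / hidden width

model = tl.Serial(
    tl.Embedding(vocab_size=vocab_size, d_feature=d_model),  # ids -> vectors
    tl.LSTM(n_units=d_model),                                # recurrent layer
    tl.Dense(n_units=vocab_size),                            # project to vocab
    tl.LogSoftmax(),                                         # log-probabilities
)
print(model)  # the Serial combinator prints its stacked sublayers
```

Assembling a model in Trax really is mostly a matter of ordering layers inside `tl.Serial`, which is the reviewer's point about how little the assignments make you think about structure.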

Would I recommend taking this course? I think the course team did a nice job of providing an overview of the state-of-the-art techniques in NLP, and some of the references are amazing. So if you treat this course as introductory, you could take it, but don't expect too much. When you are told that you will "build a chatbot using a Reformer model," keep in mind that the crucial skill needed to do it is just copy-pasting.

By Shikhin M

Sep 28, 2020

Superficial coverage of topics, lack of mathematical depth and sophistication. Dumbing down and simplification never help.

By Kabakov B

Sep 25, 2020

Each week of the NLP spec courses has only ~30 min of video, and recaps make up about a quarter of it. Thus, one cannot expect good and profound theoretical knowledge, only some intuition and insights. Without theory, you would expect the programming tasks to contain something practical, if superficial, like a crash course in the most popular packages in the field. But the tasks are huge (about six times more than the theory) and boring: a lot of spaghetti code with several levels of nested IFs, constructions like `A[i][j:k][l+1]`, low code reuse, global variables, and `from utils import *`. The student spends time writing a bad implementation of things that have been implemented 100K times already, and that does not provide any enlightenment about how they are implemented, because of the lack of theory. And nobody teaches how to use the standard tools on simple, understandable examples. It is boring, exhausting, and impractical. And in most cases students can't do just part of a task, because the auto-checker will raise an error.

By Konstantinos K

Oct 5, 2020

I haven't had similar issues with previous courses by Deeplearning.ai, but with this one I was worried I was overly stupid the moment I started, because I noticed I was "missing" a lot and not easily understanding what was going on (note: I have all the required background from the ML Course and DL Specialization). Then I saw the existing reviews and was happy to see I'm not alone in feeling like that:

- Overly superficial coverage of theory in the videos; too many things not explained well (if at all). For example, the last week's videos total about... 18 minutes. REALLY? I thought we were talking about more complex stuff here. If one can be taught this in 18 minutes, then... oh well...

- Lots of "copy-paste this here" parts in assignments, too (not much thinking/effort required).

- The quizzes are (as in most courses) a joke, they're there just for the sake of it; I just skip them.

- Looks to have been created in too much of a rush; I don't know if that's the case, but that's the feeling I get from the content quality...

Based on the success of the original Andrew Ng courses, the quality bar is high, as are the expectations. I hope there is better quality control in future specializations, either in-house or through better selection of external beta-testers. I can't believe that several reviewers are bringing this up now, yet no one caught it before the release.

By Ravi S K

Oct 6, 2020

Tricky course, not well explained. I had to struggle a bit to understand the various concepts.

By Jeremy O C H

Oct 5, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

By Ryan B

Oct 6, 2020

To anyone looking to learn the content for the first time, I would suggest reading the original papers and some blog posts. The videos are short and do not go in-depth much at all. The real meat of this course is in the homework assignments. The videos tend to oversimplify to the point of not explaining the concepts correctly or being flat-out wrong, and they fail to give the critical context needed to fully understand what is being explained. On the other hand, the homework was interesting (especially when compared to other courses out there) and did go into more depth, making students think through the details of some of the algorithms and models. tldr; learn the content elsewhere, take the course for the homework + to learn about trax.

By Eitan I

Oct 2, 2020

Great specialization, however the 4th course was not cooked enough. It is the most complicated material, sure, so this is the place to put extra effort into preparing the lectures and labs. Instead, I got the feeling that much too much was pushed into one course. You should consider splitting it. I hope someone reads this feedback...

By Raviteja R G

Oct 14, 2020

The explanations in the video lectures are very shallow; you have to read research papers or blogs for a better understanding. The lecture videos could be made much better.

By Han L

Oct 4, 2020

Started out nicely, but in Week 3 and Week 4 a lot of the concepts and details are skipped over or copy-pasted.

By Akash

Sep 26, 2020

Outstanding course. The course was rigorous.

By Haoyu R

Oct 2, 2020

Not detailed enough. The quality of the course is very good at the start but decreases as the topics go deeper.

By Brooke M F

Nov 9, 2020

Token one star.

I was very disappointed in the overall low quality of this course. The labs were confusing (poor formatting, misleading comments), and even though I completed the assignments, I do not feel I obtained any solid grounding of the underlying concepts.

This course is easily the worst course I have taken on Coursera. Why the drop in quality?

By Fritsch V

Nov 27, 2020

Very disappointed by this course. I took the specialization to better understand attention, and these few videos are very unclear... I saw in the forum that my sentiment is shared by many people. I hope that Andrew will react and give us better learning material.

By Paul J L I

Nov 3, 2020

This course glossed over everything, and as a result I learned pretty much nothing. The constant congratulations for having done things, when I haven't actually done anything, are aggravating.

By Muhammad M G

Dec 5, 2020

The videos need more explanation. Even the assignments were quite challenging because of 'trax'.

By Ganesh s m

Oct 10, 2020

Every week's assignment brings a new challenge, and it was fun to complete them. The course instructors explain concepts very well. This course takes you from beginner level to a professional level and covers every topic related to NLP. I enjoyed learning NLP with Deeplearning.ai. I would like to thank deeplearning.ai for making this course.

By Huu M T H

Sep 30, 2020

Good course overall. The last two weeks' assignments are a little too light. The instructors could cover more about loading pretrained models and fine-tuning them, as that is a popular practice nowadays for small companies with limited resources (data/computation). An introduction to an "easy-to-use" framework such as huggingface is highly recommended.

By Minh L L

Nov 18, 2020

Thank you Coursera and the DeepLearning.AI team. The moment I set foot on this journey, I did not think I would love NLP so much. The course is very informative: it teaches NLP from the very first naive algorithm to the state-of-the-art models of today.

By Bharathi k N

Oct 12, 2020

The course is so good and well presented. I really enjoyed the whole specialization. Thank you for this amazing course and the whole specialization, which taught me a lot. Thank you Andrew Ng and the deeplearning.ai team for this amazing specialization.

By Alan K F G

Oct 21, 2020

I learnt a lot about Transformers and Reformers, which are among the most advanced models for NLP tasks. The instructors were fully prepared, though I'd prefer to see more animations in following courses. Thank you so much for spreading knowledge!

By Patrick A

Nov 26, 2020

An excellent course that covers research published only about two months earlier.

It doesn't get more cutting-edge than that, and the technology (reversible residual layers) is immediately applicable and a very powerful enabler.

Thanks a lot!

By vadim m

Oct 17, 2020

An amazing level of breadth and depth of the material presented. State of the art techniques are exemplified via carefully crafted lab assignments with sufficient hints for students to be able to comprehend hard technical concepts.

By Simin F

Nov 27, 2020

Helpful and interesting! This course gradually led me to understand how the Transformer works and how it is optimized, along with several other models, without much confusion. Great thanks to the deeplearning.ai team!!