
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.3 stars · 746 ratings · 183 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded lab? That would be useful. Other students also find both of the LSH attention layer ungraded labs difficult to understand. Thanks

SB

Nov 20, 2020

The course is very comprehensive and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


151 - 175 of 185 Reviews for Natural Language Processing with Attention Models

By Dan H

Apr 5, 2021

Pros: Good selection of state-of-the-art models (as of 2020). Also great lab exercises.

Cons: The video lectures and readings are not very helpful. Explanations of the trickier parts of the models and training processes are vague and ambiguous (and sometimes kind of wrong?). You can find more detailed and easier-to-understand lectures on YouTube.

By dmin d

Jan 7, 2021

Have to say, the instructors didn't explain the concepts well. A lot of the explanations don't make sense, or just give the final logic and skip all the details. I had to search YouTube or Google to understand the details and concepts.

But it covers state-of-the-art models for NLP. It's a good starting point and helped save time.

By Oleksandr P

Apr 4, 2021

Although this course gives you an understanding of cutting-edge NLP models, it lacks detail. It is hard to grasp the structure of a complex NLP model from a few-minute video. The course should offer step-by-step explanations across more lectures, or make the existing ones longer.

By Nunzio V

Apr 7, 2021

Nice course, full of very interesting information. What a pity that it didn't use TensorFlow. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry and it is unlikely it ever will be. In my opinion.

By Семин А С

Aug 9, 2021

The explanation of attention models, the attention mechanism itself, and the other building blocks of the Transformer was very confusing. It was sometimes really hard to understand what the lecturer meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex - I would prefer fewer details, since most will not understand them anyway, and more conceptual information about why these models are built the way they are.

By Zeev K

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.

By Damian S

Feb 24, 2022

Course content is fantastic, but the assignments are ridiculous--they test how well you can read directions, but not how well you understand the content.

By Huang J

Dec 23, 2020

Course videos are too short to convey the ideas behind the methodology. The illustrations are too rough.

By Maury S

Mar 13, 2021

Another less than impressive effort in a specialization from which I expected more.

By martin k

Apr 26, 2021

Low-quality programming assignments, but considering the price it's good overall.

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow

By Anurag S

Jan 3, 2021

The course content needs more detailed explanation.

By Yue W G

May 24, 2021

The content is good because it covers many aspects of NLP, and there are a lot of illustrations provided to help students understand the material. However, the assignments are too easy because of the detailed comments provided; students could simply copy and paste the answers from the comments.

One suggestion is to improve the explanation of the material, because there are lots of details skipped by the instructors. Personally, I had to read other blogs in order to understand some of the details. Furthermore, separating the solutions from the code is something that must be done, for instance by presenting the solutions in a separate notebook.

By Randall K

Jun 14, 2021

In the previous three courses, the HW was a natural extension of the lectures and provided solid reinforcement of the course material. In this course, however, I found the lectures did not prepare me for the HW. Furthermore, the lectures were too terse, often incoherent, and the homework tried to introduce new concepts that were not discussed in the lectures. Also, the code in the labs was poorly organized and lacked a consistent, coherent style between assignments and even across previous courses, which made it difficult to follow the logic. I often spent a lot of time sorting out tensor indexing issues, which is very difficult in Jupyter without a debugger.

By Chenjie Y

Nov 18, 2020

I think the last course is a bit rushed... Many concepts are not natural and cannot be explained in one or two sentences. Compared to the previous courses in the specialisation, which really explained concepts and intuitions in detail, this last course is a bit too rough. I would rather spend another month studying the material across two courses, instead of staying up late reading papers and blogs to understand what was not explained clearly in the course. Also, I see that Trax is a good library, but I think it is not yet mature, and I really wish all the assignments had TensorFlow versions so that students could choose.

By Vitalii S

Jan 25, 2021

1) Information: 3 out of 5.

No in-depth explanations.

2) The quizzes are too easy. I missed the good quizzes from the DL specialization, with use cases that made me think about what to pick.

3) Home tasks: 1 out of 5.

3.1 First of all, the home tasks are all done in different styles.

3.2 Some of them require additional checking even when all tests pass.

3.3 The part with Google Colab is also a little bit strange... I want the home task to be one click away, not to set up a third-party environment.

What is good: for a high-level overview this course is OK. Maybe have two versions of the course: one with in-depth explanations, and one more like this one.

By Greg D

Dec 31, 2020

Even though this is better than the other three courses in the specialization, it's not really any different from reading a few posts on popular machine learning blogs about the technologies presented here. I would understand if the instructors brought some insights of their own, but it's largely just a repetition of what is on the slides, which in turn is just the bare minimum about how to make these concepts work (which, again, can be found through papers and free resources).

Overall, I would recommend against taking this course since there are better or equal materials available.

By DAVIDE M

Mar 9, 2022

This course is good if you want to be theoretically solid with Transformer models. I mean, now I can explain those concepts to my colleagues or peers. It is lacking in the practical parts: a lot of the exercises are too guided, and there is no project that you can show off. The Hugging Face part is the most interesting for practice, but there are only a few lessons. In the end, do not expect to make a chatbot in week four; it is "just" a model that generates dialogue between two people.

By Lucky S

Feb 24, 2022

This course is the weakest of the Specialization.

Courses 1-3 were very strong and solid, but Course 4 feels very rushed. The curriculum is very hard to follow, let alone understand. The labs aren't commented enough to give a proper explanation (especially week 4). There are a lot of concepts that aren't explained at the length they should be.

By Arun

Feb 18, 2021

Compared to Andrew Ng's deep learning specialization, this course requires a lot of improvement. Very often disparate facts are put together with not much connection between the ideas. This is probably because of the enormous amount of content covered. It might make sense to split the course into two. Thank you!

By George G

Dec 6, 2020

Week 1 jumps into material that is better explained in Week 2. Attention deserves a more gradual and deeper explanation. Weeks 3 and 4 cover a lot of ground without going into depth.

By Steven N

Apr 29, 2021

The course lectures were very confusing, and the course assignments were too easy, so they didn't reinforce the lecture concepts in the same way that assignments from other courses had.

By Gary L

Oct 20, 2020

Disappointed. Course 4 is much more difficult to follow than the other courses in this NLP specialization, as well as other deeplearning.ai courses.