
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars (972 ratings)

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into German using an encoder-decoder attention model, b) build a Transformer model to summarize text, c) use T5 and BERT models to perform question-answering, and d) build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded lab? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

SB

Nov 20, 2020

The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


201 - 225 of 238 Reviews for Natural Language Processing with Attention Models

By Mohan N

Mar 28, 2021

The course covers cutting-edge content and the exercises are well paced. I found the Transformer lessons a bit difficult to understand.

By Rahul J

Sep 29, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand; examples might have helped!

By veera s

Mar 18, 2022

Need more detailed explanations in the last course of this specialization, especially for the attention and BERT models.

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn from and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience throughout, covering state-of-the-art NLP models.

By Roger K

May 17, 2022

The labs required a bit more context to understand.

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

Great explanation of the topic, I guess!

By Sreang R

Dec 22, 2020

Awesome course

By Erik S

Dec 23, 2023

Overall I found this course underwhelming, especially in comparison to the other three courses in this specialization. The lecture videos don't seem to flow coherently, lots of terminology is introduced without being defined, there seems to be hand-waving over details, and it feels a bit infantilizing when videos end with statements like "you now understand the T5 architecture" or "you now know how to fine-tune transformer models" when those concepts are not really explained in any meaningful detail in the videos. There also seems to be a big gap between how complex/advanced these topics are and how trivially easy the programming assignments are, with most of the logic implemented for us; completing most exercises doesn't require much more than reading comments to replace "None" values or copying code from preceding cells. In previous courses in this specialization the assignments felt like truer assessments of what we've learned. I hope this course gets a refresh for future students!

By Azriel G

Nov 20, 2020

The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt

By Kota M

Aug 23, 2021

This course perhaps gives a good overview of BERT and several other extensions such as T5 and the Reformer. I learned the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition: the lectures are mostly about intuition, while in contrast the assignments are very detailed and go through each logical step one by one.

By Nunzio V

Apr 7, 2021

Nice course. Full of very interesting information. What a pity not having used TensorFlow. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry, and it is hardly likely it ever will be. In my opinion.

By Artem A

Aug 9, 2021

The explanation of attention models, including the attention mechanism itself and the other building blocks of the Transformer, was very confusing. It was really hard sometimes to understand what the lecturer really meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex. I would prefer fewer details, as most will not understand them anyway, and more conceptual information on why these models are built as they are.

By Zeev K

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.

By Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology, and the illustrations are too rough.

By Maury S

Mar 13, 2021

Another less-than-impressive effort in a specialization from which I expected more.

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow

By Anurag S

Jan 3, 2021

The course content needs more detailed explanation to be easy to follow.

By ABHISHEK T

Apr 24, 2023

Elaborate more and make it easier to learn.

By Przem G

Feb 18, 2023

I would not have understood much if I hadn't known most of the material beforehand. Lots of repetition (not bad, just boring), but worse, bugs as well. Many times the lecturer doesn't know what he's talking about and messes things up. A characteristic moment is when, all of a sudden, he talks about things without defining them (like "shared frame" or "adapter"), shows a diagram contradicting the code beside it, or changes subject abruptly.

The grader is terrible, happily returning errors but no explanation. You teach AI, you talk about LMs beating humans, yet the tool used for evaluating your students is so primitive it's as if it were written two decades ago. It very likely infuriates everybody except its proud author. Either the code to fill in is trivial (we learn nothing), or it requires mental work which potentially leaves some traces. The effect is that the code works fine, but the grader fails miserably.

Like many of your courses, this one too teaches us more about the author's horizons and expectations than about the new knowledge we pay for. This is particularly evident during quizzes, where poorly formulated questions, answerable only in a narrow context, abound. There are also bugs: "translating French to English" requires you to mark "keys and values are the French words"...

By Yue W G

May 24, 2021

The content is good because it covers many aspects of NLP. There are a lot of illustrations provided to help students understand the materials. However, the assignments are too easy because of the detailed comments provided; students could simply copy and paste the answers from the comments.

One suggestion is to improve the explanation of the materials, because there are lots of details being skipped by the instructors. Personally, I had to read other blogs in order to understand some of the details. Furthermore, separating the solutions from the code is definitely something that must be done, for instance by presenting the solutions in a separate notebook.

By Vitalii S

Jan 25, 2021

1) Information: 3 out of 5.

No in-depth explanations.

2) The quizzes are too easy. I missed the good quizzes from the DL Specialization, with use cases that made me think about what to pick.

3) Home tasks: 1 out of 5.

3.1 First of all, the home tasks are all done in a different manner.

3.2 Some of them require additional checking even when all tests pass.

3.3 The part with Google Colab is also a little bit strange... I want a home task that is one click away, not setting up a third-party environment.

What is good: for a high-level overview, this course is OK. Maybe have two versions of the course: one with in-depth explanations, and one more like this one.

By Greg D

Dec 31, 2020

Even though this is better than the other three courses in the specialization, it's not really any different from reading a few posts on popular machine learning blogs about the technologies presented here. I would understand if the instructors brought some insights, but it's largely just repeating what they have in the slides, which in turn is just the bare minimum about how to make these concepts work (which, again, can be found through papers + free resources).

Overall, I would recommend against taking this course since there are better or equal materials available.