Oct 4, 2020
Could the instructors make a video explaining the ungraded labs? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks
Jun 22, 2021
This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the complete projects can be downloaded easily.
By Audrey B•
Jan 4, 2022
Great content, although the focus is definitely more on the attention mechanisms and on the Transformer architecture than on the applications themselves. Still really enjoyed it and I now feel like I have a better grasp of Transfer Learning and its associated methods. Content is very clear and well explained
By Ankit K S•
Nov 30, 2020
This is really an interesting specialization with lots of things to learn in the domain of NLP, ranging from basic to advanced concepts. It covers the state-of-the-art Transformer architecture in great detail. The only thing I felt uncomfortable with is the use of the Trax library in the assignments.
By A V A•
Nov 20, 2020
Covers the state of the art in NLP! We get an overview and a basic understanding of designing and using attention models. Each week deserves to be a course in itself - could have actually designed a specialization on the attention based models so that we get to learn and understand better.
By Naman B•
Apr 28, 2021
It would have been better if we used standard frameworks like PyTorch instead of Trax. Also, the course videos are a bit confusing at times. It would have been great if the math had been taught the way Andrew Ng taught it in the Deep Learning course.
By Cees R•
Nov 29, 2020
Not new to NLP, I enjoyed this course and learned things I didn't know before. From an educational perspective, I didn't like that the two "optional" exercises were much harder than the too-easy "fill in x here" assignments.
By Zicong M•
Dec 14, 2020
Overall good quality, but it seems a bit short and the content feels squeezed.
I don't like the push toward Trax either; it has not yet become mainstream, and personally I don't find it helpful for my professional career.
By Gonzalo A M•
Jan 21, 2021
I think we could have gone deeper in the last course, because you taught a lot of complex concepts but I did not feel confident I could replicate them. It would have been better to explain Transformers in more detail.
By CLAUDIA R R•
Sep 7, 2021
It's a great course, more difficult than I expected but very well structured and explained. That said, more didactic free videos from other websites can complement the lessons.
By Anand K•
Oct 15, 2020
Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.
By Moustafa S•
Oct 3, 2020
Good course that covers everything, I guess. The only downside for me is the Trax portion; I would have preferred it to be in TF maybe, but still, great job.
By Mohan N•
Mar 28, 2021
The course covers cutting edge content and the exercises are well paced. Found the transformer lessons a bit difficult to understand.
By RAHUL J•
Sep 29, 2020
Not up to expectations. Needs more explanation on some topics. Some were difficult to understand; examples might have helped!
By veera s•
Mar 18, 2022
Need more detailed explanations in the last course of this specialization, especially of the attention and BERT models.
By Vaseekaran V•
Sep 20, 2021
It's a really good course to learn from and get introduced to the attention models in NLP.
By David M•
Oct 25, 2020
An amazing experience exploring the state-of-the-art NLP models
By Roger K•
May 17, 2022
Labs required a bit more context to understand.
By Shaojuan L•
Dec 18, 2020
The programming assignment is too simple
By Fatih T•
Feb 4, 2021
great explanation of the topic I guess!
By Sreang R•
Dec 22, 2020
By Amit J•
Jan 2, 2021
Though the content is extremely good and cutting edge, the course presentation/instructor hasn't done justice to the course. Teaching concepts through assignments (and not covering them in detail in lectures) is an absolutely bad idea. Lecture instructions are ambiguous and immature at times. That the instructor is an excellent engineer but a poor teacher is very evident from the presentation. If input/output dimensions had been mentioned at every boundary in the network illustrations, it would have made a lot of difference to the speed of understanding, without having to hunt through offline material and papers. Using Trax, I think, is not a good idea for this course. The documentation is practically non-existent, a lot of details of the functions are hidden, and the only way to understand them is to read the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.
Overall a disappointment given the quality of other courses available on Coursera.
By Laurence G•
Apr 11, 2021
Pros: Good choice of content coverage. Provides a historical overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.
Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics, then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful; find other resources if this is a problem.
Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.
By Christine D•
Jan 22, 2021
Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.
The practical material, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to build your own stuff; you just fill in things like "temperature=temperature" or "counter += 1".
I preferred and recommend the first two courses in this NLP-specialization.
By Azriel G•
Nov 20, 2020
The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (the assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt
By Thomas H•
May 21, 2021
While the course succeeds in getting the most important points across, the quality of both the video lectures and the assignments is rather disappointing. The more detailed intricacies of attention and transformer models are explained poorly without providing any intuition on why these models are structured the way they are. Especially the lectures on current state-of-the-art models like BERT, GPT and T5 were all over the place and didn't explain these models well at all.
By Kota M•
Aug 23, 2021
This course perhaps gives a good overview of BERT and several other extensions such as T5 and the Reformer. I learned the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition: the lectures are mostly about intuition, while in contrast the assignments are very detailed and go through each logical step one by one.