WC
Excellent and concise presentation of Transformer and BERT models. The course designer may consider adding programming assignments to illustrate the concepts and to reinforce student learning.

This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course is estimated to take approximately 45 minutes to complete.
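For readers curious about the self-attention mechanism the description mentions, a minimal sketch of scaled dot-product self-attention is shown below. This example is illustrative only and is not part of the course materials; the matrix shapes and random inputs are arbitrary assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V, weights          # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))              # 4 tokens, model dimension 8 (arbitrary)
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)                         # (4, 8): one contextualized vector per token
```

Each output row is a context-aware representation of its token, mixed from all tokens in the sequence according to the attention weights.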

NK
I am looking for free resources for experimentation, or a lighter model that can run on my laptop.
NA
The course was amazing and gave me a good overview of the BERT model and concepts like encoding and decoding, but it's not for beginners :>
JS
Concise, challenging, thought-provoking. This course is an immersive look into the inner workings of Transformer models and the BERT model.
NS
Very clear and detailed explanation of Transformers, with a practical example of training a BERT model.
KL
I need to use my own GCP account to run the lab. Otherwise, a very good introduction to get going on Transformers.
KB
Lab no longer works end to end. I was able to run it until we started building the classification model; the TensorFlow code no longer compiles.
RR
It is a short but very effective video. The content is crisp and easy to understand if you have a decent understanding of NNs.
This course should never have been published to Coursera in such a state; it should have been a free YouTube video at best. What could be done better:
- Make it a full course, i.e., start by introducing sequence modelling, why to use the attention mechanism over something like an LSTM, etc.
- Introduce concepts such as tokenization, vectorization, etc.
- Describe the Transformer model in detail: take a simple 1-encoder/1-decoder block and explain how it works on a toy example dataset.
- Teach implementation of the toy example in code.
- Introduce BERT with a clear goal on what it aims to solve.
- Then cover the material discussed in the current course.
The videos do not properly explain how to set up a Google Cloud account or help with running the lab. Since it's just a couple of videos, I expected some solid lab work, but Google makes no effort to help learners set up their lab environment and get things running.
The introduction is too quick and shallow, with no other material offered to make up for it. Also, it forces you to use Google Cloud.
The course needs to cover the 'why' and 'how' in addition to the 'what' of its topics. The content lacks depth of explanation, making it little more useful than gaining familiarity with a bunch of technical terms. In fact, the AI coach in Coursera provided excellent explanations of complex topics such as the encoder, decoder, and attention, which the training video itself fails to provide.
The explanation of the Transformer model is too short; I expected a detailed treatment of the Transformer architecture. And it is not easy to follow the notebook. I am still struggling to configure the notebook.
I'm none the wiser.
The course's lab was highly interactive and effectively reinforced the material by providing hands-on experience with real-life datasets. The quiz was well-designed to test understanding and retention, making sure that key concepts were grasped thoroughly.
Excellent course. The explanation was clear, practical, and well structured. The instructor has a strong command of the subject and makes complex concepts easy to understand. I fully recommend it to anyone who wants to go deeper and apply what they learn right away.
Nice course
good
Is this all?
The explanation isn't all that useful. It compresses the basics of BERT into a very short video, which mentions many details but not in the depth required to understand them from the explanation provided. The course should either stick to higher-level information for a short video, or go into more detail and take the time necessary to explain the concepts.