This course provides a practical introduction to using transformer-based models for natural language processing (NLP) applications. You will learn to build and train models for text classification using encoder-based architectures like Bidirectional Encoder Representations from Transformers (BERT), and explore core concepts such as positional encoding, word embeddings, and attention mechanisms.
The course covers multi-head attention, self-attention, and causal language modeling with GPT for tasks like text generation and translation. You will gain hands-on experience implementing transformer models in PyTorch, including pretraining strategies such as masked language modeling (MLM) and next sentence prediction (NSP). Through guided labs, you’ll apply encoder and decoder models to real-world scenarios. This course is designed for learners interested in generative AI engineering and requires prior knowledge of Python, PyTorch, and machine learning. Enroll now to build your skills in NLP with transformers!
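The attention mechanism mentioned above can be sketched in a few lines of PyTorch. This is a minimal, illustrative implementation of scaled dot-product self-attention, not the course's own lab code; the tensor names and toy dimensions are assumptions for the example.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of scaled dot-product self-attention,
# one of the core mechanisms the course covers.
# Names and shapes here are illustrative, not taken from the course labs.
torch.manual_seed(0)

seq_len, d_model = 4, 8               # toy sequence: 4 tokens, 8-dim embeddings
x = torch.randn(seq_len, d_model)

# Projections for queries, keys, and values (randomly initialized here;
# in a real model these are learned parameters)
W_q = torch.randn(d_model, d_model)
W_k = torch.randn(d_model, d_model)
W_v = torch.randn(d_model, d_model)

Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
scores = Q @ K.T / (d_model ** 0.5)
weights = F.softmax(scores, dim=-1)   # each row sums to 1
output = weights @ V                  # shape: (seq_len, d_model)

print(weights.shape, output.shape)
```

Multi-head attention, also covered in the course, repeats this computation in parallel over several lower-dimensional projections and concatenates the results.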
RR
I loved this course. It is very informative and has a lot of examples. It will take some time to master all this information.
AB
This course gives me a wide picture of what transformers can be.
VB
Learners need assistance from humans, which seems lacking; a coach can give guidance, but not to the extent of a human touch.
RR
Once again, great content and not that great documentation (printable cheatsheets, no slides, etc). Documentation is essential to review a course content in the future. Alas!
PA
Excellent course to understand about AI/ML/GenAI. The videos are not very detailed and just the right amount to skim through the details.
MA
Exceptional course and all the labs are industry related
The narration is poor. Instead of an expert lecturer, a narrator reads the text without understanding its meaning. Many fundamental terms are left unexplained.
It is an excellent specialisation, except that the pace of the speaker is very fast. It is difficult to understand, and it sounds very artificial.
Fantastic class, but it takes WAY MORE TIME than what is reported, unless you just don't do the labs or only read them casually at a high level. Going in-depth in the labs and doing the necessary work to understand all the key concepts and code will easily take you 3-4x more time, depending on your current level of expertise. Example: a 30-minute lab runs 15 A4 pages when you print it. Now imagine all these pages contain key notions and code. Superb class, but the required time is highly underestimated (like most of the IBM Generative AI Engineering certification).
Good content but I truly cannot understand...
The course is interesting and challenging. The lab assignments should be divided into more parts. There's too much code to grasp in a single lab session, making it difficult to follow the task. A major drawback is the extremely long training time of the model in the lab work. For example, BERT took over an hour to train. During that time, it's easy to lose interest in continuing the course. Either the model needs to be simplified to train faster, or the performance of the environment running the Jupyter Notebook should be significantly improved.
The course is too mechanical and jumps directly into deep topics without a smooth introduction of the background or concepts. It is hard to follow the sequence of ideas.
Too much detail squeezed into a short time.
Found this a very difficult course to understand. I do not recommend this at all if you are not already a highly experienced professional in the field. Very badly explained, and impossible to keep up with. Way too much emphasis on technical lingo that, to the untrained ear, goes right over someone's head. Continuously had to google terms, but it did not help in making sense of it. Found it very off-putting to continue the course. Do not recommend at all.
It's one of the worst courses I've seen. I couldn't understand anything from their explanation and I had to resort to external resources to understand the topic (and I am already someone with ML background).
This course is soooo boring. It feels like it's written by robots for robots. I want to see humans teaching material and making it understandable, interesting, and relatable. This is just ai-slop.
shit
Great course! Clear explanations, solid structure, and just the right mix of theory and hands-on content. Thanks to Dr. Joseph Santarcangelo, Fateme Akbari, and Kang Wang for making complex concepts so accessible. Really enjoyed it and learned a lot about transformers and GenAI!
Useful, complete, and well-organized content, with a virtual coach, materials, and labs that greatly help in understanding the models.
Excellent IBM course, a good opportunity to learn about AI.
Got more familiar with transformers and their applications in language.