RR
I loved this course. It is very informative and has a lot of examples. It will take some time to master all this information.
This course provides a practical introduction to using transformer-based models for natural language processing (NLP) applications. You will learn to build and train models for text classification using encoder-based architectures like Bidirectional Encoder Representations from Transformers (BERT), and explore core concepts such as positional encoding, word embeddings, and attention mechanisms.
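One of the core concepts the description mentions, positional encoding, can be illustrated with a minimal pure-Python sketch (this is not course code; it implements the standard sinusoidal scheme, with `seq_len` and `d_model` as illustrative parameter names):

```python
import math

def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding: even dimensions use sine, odd use cosine,
    # with wavelengths forming a geometric progression from 2*pi to 10000*2*pi.
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe
```

Because the encoding depends only on position, it can be added to the word embeddings to give the otherwise order-blind attention layers a sense of token order.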
The course covers multi-head attention, self-attention, and causal language modeling with GPT for tasks like text generation and translation. You will gain hands-on experience implementing transformer models in PyTorch, including pretraining strategies such as masked language modeling (MLM) and next sentence prediction (NSP). Through guided labs, you’ll apply encoder and decoder models to real-world scenarios. This course is designed for learners interested in generative AI engineering and requires prior knowledge of Python, PyTorch, and machine learning. Enroll now to build your skills in NLP with transformers!
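The self-attention and causal language modeling ideas mentioned above can be sketched in a few lines of pure Python (an illustrative sketch, not the course's lab code; real implementations would use PyTorch tensors). With `causal=True`, position `t` may only attend to positions up to `t`, which is the masking GPT-style decoders use for text generation:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V, causal=False):
    # Q, K, V: lists of vectors (seq_len x d_k). Scores are dot products
    # scaled by sqrt(d_k); a causal mask sets future scores to -inf so
    # their softmax weight becomes zero.
    d_k = len(K[0])
    out = []
    for t, q in enumerate(Q):
        scores = []
        for s, k in enumerate(K):
            if causal and s > t:
                scores.append(float("-inf"))
            else:
                scores.append(sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k))
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Multi-head attention runs several such attention functions in parallel on learned projections of Q, K, and V, then concatenates the results.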
RR
Once again, great content and not that great documentation (printable cheatsheets, no slides, etc). Documentation is essential to review a course content in the future. Alas!
AB
This course gives me a wide picture of what transformers can be.
PA
Excellent course for understanding AI/ML/GenAI. The videos are not overly detailed; they give just the right amount of depth for skimming through the details.
VB
Learners need assistance from humans, which seems to be lacking; a coach can give guidance, but not to the extent of a human touch.
MA
Exceptional course and all the labs are industry related
The narration is poor. Instead of an expert lecturer, a narrator reads the text without understanding its meaning. Many fundamental terms are left unexplained.
It is an excellent specialisation, except that the speaker's pace is very fast. It is difficult to understand, and it sounds very artificial.
Fantastic class, but it takes WAY MORE TIME than what is reported, unless you skip the labs or just skim them at a high level. Going in-depth in the labs and doing the work needed to understand all the key concepts and code will easily take you 3-4x more time, depending on your current level of expertise. Example: a 30-minute lab runs to 15 A4 pages when you print it. Now imagine that all of those pages contain key notions and code. Superb class, but the required time is highly underestimated (like most of the IBM Generative AI Engineering certification).
Good content but I truly cannot understand...
Great course! Clear explanations, solid structure, and just the right mix of theory and hands-on content. Thanks to Dr. Joseph Santarcangelo, Fateme Akbari, and Kang Wang for making complex concepts so accessible. Really enjoyed it and learned a lot about transformers and GenAI!
Useful, complete, and well-organized content, with a virtual coach, materials, and labs that greatly aid understanding of the models.
Excellent IBM course, a good opportunity to learn about AI.
Got more familiar with transformers and their applications in language.
Honestly, I learnt a lot.
It's good; I hope it's free.
Great Course! Thanks
Good Explanation.
Excellent content
Nice Course
good.