Learner Reviews & Feedback for Generative AI Engineering and Fine-Tuning Transformers by IBM
1 - 15 of 15 Reviews for Generative AI Engineering and Fine-Tuning Transformers
By Roger K
•Jan 17, 2025
The labs all too often failed on environment issues - packages, version alignment, etc. This should be seamless in your controlled environment.
By Alexandre E
•Jan 2, 2025
The course is good but lacks depth on complex subjects.
By Jens H
•Jan 25, 2025
In general, I found the videos very hard to understand due to the mechanical reading of the text and the overly fast tempo; a large number of grammatical errors also detract from the overall readability.
By Sajjad
•Nov 17, 2024
The coding part in the labs provided in this course was very helpful and helped me to stabilize my learning.
By Mike S
•Oct 1, 2025
Interesting approaches to LLM fine-tuning
By Geetika P
•Mar 26, 2025
One of the best courses for sure!!
By Eman H
•Jun 17, 2025
Thank you
By Uwiragiye B
•Dec 4, 2024
Awesome
By Deepa d
•Dec 1, 2025
GOOD
By Aurelio M
•May 8, 2026
I recently completed this course on LLM Fine-Tuning and was impressed by the breadth of topics covered. It strikes a great balance between theoretical foundations and the practical tools currently dominating the industry.

What I liked:
- Modern Tech Stack: The course stays relevant by focusing on the Hugging Face Transformers library and PyTorch, which are the gold standard today.
- Comprehensive Roadmap: It covers everything from the "why" behind fine-tuning to advanced methodologies like Self-Supervised, Supervised (SFT), and RLHF (Reinforcement Learning from Human Feedback).
- Technical Variety: I appreciated the inclusion of diverse techniques. It covers Selective Fine-Tuning (dated for Transformers but great for context), Additive methods, and essential reparameterization techniques like LoRA and QLoRA, which are crucial in the current landscape.
- Niche Insights: A big plus for the section on Soft Prompting. It's a subtle topic that many instructors overlook, yet it's incredibly useful.

Areas for Improvement: The practical component was the only downside. The labs felt a bit passive; it felt more like "reading through code" than actively building. I found the practical videos to be a bit too rushed, making it difficult to fully grasp the implementation details.

Suggestion: The learning experience would be much more engaging with an interactive AI tutor guiding you step-by-step rather than just reading through notebooks.
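[Editor's note: the review above mentions reparameterization techniques like LoRA without detail. As a minimal illustrative sketch of the low-rank idea (not code from the course; all names and dimensions here are invented for illustration), a pretrained weight matrix W is kept frozen while only a low-rank delta B @ A is trained:]

```python
import numpy as np

# LoRA's core idea: instead of updating a full weight matrix W (d_out x d_in),
# train a low-rank delta B @ A with rank r << min(d_out, d_in).
d_out, d_in, r = 512, 512, 8

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weights
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                    # trainable, zero init (delta starts at 0)

def forward(x, alpha=16):
    # Effective weight is W + (alpha / r) * B @ A; W itself is never modified.
    return x @ (W + (alpha / r) * (B @ A)).T

full_params = W.size             # 512 * 512 = 262144
lora_params = A.size + B.size    # 8*512 + 512*8 = 8192
print(f"full fine-tune params: {full_params}, LoRA params: {lora_params}")
```

Only about 3% as many parameters are trained in this toy configuration, which is why the technique matters for fine-tuning large models on modest hardware; QLoRA pushes this further by keeping the frozen weights in a quantized format.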
By FRANCISCO J M S
•Sep 30, 2025
The automatic translation is not accurate for certain terms.
By Óscar Z R
•Jan 25, 2026
There are a lot of import errors in the Jupyter notebooks.
By john l
•Aug 17, 2025
The lab constantly fails due to slow pip installs and performance issues.
By Aitor C M d G
•Apr 27, 2026
More or less acceptable from the theoretical point of view, absolutely terrible from the practical point of view. Pedagogically, these GenAI courses from IBM are an absolute disaster...
By Conor J C
•Aug 10, 2025
Very difficult to understand course videos. Far too much technical jargon, which can easily confuse the listener. I found it very frustrating to complete this and was irritated by the over-emphasis on technical specifics. Do not recommend!!!!