Dec 5, 2020
I really liked the exposure to building various loss functions for paired and unpaired GANs, the introduction to other applications, and the many techniques for improving the quality of the networks!
Jan 23, 2021
GANs are awesome and solve many real-world problems; the unsupervised aspects are especially cool. The instructors are great and to the point on both theoretical and practical aspects. Thank you!
I would have preferred that the assignments spent more time on the training loop and on what's going on with the cost function.
One of the interesting things about GANs is that the cost function is different for different parts of the network. This is really important to the workings of a GAN, but we never touched the training loop after the first assignment in Course 1. I feel we should have spent more time nailing that training loop down.
Also, I don't think any of the classes mentioned the importance of the fact that the cost function is learned rather than explicit. That's huge! You can do that for any network, not just generative networks, and it seems applicable to all kinds of less-supervised ML. It seems a waste that they didn't draw more attention to it.
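The point this reviewer raises about per-network cost functions can be sketched in a minimal toy example: a 1-D "GAN" where the discriminator and the generator are updated with different losses inside the same training loop. This is purely illustrative (not course code); all names, hyperparameters, and the toy data distribution are assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(t):
    # Numerically stable logistic function.
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

# Real data: samples from N(4, 1). The generator must learn to match it.
def sample_real():
    return random.gauss(4.0, 1.0)

# Generator: x = a*z + b with z ~ N(0, 1); starts far from the data.
a, b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w*x + c).
w, c = 0.1, 0.0

lr = 0.05
for step in range(2000):
    # --- Discriminator loss: -log D(real) - log(1 - D(fake)) ---
    x_real = sample_real()
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b

    p_real = sigmoid(w * x_real + c)
    p_fake = sigmoid(w * x_fake + c)
    # For sigmoid + cross-entropy, d(loss)/d(logit) = p - target
    # (target 1 for real samples, 0 for fake samples).
    g_real = p_real - 1.0
    g_fake = p_fake
    w -= lr * (g_real * x_real + g_fake * x_fake)
    c -= lr * (g_real + g_fake)

    # --- Generator loss: a *different* objective, -log D(fake) ---
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b
    p_fake = sigmoid(w * x_fake + c)
    g_logit = p_fake - 1.0   # target is 1: fool the discriminator
    g_x = g_logit * w        # chain rule through the discriminator input
    a -= lr * g_x * z
    b -= lr * g_x

# The generator's mean output (b) should have drifted from 0 toward
# the real data mean of 4 -- driven only by the learned discriminator,
# never by an explicit distance to the data.
print(f"generator mean after training: {b:.2f}")
```

Note how the generator is never shown the real data directly: its gradient flows entirely through the discriminator, which is exactly the "learned cost function" idea the review highlights.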
By Ernest W•
Jan 8, 2022
Overall it was good, but in my opinion the final assignments were very confusing: there are so many things going on there that I still don't understand. I still think there is a lot to supplement, with hours of exploration and reading many research papers, before it meets my expectations and I can create my own generative art. Maybe more similar assignments with more detailed explanations (and more tasks) would help me understand more, even at the cost of a longer specialization.
By Harold S•
Mar 6, 2021
It was good. I think it covered a lot of material and gets you quickly to a point where you can start attacking some real problems with this technology. However, I don't fully like some of the exercises, which get you stuck on silly things.
By Stanislav K•
Jan 31, 2021
The course material is of very good quality. On the other hand, most of the coding exercises are limited to implementing the loss functions; they don't teach students how to design GAN architectures themselves.
By Rishab K•
Jun 22, 2021
Very good course. The assignments could be longer than they currently are, and the course should also include a project at the end to implement a GAN.
By Aditya S•
Oct 6, 2021
Great course by a great instructor and a great team behind it! Learned sooooo damn much. Can't wait to go out and apply some of this stuff!
By ARTEM B•
Mar 8, 2021
Not a very well structured course. I think there is some room for improvement.
By Ibrahim G•
Nov 3, 2020
The assignments can go more in depth, but the content was great!
By Keebeom Y•
Nov 16, 2021
The English subtitles have many typos, and in some parts they are out of sync with the video. The lecturer speaks too fast. But the content was very good, especially the coding projects.
By Mark P•
Nov 15, 2020
The programming assignments are too easy. Although the linked papers were useful, I felt the optional notebooks should have been compulsory, or we should have had to do more ourselves.
By Sameer R•
Oct 22, 2021
Too much repetition. More technical aspects could have been covered, given that this is the third course.
By Liang Y•
Mar 29, 2021
The instructor did a great job on the scripts and slides. However, instead of teaching you GANs, she reads the scripts at a very fast pace. That would be fine for a report or an interview, where the audience is professors or specialists already very familiar with GANs, but I think most of the audience here knows little about GANs. I prefer Andrew Ng's teaching style, which guides the audience and gives them time to think and learn.
By Farhad D•
Nov 15, 2020
The exercises were quite bad: they are very easy, and a little bit ambiguous. It seems the creators got tired at the end and did a rushed job. Still, I learned a lot and I am thankful, but it could have been much better!