Improving GAN Training
Learn how we can improve GAN training and why these changes lead to better model performance.
Here we’ll try to fix the mode collapse and image clarity problems by improving the training quality of our GAN. We’ve already seen some ideas for doing this when we developed refinements to our MNIST classifier.
Changing the loss function to BCELoss
The first refinement is to use the binary cross entropy loss BCELoss() instead of the mean squared error MSELoss() for the loss function.
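In practice, the change amounts to swapping a single line in the network's constructor. The sketch below shows the idea; the class name, layer sizes, and optimiser are illustrative assumptions rather than the exact code from the earlier lessons:

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # illustrative layer sizes for flattened 28x28 MNIST images;
        # your own network from the earlier lessons may differ
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(0.02),
            nn.Linear(200, 1),
            nn.Sigmoid()          # output stays in [0, 1], as BCELoss() requires
        )

        # previously: self.loss_function = nn.MSELoss()
        self.loss_function = nn.BCELoss()

        self.optimiser = torch.optim.SGD(self.parameters(), lr=0.01)

    def forward(self, inputs):
        return self.model(inputs)

    def train_step(self, inputs, targets):
        outputs = self.forward(inputs)
        loss = self.loss_function(outputs, targets)
        self.optimiser.zero_grad()
        loss.backward()
        self.optimiser.step()
```

Note that BCELoss() expects the network's output to be a probability between 0 and 1, which is why the final Sigmoid() layer matters here.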
📝 We’ve already talked about how binary cross entropy loss makes more sense when our network is performing a classification task. It also punishes incorrect answers, and rewards correct ones, more strongly than MSELoss().
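We can see this stronger punishment with a quick numerical check. The snippet below is only an illustration, separate from the GAN code itself: it compares the two losses for a single prediction against a target of 1.0 (a "real" label).

```python
import torch
import torch.nn as nn

bce = nn.BCELoss()
mse = nn.MSELoss()

target = torch.tensor([1.0])  # the correct answer is "real"

# As the prediction becomes more confidently wrong, BCE grows much
# faster than MSE, giving a stronger push back towards the right answer.
for p in [0.9, 0.5, 0.1, 0.01]:
    pred = torch.tensor([p])
    print(f"prediction {p:4.2f}:  MSE {mse(pred, target).item():.3f}   BCE {bce(pred, target).item():.3f}")
```

For a prediction of 0.01 against a target of 1.0, the mean squared error is about 0.98, while the binary cross entropy is about 4.6, so the confidently wrong answer is penalized far more heavily.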