Metrics

Learn how to apply the Jensen-Shannon divergence (JSD) and the Kolmogorov-Smirnov (KS) two-sample test to compare real and GAN-generated samples.

We are going to use the Jensen-Shannon divergence and the Kolmogorov-Smirnov two-sample test to compare real samples with samples generated by GANs. For the Kolmogorov-Smirnov test, we will use the `ks_2samp` implementation from `scipy.stats`.
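As a minimal sketch of this workflow (the two sample arrays below are synthetic stand-ins for real data and GAN outputs, drawn here from slightly different normal distributions), `ks_2samp` can be applied directly to two one-dimensional samples:

```python
import numpy as np
from scipy.stats import ks_2samp

# Illustrative stand-ins: "real" samples from a standard normal,
# "generated" samples from a slightly shifted normal.
rng = np.random.default_rng(seed=0)
real_samples = rng.normal(loc=0.0, scale=1.0, size=1000)
generated_samples = rng.normal(loc=0.3, scale=1.0, size=1000)

# ks_2samp returns the KS statistic (the maximum distance between the
# two empirical CDFs) and a p-value for the null hypothesis that both
# samples come from the same distribution.
statistic, p_value = ks_2samp(real_samples, generated_samples)
print(f"KS statistic: {statistic:.4f}, p-value: {p_value:.4f}")
```

A small KS statistic (and a large p-value) indicates that the generated samples are distributionally close to the real ones.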

Jensen-Shannon divergence (JSD)

As we described in the chapter, “Introduction to Generative Models,” the Jensen-Shannon divergence (JSD) is a symmetric and smoothed version of the Kullback-Leibler (KL) divergence:
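$$
\mathrm{JSD}(P \,\|\, Q) = \frac{1}{2} D_{\mathrm{KL}}(P \,\|\, M) + \frac{1}{2} D_{\mathrm{KL}}(Q \,\|\, M), \qquad M = \frac{1}{2}(P + Q)
$$

where $M$ is the mixture (average) of the two distributions. Because the JSD is symmetric and bounded (between $0$ and $1$ when base-2 logarithms are used), it is well suited for comparing the distributions of real and generated samples.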

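For a self-contained sketch of computing the JSD (the probability vectors `p` and `q` below are invented for illustration), one can use `scipy.stats.entropy` for the two KL terms and cross-check the result against `scipy.spatial.distance.jensenshannon`, which returns the square root of the JSD:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

def jsd(p, q, base=2.0):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()  # normalize to valid probabilities
    m = 0.5 * (p + q)                # mixture distribution
    # JSD is the average KL divergence of each distribution to the mixture.
    return 0.5 * entropy(p, m, base=base) + 0.5 * entropy(q, m, base=base)

# Illustrative distributions (not from the course material).
p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]
print(jsd(p, q))                          # direct computation
print(jensenshannon(p, q, base=2) ** 2)   # squared JS distance matches the JSD
```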