Divergence Measures
Learn how we can calculate divergence measures in JAX.
Here we’ll focus on an important aspect of statistical learning: measuring how different two probability distributions are. This lesson is advanced and can be skipped if needed.
Introduction
We can easily compare two scalar values by their difference or ratio. Similarly, we can compare two vectors by taking the L1 or L2 norm of their difference.
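For instance, here's a minimal JAX sketch of this baseline comparison (the vectors x and y are made up for illustration):

```python
import jax.numpy as jnp

x = jnp.array([1.0, 2.0, 3.0])
y = jnp.array([1.5, 1.5, 3.0])

# Compare two vectors through the norm of their difference.
l1 = jnp.linalg.norm(x - y, ord=1)  # sum of absolute differences -> 1.0
l2 = jnp.linalg.norm(x - y)         # Euclidean (L2) distance -> ~0.7071
print(l1, l2)
```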
Extending this notion of divergence to a pair of probability distributions, however, requires better-suited measures. There are several real-world applications where we need to find the similarity (or difference) between two distributions: for example, sequence comparison in bioinformatics, text comparison in Natural Language Processing (NLP), evaluating images produced by Generative Adversarial Networks (GANs), and so on.
Entropy
Let’s begin with the most fundamental measure. The entropy of a probability vector $p$ (whose entries are nonnegative and sum to one) is defined as:

$$H(p) = -\sum_i p_i \log p_i$$
Usually, the base of the log is taken as either $2$ (bits) or $e$ (nats).
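As an illustration, here's a minimal sketch of how this could be computed in JAX. The function name entropy, the use of the natural log, and the zero-guarding via jnp.where are choices made for this example, not part of the lesson's reference code:

```python
import jax.numpy as jnp

def entropy(p):
    # H(p) = -sum_i p_i * log(p_i), here using the natural log (nats).
    # jnp.where guards the zero entries so that 0 * log(0) contributes 0
    # instead of producing NaN.
    safe_p = jnp.where(p > 0, p, 1.0)  # log(1) = 0 for the masked entries
    terms = jnp.where(p > 0, p * jnp.log(safe_p), 0.0)
    return -jnp.sum(terms)

p = jnp.array([0.5, 0.25, 0.25])
print(entropy(p))  # ~1.0397 nats
```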
Relative entropy
The relative entropy (also known as the Kullback–Leibler divergence) between two vectors $p$ and $q$ is defined as:

$$D_{KL}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}$$
Since the equation involves an element-wise ratio as well as a logarithm, we must make sure to check for zeros in both $p$ and $q$ before evaluating it.
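One way to handle this in JAX is sketched below. The function name kl_divergence and the jnp.where-based zero guards are assumptions for illustration, and the sketch assumes $q$ is nonzero wherever $p$ is nonzero:

```python
import jax.numpy as jnp

def kl_divergence(p, q):
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i).
    # Terms with p_i == 0 contribute 0 by convention; the jnp.where guards
    # keep log(0) and division by zero out of the computation. This sketch
    # assumes q_i > 0 wherever p_i > 0 (otherwise the divergence is infinite).
    log_ratio = jnp.log(jnp.where(p > 0, p, 1.0)) - jnp.log(jnp.where(q > 0, q, 1.0))
    return jnp.sum(jnp.where(p > 0, p * log_ratio, 0.0))

p = jnp.array([0.5, 0.25, 0.25])
q = jnp.array([0.25, 0.25, 0.5])
print(kl_divergence(p, q))  # ~0.1733 nats
```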