Cross Entropy

Learn about cross-entropy loss in detail and play with its code.

What is the cross-entropy loss?

So far, we have used the log loss formula for our binary classifiers. We even used the log loss when we bundled ten binary classifiers into a multiclass classifier (in The Final Challenge). In that case, we added together the losses of the ten classifiers to get a total loss.

While the log loss has served us well so far, it's time to switch to a simpler formula, one that's specific to multiclass classifiers. It's called the cross-entropy loss: it measures the distance between the classifier's predictions and the labels (the lower the loss, the better the classifier). It looks like this:

...
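Since the exact expression is elided above, here is a minimal NumPy sketch assuming the usual multiclass cross-entropy, L = -(1/m) · Σᵢ Σⱼ yᵢⱼ log(ŷᵢⱼ), where Y holds one-hot labels and Ŷ holds the classifier's predicted probabilities. The function name cross_entropy_loss is illustrative, not a name from the course's codebase.

```python
import numpy as np

def cross_entropy_loss(Y, Y_hat):
    """Average cross-entropy between one-hot labels Y and
    predicted probabilities Y_hat, both of shape (m, classes).
    (A sketch of the standard formula, not the course's exact code.)"""
    m = Y.shape[0]
    # The element-wise product with the one-hot labels keeps only the
    # log-probability that each example assigns to its true class.
    return -np.sum(Y * np.log(Y_hat)) / m

# Tiny usage example: 2 examples, 3 classes.
Y = np.array([[1, 0, 0],
              [0, 0, 1]])
Y_hat = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.2, 0.7]])
print(cross_entropy_loss(Y, Y_hat))  # ~0.357, i.e. -log(0.7)
```

Note how the one-hot labels zero out every term except the predicted probability of the correct class, so a confident correct prediction (ŷ close to 1) drives the loss toward zero, while a confident wrong one blows it up.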
