Activation Functions in JAX
Learn about various activation functions used to train models in JAX.
Overview
Activation functions are applied in neural networks to introduce non-linearity and shape the network's output into the desired form. Many activation functions also squash the output into a specific range. For instance, when solving a binary classification problem, the outcome should be a number between 0 and 1, which indicates the probability that an item belongs to the positive class; the complementary probability belongs to the other class.
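As a minimal sketch of this idea, the sigmoid activation in `jax.nn` maps any real-valued logit into the (0, 1) range, which is why it is a common choice for binary classification outputs:

```python
import jax.numpy as jnp
from jax import nn

# Raw, unbounded outputs (logits) from a hypothetical final layer
logits = jnp.array([-2.0, 0.0, 3.0])

# Sigmoid squashes each logit into the open interval (0, 1)
probs = nn.sigmoid(logits)
```

Here a logit of 0.0 maps to exactly 0.5, while large positive and negative logits approach 1 and 0 respectively, so the outputs can be read as class probabilities.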