Custom Activation Functions

Learn how to create and implement custom activation functions.

Custom activation

Activation function design is an active research area in deep learning and is still at a nascent stage, so it is common for researchers to experiment with novel activation ideas. To support this, this section shows how to implement a custom activation. An activation can be defined as a conventional Python function; for such a definition, its gradient must also be defined and registered with TensorFlow.
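As a minimal sketch of this first approach, the function below (the name hard_step and the surrogate gradient are illustrative, not from any library) defines a step activation as a plain Python function and registers a hand-written gradient with TensorFlow through the tf.custom_gradient decorator:

```python
import tensorflow as tf

@tf.custom_gradient
def hard_step(x):
    # Forward pass: a 0/1 step whose true derivative is zero
    # almost everywhere, so autodiff alone would be useless here.
    y = tf.cast(x > 0.0, x.dtype)

    def grad(dy):
        # Hand-defined surrogate gradient, registered with TensorFlow
        # by the decorator: let gradients flow only near the threshold.
        return dy * tf.cast(tf.abs(x) < 1.0, x.dtype)

    return y, grad
```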

However, an explicit gradient definition is usually not required when the activation is built from TensorFlow functions, because TensorFlow already has derivatives predefined for its built-in operations. This approach is therefore simpler, and it covers most practical activation definitions.
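For instance, the sketch below (the name leaky and the slope 0.2 are illustrative choices) builds a leaky-ReLU-style activation purely from TensorFlow ops; its gradient comes from automatic differentiation, with nothing to register:

```python
import tensorflow as tf

def leaky(x):
    # Composed only of TensorFlow ops, so TensorFlow derives
    # the gradient automatically.
    return tf.maximum(0.2 * x, x)

# The gradient is available immediately through autodiff:
x = tf.linspace(-2.0, 2.0, 5)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = leaky(x)
print(tape.gradient(y, x))  # 0.2 for negative inputs, 1.0 for positive

# The function is usable anywhere Keras accepts a callable activation:
layer = tf.keras.layers.Dense(64, activation=leaky)
```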

Thresholded exponential linear unit

Here, a custom activation, the thresholded exponential linear unit (telu), is defined by the equation below.

$$g(x) = \begin{cases} x, & x > \tau \\ \alpha \left( e^{x} - 1 \right), & x \le \tau \end{cases}$$
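Because this piecewise form can be expressed entirely with TensorFlow operations, the simpler approach applies and no explicit gradient is needed. The sketch below assumes the definition above, with fixed hyperparameters for the threshold and scale (the names TAU, ALPHA, and the chosen values are illustrative):

```python
import tensorflow as tf

TAU = 1.0    # threshold tau (illustrative value)
ALPHA = 1.0  # scale alpha (illustrative value)

def telu(x):
    # Identity above the threshold, exponential-linear at or below it.
    return tf.where(x > TAU, x, ALPHA * (tf.math.exp(x) - 1.0))

# Plugs into Keras like any built-in activation:
layer = tf.keras.layers.Dense(32, activation=telu)
```

Since telu is built from tf.where and tf.math.exp, TensorFlow differentiates it automatically, with no gradient registration required.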