A Regularization Toolbox

Discover the benefits of regularization techniques.

Combat overfitting through regularization

Just like tuning hyperparameters, reducing overfitting is more art than science. Besides L1 and L2 regularization, we can use many other regularization methods. An overview of some of these techniques follows:
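As a refresher before the alternatives, here is a minimal sketch of the L1 and L2 penalty terms mentioned above (plain NumPy; the function names and the lambda value are illustrative, not from any particular library):

```python
import numpy as np

def l1_penalty(weights, lam):
    # L1 adds the sum of the absolute weights, scaled by lambda.
    # It pushes small weights all the way to zero (sparse solutions).
    return lam * np.sum(np.abs(weights))

def l2_penalty(weights, lam):
    # L2 adds the sum of the squared weights, scaled by lambda.
    # It shrinks large weights without zeroing them out.
    return lam * np.sum(weights ** 2)

w = np.array([0.5, -1.0, 2.0])
print(l1_penalty(w, 0.01))  # 0.035
print(l2_penalty(w, 0.01))  # 0.0525
```

Either penalty is added to the training loss, so the optimizer has to trade off fitting the data against keeping the weights small.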

Small network size: The most fundamental regularization technique is to make the overfitting network smaller. It is also the most effective. After all, overfitting happens because the network is too smart for the data it's learning from. Smaller networks are not as smart as big ones. We can try reducing the number of hidden nodes, or removing a few layers entirely. We'll use this approach in the chapter's closing exercise.
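To see how much capacity shrinking a network removes, we can count trainable parameters. This is a plain-Python sketch (no ML framework assumed); `param_count` and the layer sizes are illustrative, not from the chapter's code:

```python
def param_count(layer_sizes):
    """Number of trainable parameters in a fully connected network:
    for each pair of adjacent layers, a weight matrix plus biases."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# A hypothetical overfitting network: 784 inputs, two wide hidden layers.
big = param_count([784, 1200, 1200, 10])    # 2,395,210 parameters

# A slimmed-down version: fewer hidden nodes, one layer removed.
small = param_count([784, 100, 10])         # 79,510 parameters

print(big, small)
```

Cutting the hidden layers this way removes over 96% of the parameters, which sharply limits how much noise the network can memorize.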

Reduce input variables: Instead of simplifying the model, we can also reduce overfitting by simplifying the data. We can remove a few ...