Tune Hyperparameters
Explore hyperparameter tuning - picking the right number of epochs and hidden nodes.
We built a neural network and prepared its input data, but that was only the beginning. The same algorithm running on the same data can yield wildly different results, depending on hyperparameters such as the learning rate and the number of hidden nodes.
ML development is largely about finding good values for those hyperparameters. Unlike in traditional software development, there are no hard-and-fast rules that tell us how to set them.
We should not change multiple hyperparameters at the same time; otherwise, we would not know which change affected the network's accuracy. Instead, let's tune those hyperparameters one at a time, starting with the easiest one.
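The one-at-a-time approach can be sketched as a simple sweep: hold every hyperparameter at its baseline value and vary only one of them. The names below (`train_and_evaluate`, the baseline values) are illustrative stand-ins, not the book's actual code.

```python
# Hedged sketch: tune one hyperparameter at a time while the others
# stay fixed at their baseline values.

def train_and_evaluate(epochs, hidden_nodes, lr):
    # Stand-in for training the network and measuring its accuracy.
    # This toy score peaks at lr=0.01 and grows slowly with network size.
    return 0.9 - abs(lr - 0.01) + hidden_nodes / 10000

baseline = {"epochs": 100, "hidden_nodes": 200, "lr": 0.01}

def sweep(name, values):
    """Vary one hyperparameter, keeping the rest at their baseline."""
    results = {}
    for v in values:
        params = dict(baseline, **{name: v})
        results[v] = train_and_evaluate(**params)
    return results

lr_results = sweep("lr", [0.001, 0.01, 0.1])
best_lr = max(lr_results, key=lr_results.get)
```

Because only one value changes per run, any difference in accuracy can be attributed to that hyperparameter alone.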
Picking the number of epochs
The easiest hyperparameter to tune is arguably the number of epochs. We already know that the longer we train a system, the more accurate it becomes, up to a point.
After a ...
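One way to find that point is to watch validation accuracy after each epoch and stop once another epoch no longer pays for itself. The sketch below is a hedged illustration: `validation_accuracy` is a stand-in for evaluating the real network, here a toy curve that climbs quickly and then flattens out.

```python
# Hedged sketch: pick the number of epochs by watching validation
# accuracy. validation_accuracy() is a stand-in, not the book's code.

def validation_accuracy(epoch):
    # Toy stand-in for evaluating the network after `epoch` epochs:
    # accuracy improves quickly, then plateaus.
    return 0.95 - 0.5 * (0.7 ** epoch)

def pick_epochs(max_epochs=50, min_gain=0.001):
    """Stop as soon as an extra epoch improves accuracy by less
    than min_gain; return the last epoch that still helped."""
    best = 0.0
    for epoch in range(1, max_epochs + 1):
        acc = validation_accuracy(epoch)
        if acc - best < min_gain:
            return epoch - 1, best
        best = acc
    return max_epochs, best

epochs, acc = pick_epochs()
```

This is a simplified form of early stopping: rather than guessing a fixed epoch count, we let the validation curve tell us when training has stopped helping.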