...
Exercise: Randomized Grid Search to Tune XGBoost Hyperparameters
Learn how to perform a randomized grid search to explore a large hyperparameter space in XGBoost.
XGBoost for randomized grid search
In this exercise, we’ll use a randomized grid search to explore the space of six hyperparameters. A randomized grid search is a good option when you have many values of many hyperparameters you’d like to search over. If, for example, there were five values for each of the six hyperparameters we’d like to test, we’d need 5⁶ = 15,625 searches to cover every combination. Even if each model fit took only a second, we’d still need several hours to exhaustively search all possible combinations. A randomized grid search can achieve satisfactory results by searching only a random sample of all these combinations. Here, we’ll show how to do this using scikit-learn and XGBoost.
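To make this concrete, here is a minimal sketch of a randomized grid search using scikit-learn's RandomizedSearchCV together with XGBoost's XGBClassifier. The six hyperparameters, their ranges, the synthetic data, and the settings (n_iter=50, 4-fold cross-validation, ROC AUC scoring) are illustrative assumptions rather than the exact setup used later in this exercise:

```python
# Sketch of a randomized grid search over six XGBoost hyperparameters.
# The hyperparameters and ranges below are assumptions for illustration only.
from scipy.stats import uniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from xgboost import XGBClassifier

# Synthetic data so the sketch is self-contained
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Ranges to sample from: lists for discrete values, distributions for continuous ones
param_distributions = {
    'max_depth': [2, 3, 4, 5, 6],         # discrete: supply a list of values
    'learning_rate': uniform(0.01, 0.5),  # continuous: samples from [0.01, 0.51]
    'gamma': uniform(0, 5),
    'min_child_weight': [1, 5, 10, 20, 50],
    'subsample': uniform(0.5, 0.5),       # samples from [0.5, 1.0]
    'colsample_bytree': uniform(0.5, 0.5),
}

xgb_model = XGBClassifier(n_estimators=100, random_state=42)

# Fit only 50 randomly sampled combinations instead of all 5^6 = 15,625
search = RandomizedSearchCV(
    xgb_model,
    param_distributions=param_distributions,
    n_iter=50,
    scoring='roc_auc',
    cv=4,
    random_state=42,
)
search.fit(X, y)
print(search.best_params_)
print(search.best_score_)
```

Passing a scipy.stats distribution (rather than a list) lets RandomizedSearchCV draw a fresh value for that hyperparameter on every iteration, which is the usual choice for continuous hyperparameters.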
The first step in a randomized grid search is to specify, for each hyperparameter, the range of values you’d like to sample from. This can be done by supplying either a list of values or a distribution object to sample from. In the case of discrete hyperparameters such as max_depth
...