SMBO Using Histogram-Based Gradient Boosting

Learn how to apply the sequential model-based optimization (SMBO) method to find the best hyperparameters for a histogram-based gradient boosting model.

In this example, we’ll apply SMBO to the histogram-based gradient boosting algorithm to find the combination of hyperparameter values that produces better results than the model trained with the default hyperparameter values.

What will we learn?

In this lesson, we’ll learn how to do the following things in a Jupyter Notebook:

  • Create and train a histogram-based gradient boosting model.

  • Measure the performance of the ML model.

  • Implement the SMBO method.

  • Identify the combination of hyperparameters that provides the best results.

Import important packages

First, we import the important Python packages that will do the following tasks:

  • Create and train a histogram-based gradient boosting model using scikit-learn.

  • Check ML model performance using the F1 score from scikit-learn.

  • Implement the SMBO method using scikit-optimize.

  • Identify the combination of hyperparameters that provides the best results using attributes from scikit-optimize.
