Introduction to the Sequential Model-Based Optimization Method

Learn about the optimization of hyperparameters using the sequential model-based optimization method.

What is the sequential model-based optimization method?

Sequential model-based optimization (SMBO) is a powerful method for hyperparameter tuning in ML algorithms. It uses a probabilistic model, called a surrogate model, to predict the performance of different combinations of hyperparameters based on previously evaluated configurations. The surrogate model is then used to suggest the next combination of hyperparameters to evaluate, and the ML model is trained with that combination to check whether it produces better results.

Note: This process is called sequential because each new combination of hyperparameters is selected based on the results of the previous evaluation.
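The loop below is a minimal sketch of that idea, assuming scikit-learn is available. The objective function, the search range for the learning rate, and the evaluation budget are illustrative placeholders, not part of any specific library's API.

```python
# A minimal SMBO loop sketch (illustrative objective and search range).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(learning_rate):
    # Placeholder for training an ML model and returning its validation error.
    return (learning_rate - 0.1) ** 2

rng = np.random.default_rng(seed=0)
candidates = rng.uniform(1e-4, 1.0, size=1000)   # candidate hyperparameter values

# Start with a few randomly chosen evaluations.
X = list(rng.uniform(1e-4, 1.0, size=3))
y = [objective(x) for x in X]

for _ in range(10):                               # sequential iterations
    # Refit the surrogate on every configuration evaluated so far.
    surrogate = GaussianProcessRegressor().fit(np.array(X).reshape(-1, 1), y)
    mean, std = surrogate.predict(candidates.reshape(-1, 1), return_std=True)
    # Prefer candidates predicted to be good or still highly uncertain.
    next_x = candidates[np.argmin(mean - 1.96 * std)]
    X.append(next_x)
    y.append(objective(next_x))                   # evaluate the suggested configuration

best = X[int(np.argmin(y))]
print(f"Best learning rate found: {best:.4f}")
```

Each pass through the loop refits the surrogate on everything evaluated so far before suggesting the next configuration, which is exactly what makes the method sequential.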


The goal of this method is to cut down on the number of evaluations needed to find the combination of hyperparameters that produces the best results. For example, instead of running 20 evaluations to find the best combination of hyperparameters, this method might only require 10 evaluations or fewer. This can save a significant amount of time and computational resources.
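As a rough sketch of what a small evaluation budget looks like with an off-the-shelf library, the snippet below uses scikit-optimize's gp_minimize. The objective function and search space are made up for illustration.

```python
# A sketch using scikit-optimize with a deliberately small evaluation budget.
from skopt import gp_minimize
from skopt.space import Real, Integer

def objective(params):
    learning_rate, n_estimators = params
    # Placeholder: train a model and return the validation error to minimize.
    return (learning_rate - 0.1) ** 2 + abs(n_estimators - 200) / 1000

search_space = [
    Real(1e-4, 1.0, name="learning_rate"),
    Integer(50, 500, name="n_estimators"),
]

# Only 15 evaluations of the objective in total, 5 of them random warm-up points,
# instead of an exhaustive search over the grid of combinations.
result = gp_minimize(objective, search_space,
                     n_calls=15, n_initial_points=5, random_state=0)
print("Best hyperparameters:", result.x)
print("Best score:", result.fun)
```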

At a high level, Bayesian optimization methods are effective because they select the next combination of hyperparameters in an informed manner. By using past evaluations to update a probabilistic model at each iteration, the algorithm selects the combinations of hyperparameters that have the highest probability of producing good results.
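One common way to make that informed choice, not spelled out above, is an acquisition function such as expected improvement, which scores each candidate by how much it is expected to beat the best result seen so far. The sketch below assumes the surrogate model provides a predictive mean and standard deviation for each candidate, as in the earlier loop.

```python
# A sketch of the expected improvement acquisition function for a minimization problem.
import numpy as np
from scipy.stats import norm

def expected_improvement(mean, std, best_so_far, xi=0.01):
    # How much each candidate is expected to improve on the current best,
    # weighted by how likely that improvement is under the surrogate's prediction.
    improvement = best_so_far - mean - xi
    z = improvement / np.maximum(std, 1e-12)   # avoid division by zero
    return improvement * norm.cdf(z) + std * norm.pdf(z)
```

Candidates with either a promising predicted mean or high uncertainty score highest, so the next evaluation balances exploiting regions already known to be good with exploring regions the surrogate knows little about.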
