Amazon SageMaker is a fully managed machine learning service that allows developers to train, validate, and deploy models quickly. It also provides native support for popular frameworks such as TensorFlow, PyTorch, and PySpark. Amazon SageMaker offers built-in algorithms and pretrained models for use cases such as text summarization, text generation, image classification, and feature engineering.
In Amazon SageMaker, automatic hyperparameter tuning simplifies model optimization by exploring different hyperparameter combinations, often using techniques like Bayesian optimization. This helps find the parameters that yield better accuracy, faster training, and reduced overfitting, making model development easier and more efficient.
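As a rough illustration of what this configuration looks like, the sketch below uses the SageMaker Python SDK to set up a Bayesian tuning job around one of SageMaker's built-in algorithms (XGBoost is used here purely as an example). The role ARN, bucket name, hyperparameter ranges, and objective metric are placeholder assumptions, not values from the lab itself.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter, IntegerParameter

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/MySageMakerExecutionRole"  # hypothetical IAM role
bucket = "my-hyperparameter-tuning-bucket"                        # hypothetical S3 bucket

# Retrieve the container image for SageMaker's built-in XGBoost algorithm.
image_uri = sagemaker.image_uris.retrieve(
    "xgboost", region=session.boto_region_name, version="1.7-1"
)

# Base estimator: defines the algorithm, compute resources, and output location.
estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path=f"s3://{bucket}/output",
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=100)

# Ranges the tuner is allowed to explore (example values).
hyperparameter_ranges = {
    "eta": ContinuousParameter(0.01, 0.3),
    "max_depth": IntegerParameter(3, 10),
}

# Bayesian search over the ranges above, maximizing validation AUC.
tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:auc",
    hyperparameter_ranges=hyperparameter_ranges,
    strategy="Bayesian",
    objective_type="Maximize",
    max_jobs=10,
    max_parallel_jobs=2,
)
```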
In this Cloud Lab, you’ll create an IAM role that the SageMaker notebook uses to perform specific actions, an S3 bucket to store training and output data, and the notebook instance itself. You’ll install the required libraries and configure the hyperparameter tuning and training job settings. Finally, you’ll launch the hyperparameter tuning job and monitor the resulting training jobs to identify the best-performing one.
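Continuing the sketch above, launching the tuning job and monitoring its training jobs might look roughly like the following. The S3 prefixes for the training and validation data are assumed placeholders; the lab's actual data layout may differ.

```python
from sagemaker.inputs import TrainingInput

# Launch the tuning job; SageMaker spawns one training job per hyperparameter combination.
tuner.fit({
    "train": TrainingInput(f"s3://{bucket}/train", content_type="text/csv"),
    "validation": TrainingInput(f"s3://{bucket}/validation", content_type="text/csv"),
})

# Inspect all training jobs spawned by the tuning job, sorted by objective value.
analytics = sagemaker.HyperparameterTuningJobAnalytics(tuner.latest_tuning_job.name)
results = analytics.dataframe()
print(results.sort_values("FinalObjectiveValue", ascending=False).head())

# Name of the best-performing training job found by the tuner.
print("Best training job:", tuner.best_training_job())
```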
After completing this Cloud Lab, you’ll understand how to configure hyperparameter tuning jobs for SageMaker’s built-in algorithms.