...


Introduction to Model Optimization


This lesson introduces the idea of model optimization.

Why is optimization hard?

The optimal value of one weight depends on the values of the other weights, and many weights are optimized at once. The slope of the loss with respect to each weight (the gradient) tells us which weights to increase and which to decrease. However, the resulting updates may not improve the model meaningfully, and when that happens we need to change the hyperparameters.
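The interaction between weights can be sketched with a tiny gradient-descent loop. The loss function and learning rate below are illustrative assumptions, not from the lesson; the point is that each weight's slope depends on the other weight, yet both are updated together.

```python
import numpy as np

# Toy loss whose two weights interact: L(w) = (w0 + w1 - 1)^2 + (w0 - w1)^2.
# Its minimum is at w0 = w1 = 0.5.
def grad(w):
    # Slope of the loss with respect to each weight: note that the
    # gradient for w0 depends on w1, and vice versa.
    return np.array([
        2 * (w[0] + w[1] - 1) + 2 * (w[0] - w[1]),
        2 * (w[0] + w[1] - 1) - 2 * (w[0] - w[1]),
    ])

w = np.array([2.0, -1.0])
lr = 0.1  # learning rate: a hyperparameter we may need to tune
for _ in range(100):
    w -= lr * grad(w)  # decrease each weight along its slope

print(np.round(w, 3))  # both weights approach 0.5
```

If the learning rate were set too high (say, 0.6), the same loop would overshoot and the updates would stop improving the loss, which is exactly the situation that calls for hyperparameter tuning.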

Hyperparameters: Optimization methods

There are various ways to tune hyperparameters by trial and error, but there is no consensus on what works best.

Hyperparameters related to a neural network structure

1. Model complexity

This is determined by the number of hidden layers and the number of nodes per layer. Increasing model complexity usually improves accuracy, up to the point where the model begins to overfit.
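One way to see how layers and nodes per layer drive complexity is to count trainable parameters. The helper below is a hypothetical illustration (not from the lesson), assuming fully connected layers with biases.

```python
# Count trainable parameters in a fully connected network:
# each layer contributes (inputs * outputs) weights plus one bias per output.
def param_count(n_inputs, hidden_sizes, n_outputs):
    total = 0
    prev = n_inputs
    for size in hidden_sizes + [n_outputs]:
        total += prev * size + size  # weights + biases
        prev = size
    return total

print(param_count(10, [50], 1))      # one hidden layer of 50 nodes -> 601
print(param_count(10, [50, 50], 1))  # two hidden layers -> 3151
```

Doubling the depth here more than quintuples the parameter count, which is why deeper or wider networks can fit more complex functions.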

πŸ“ Learn more about model complexity.

...

Applying dropout after the 2nd hidden layer
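The lesson's own code for this step is not included in this excerpt; the following is a minimal numpy sketch of the idea, using inverted dropout and assumed layer sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training=True):
    # Inverted dropout: zero a fraction `rate` of units and scale the
    # survivors so the expected activation is unchanged at test time.
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# Forward pass through two hidden layers (sizes are assumptions).
x = rng.normal(size=(4, 10))
w1 = rng.normal(size=(10, 8))
w2 = rng.normal(size=(8, 8))
h1 = np.maximum(0, x @ w1)    # 1st hidden layer (ReLU)
h2 = np.maximum(0, h1 @ w2)   # 2nd hidden layer (ReLU)
h2 = dropout(h2, rate=0.5)    # dropout applied after the 2nd hidden layer
```

At evaluation time the same call with `training=False` returns the activations unchanged, so no rescaling is needed when making predictions.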