Prepare Data: Random Initial Weights
Understand the importance of random initial weights in neural networks and how to set them properly to prevent saturation and bias. Learn why zero or large weights hinder learning and discover practical rules for choosing initial weights based on network links for better training outcomes.
Random initialization of weights
The same argument applies here as with the inputs and outputs. We should avoid large initial weights because they feed large signals into the activation function, leading to the saturation we just talked about and a reduced ability to learn better weights.
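To see the effect concretely, here is a minimal sketch (assuming a sigmoid activation and NumPy, as used elsewhere in this course) comparing the sigmoid's gradient for a moderate signal and a large one:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_gradient(x):
    # derivative of the sigmoid: s(x) * (1 - s(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# a moderate signal keeps the gradient usable...
print(sigmoid_gradient(0.5))   # ~0.235

# ...while a large signal (e.g. from large weights) flattens it almost to zero
print(sigmoid_gradient(20.0))  # ~2e-9: the saturated regime, learning stalls
```

With the gradient near zero, weight updates become vanishingly small, which is why large initial weights make training so slow.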
We could choose initial weights randomly and uniformly from a range of -1.0 to +1.0. That would be a much better idea than using a very large range, say -1000 to +1000.
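As an illustration, here is a short sketch (assuming a NumPy-based network; the layer sizes and the array shape `(hidden_nodes, input_nodes)` are just examples) of sampling initial weights uniformly from -1.0 to +1.0, alongside the common rule of scaling the spread by the number of incoming links into a node:

```python
import numpy as np

input_nodes, hidden_nodes = 3, 4

# simple uniform initialization in [-1.0, +1.0): small enough to avoid
# feeding large signals into the activation function
w_uniform = np.random.uniform(-1.0, 1.0, (hidden_nodes, input_nodes))

# a common refinement: sample from a normal distribution centered on zero
# with standard deviation 1/sqrt(number of incoming links), so nodes with
# more incoming links start with proportionally smaller weights
w_scaled = np.random.normal(0.0, 1.0 / np.sqrt(input_nodes),
                            (hidden_nodes, input_nodes))

print(w_uniform)
print(w_scaled)
```

The scaled version reflects the intuition from the lesson summary: the more links feed into a node, the more signals get summed there, so each individual weight should start smaller to keep the combined signal out of the saturated region.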