Prepare Data: Random Initial Weights
Learn how to initialize the weights randomly and how to address the problems that arise in initial weight selection.
Random initialization of weights
The same argument applies here as it did for the inputs and outputs. We should avoid large initial weights because they feed large signals into the activation function, causing the saturation we just talked about and reducing the network's ability to learn better weights.
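To make the saturation problem concrete, here is a minimal sketch (not part of the original lesson) comparing the sigmoid's slope for a small and a large input signal. The slope is what drives the weight updates, and it all but vanishes when the signal is large:

```python
import numpy as np

def sigmoid(x):
    """Logistic activation function."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_slope(x):
    """Gradient of the sigmoid, which scales the weight updates."""
    s = sigmoid(x)
    return s * (1.0 - s)

# A modest signal keeps the sigmoid in its responsive region ...
print(sigmoid_slope(0.5))    # roughly 0.235

# ... while a large signal, such as one produced by large initial weights,
# saturates the sigmoid and leaves almost no gradient to learn from.
print(sigmoid_slope(20.0))   # roughly 0.000000002
```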
We could choose initial weights randomly and uniformly from a range of -1.0 to +1.0. That would be a much better idea than using a very large range, say -1000 to +1000.
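As a sketch of what this could look like in code, assuming NumPy and hypothetical layer sizes input_nodes and hidden_nodes, the weights linking one layer to the next might be drawn uniformly from the range -1.0 to +1.0:

```python
import numpy as np

# Hypothetical layer sizes used only for illustration.
input_nodes = 3
hidden_nodes = 3

# Draw each weight uniformly at random from the small range -1.0 to +1.0,
# rather than from an unreasonably large range like -1000 to +1000.
weights_input_hidden = np.random.uniform(-1.0, 1.0, (hidden_nodes, input_nodes))

print(weights_input_hidden)
```

Keeping the range small means the weighted sums entering each node start out modest, so the activation function stays in its responsive region and learning can proceed.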