Weights: The Heart of the Network
Learn how to define link weights in a neural network.
Weights
Let’s create the network of nodes and links. The most important part of the network is the link weights: they’re used to calculate the signal fed forward and the error propagated backward, and it’s the link weights themselves that are refined in an attempt to improve the network.
The weights can be concisely expressed as a matrix. So, we can create:
- A matrix for the weights for links between the input and hidden layers, $W_{input\_hidden}$, of size (`hidden_nodes` × `input_nodes`).
- A matrix for the weights for links between the hidden and output layers, $W_{hidden\_output}$, of size (`output_nodes` × `hidden_nodes`).
Recall the earlier convention to see why the first matrix is of size (`hidden_nodes` × `input_nodes`) and not the other way around.
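As a quick dimensional sanity check, here's a minimal sketch of that convention; the layer sizes and the names `w_input_hidden` and `inputs` are hypothetical placeholders, not part of the lesson's network code:

```python
import numpy

# Hypothetical layer sizes, chosen only for illustration.
input_nodes = 3
hidden_nodes = 5

# W_input_hidden must be (hidden_nodes x input_nodes) so that the
# matrix-vector product W . I produces one combined signal per hidden node.
w_input_hidden = numpy.zeros((hidden_nodes, input_nodes))
inputs = numpy.ones((input_nodes, 1))

hidden_inputs = numpy.dot(w_input_hidden, inputs)
print(hidden_inputs.shape)   # (5, 1) -- one value for each hidden node
```

If the matrix were set up the other way around, the product $W \cdot I$ wouldn't line up dimensionally with the input column vector.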
Remember that the initial values of the link weights should be small and random. The numpy function `numpy.random.rand(rows, columns)` generates an array of values selected randomly between 0 and 1, where the size is (`rows` × `columns`).
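Here's a minimal sketch of how the two weight matrices might be initialised this way; the layer sizes and the names `wih` and `who` are assumptions made for illustration:

```python
import numpy

# Hypothetical layer sizes, chosen only for illustration.
input_nodes = 3
hidden_nodes = 3
output_nodes = 3

# Link weight matrices, initialised with small random values in [0, 1).
# wih: weights between input and hidden layers, shape (hidden_nodes x input_nodes)
# who: weights between hidden and output layers, shape (output_nodes x hidden_nodes)
wih = numpy.random.rand(hidden_nodes, input_nodes)
who = numpy.random.rand(output_nodes, hidden_nodes)

print(wih)
print(who)
```

A common refinement is to subtract 0.5 from these values so the initial weights can also be negative, but the basic idea is the same: small, random starting weights for every link.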