There are two essential steps to optimize a neural network. These steps are as follows:
Forward propagation: In this step, the input is fed through the network in the forward direction, layer by layer, to compute the actual output of each neuron and, ultimately, of the whole network.
Backpropagation: In this step, we update the weights of the network based on the difference between the actual output of the network and the ground truth.
In this answer, we'll focus more on forward propagation.
To learn about backpropagation, click here.
There are two main steps in forward propagation: calculating the sum of products and applying an activation function.
For each neuron, we multiply each input $x_i$ by its corresponding weight $w_i$ and add a bias term $b$.
In the following illustration, we depict this computation for a neuron with three inputs and a bias.
The sum of products, $z = w_1x_1 + w_2x_2 + w_3x_3 + b$, is then passed to the activation function.
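As a minimal sketch of this step, the following snippet computes the weighted sum for a single neuron with three inputs and a bias; the input, weight, and bias values are made up for illustration.

```python
import numpy as np

# Illustrative values for a single neuron with three inputs and a bias
inputs = np.array([0.5, -1.2, 3.0])   # x1, x2, x3
weights = np.array([0.4, 0.7, -0.2])  # w1, w2, w3
bias = 0.1                            # b

# Sum of products: z = w1*x1 + w2*x2 + w3*x3 + b
z = np.dot(weights, inputs) + bias
print(z)  # this weighted sum is what gets passed to the activation function
```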
The activation function maps the neuron’s weighted sum into a particular output range and introduces non-linearity into the network. For example, the sigmoid function maps any real-valued input into the range $(0, 1)$.
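To make the idea concrete, here is a small sketch of two common activation functions applied to a weighted sum; the sigmoid squashes the value into $(0, 1)$, while ReLU clips negative values to zero.

```python
import numpy as np

def sigmoid(z):
    # Maps any real value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Keeps positive values and clips negative values to 0
    return np.maximum(0.0, z)

z = -1.14  # an example weighted sum
print(sigmoid(z))  # roughly 0.242
print(relu(z))     # 0.0
```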
To learn more about activation functions, click here.
The following illustration shows the forward propagation of a neural network with an input layer, a hidden layer consisting of two neurons, and an output layer:
We are assuming a linear activation function in the example above. A linear (identity) function gives the same output as its input: $f(x) = x$.
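The following is a minimal sketch of that forward pass, assuming two inputs, a hidden layer with two neurons, a single output neuron, and a linear (identity) activation; the specific weight, bias, and input values are made up for illustration.

```python
import numpy as np

def linear(z):
    # Linear (identity) activation: output equals input
    return z

# Illustrative network: 2 inputs -> 2 hidden neurons -> 1 output
x = np.array([1.0, 2.0])              # input layer

W_hidden = np.array([[0.1, 0.3],      # weights of hidden neuron 1
                     [0.5, -0.2]])    # weights of hidden neuron 2
b_hidden = np.array([0.05, 0.05])     # hidden-layer biases

W_output = np.array([0.4, 0.6])       # weights of the output neuron
b_output = 0.1                        # output-layer bias

# Forward propagation: weighted sums followed by the activation function
h = linear(W_hidden @ x + b_hidden)   # outputs of the two hidden neurons
y = linear(W_output @ h + b_output)   # output of the network
print(y)
```

Because the activation is linear here, the whole network reduces to a single affine transformation of the input; non-linear activation functions are what let deeper networks model more complex relationships.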