Neural networks are computing systems inspired by the biological neural networks that make up the brain. A neural network works by taking in a data set and making a prediction.
There are three types of layers in a neural network:
Input layer: The input is fed into the input layer, and each node in the layer provides an output. These outputs serve as the input of the hidden layer.
Hidden layer: This layer applies weights to its inputs and performs the intermediate computation. Its output is passed as input to the output layer.
Output layer: This layer is the final layer of the neural network where desired predictions are obtained. The output of this layer is provided to the outside world.
PyTorch provides two main features:
An n-dimensional tensor that can run on GPUs.
Automatic differentiation for building neural networks.
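The short sketch below illustrates both features. It is only an illustration; the tensor shapes and the optional use of a GPU device are arbitrary choices, not part of the original answer.

```python
import torch

# Feature 1: an n-dimensional tensor that can run on a GPU when one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(3, 4, device=device)      # a 3x4 tensor of random values

# Feature 2: automatic differentiation.
w = torch.randn(4, 1, device=device, requires_grad=True)
loss = (x @ w).pow(2).sum()               # a scalar computed from x and w
loss.backward()                           # autograd computes d(loss)/dw
print(w.grad.shape)                       # torch.Size([4, 1])
```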
The following code shows how to build a neural network with PyTorch:
```python
# Build a simple feed-forward network with PyTorch
from torch import nn

class Network(nn.Module):
    def __init__(self):
        super().__init__()

        # Inputs to hidden layer
        self.hidden = nn.Linear(600, 60)
        # Output layer
        self.output = nn.Linear(60, 6)

        # Defining sigmoid activation and softmax output
        self.sigmoid = nn.Sigmoid()
        self.softmax = nn.Softmax(dim=1)

    def forward(self, tensor):
        # Pass the input tensor through each operation
        tensor = self.hidden(tensor)
        tensor = self.sigmoid(tensor)
        tensor = self.output(tensor)
        tensor = self.softmax(tensor)
        return tensor

model = Network()
```
Line 2: We import the nn module of PyTorch, which makes building neural networks easier.
Lines 4–6: The Network class inherits from nn.Module. Combined with super().__init__(), this creates a class that provides a lot of useful methods and attributes for the network.
Line 9: We create a module for a linear transformation that takes 600 inputs and gives 60 outputs, and assign it to self.hidden.
Note: This module automatically creates weight and bias tensors. We can access them using model.hidden.weight and model.hidden.bias.
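As a quick check, a sketch like the following (assuming the Network class and the model instance from the code above have been defined) prints the shapes of these tensors:

```python
# Shapes follow nn.Linear(600, 60): weight is [60, 600], bias is [60].
print(model.hidden.weight.shape)  # torch.Size([60, 600])
print(model.hidden.bias.shape)    # torch.Size([60])
```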
Line 11: We create another linear transformation with 60 inputs and 6 outputs, one output unit for each class.
Line 14: This line contains the operation for the sigmoid activation.
Line 15: This line contains the operation for the softmax output. Setting dim=1 calculates the softmax across the columns, so the values in each row sum to 1.
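As a small standalone illustration of the dim=1 behavior (the batch size of 2 here is arbitrary):

```python
import torch
from torch import nn

scores = torch.randn(2, 6)          # a batch of 2 samples, 6 class scores each
probs = nn.Softmax(dim=1)(scores)   # softmax across the columns of each row
print(probs.sum(dim=1))             # tensor([1., 1.]) -- each row sums to 1
```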
Line 17: Networks created with nn.Module must have a forward method. It takes in a tensor as a parameter.
Lines 19–22: We pass the tensor through each of these operations in turn, reassigning its value at every step.
Line 25: We create an instance of the network and assign it to model.
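A minimal sketch of a forward pass through the finished model; the batch size of 32 is an arbitrary choice for illustration:

```python
import torch

model = Network()
batch = torch.randn(32, 600)   # a hypothetical batch of 32 inputs with 600 features each
probs = model(batch)           # calling the model invokes forward()
print(probs.shape)             # torch.Size([32, 6])
```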
In this answer, we've learned how to build a neural network with PyTorch. Before using this model to derive predictions, we'll have to train it on a data set.
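As a rough sketch of what that training step might look like, here is a minimal loop on randomly generated placeholder data. The optimizer, learning rate, number of epochs, and the fake data are assumptions for illustration, not part of the original answer; because the model already outputs softmax probabilities, the loss is computed with NLLLoss on their logarithm.

```python
import torch
from torch import nn, optim

model = Network()
criterion = nn.NLLLoss()                        # expects log-probabilities
optimizer = optim.SGD(model.parameters(), lr=0.01)

# Hypothetical data: 128 samples with 600 features, labels in 0..5.
inputs = torch.randn(128, 600)
labels = torch.randint(0, 6, (128,))

for epoch in range(5):                          # placeholder number of epochs
    optimizer.zero_grad()
    probs = model(inputs)                       # forward pass (softmax probabilities)
    loss = criterion(torch.log(probs), labels)  # NLLLoss on the log of the probabilities
    loss.backward()                             # backpropagation
    optimizer.step()                            # update the weights
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```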