Solution Review: Forward Propagation - 3 Layered Neural Network
Learn the feed-forward operation in a 3-layered neural network.
Solution
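For reference, the feed-forward pass implemented below can be written out layer by layer; the notation mirrors the variable names in the code, with x the batch of flattened letter images, W_1, W_2, W_3 and b_1, b_2, b_3 the weights and biases of the three layers, and sigma the sigmoid function sigma(z) = 1 / (1 + e^{-z}):

\[
\begin{aligned}
net_{h1} &= x\,W_1 + b_1, & out_{h1} &= \sigma(net_{h1})\\
net_{h2} &= out_{h1}\,W_2 + b_2, & out_{h2} &= \sigma(net_{h2})\\
net_{y}  &= out_{h2}\,W_3 + b_3, & out_{y}  &= \operatorname{softmax}(net_{y})
\end{aligned}
\]

The two hidden layers squash their net inputs with the sigmoid, and the output layer turns its net input into per-class scores with the softmax.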
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(z):
    """Compute sigmoid values for each set of scores in z"""
    return 1 / (1 + np.exp(-z))

def softmax(x):
    """Compute softmax values for each set of scores in x"""
    # keepdims=True keeps the row sums as a column vector so that each
    # sample (row) is normalized independently
    return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)

def forward_propagation(x, w1, w2, w3, b1, b2, b3):
    """Compute the forward propagation operation for the 3-layered
    neural network and return the output at the 2 hidden layers
    and the output layer"""
    net_h1 = np.dot(x, w1) + b1       # net output at the first hidden layer
    out_h1 = sigmoid(net_h1)          # sigmoid activation of the first hidden layer net output
    net_h2 = np.dot(out_h1, w2) + b2  # net output at the second hidden layer
    out_h2 = sigmoid(net_h2)          # sigmoid activation of the second hidden layer net output
    net_y = np.dot(out_h2, w3) + b3   # net output of the output layer
    out_y = softmax(net_y)            # softmax activation of the output layer net output
    return out_h1, out_h2, out_y

# Creating the data set (each letter is a 5 x 6 grid of pixels, flattened to 30 values)
# A
a = [0, 0, 1, 1, 0, 0,
     0, 1, 0, 0, 1, 0,
     1, 1, 1, 1, 1, 1,
     1, 0, 0, 0, 0, 1,
     1, 0, 0, 0, 0, 1]
# B
b = [0, 1, 1, 1, 1, 0,
     0, 1, 0, 0, 1, 0,
     0, 1, 1, 1, 1, 0,
     0, 1, 0, 0, 1, 0,
     0, 1, 1, 1, 1, 0]
# C
c = [0, 1, 1, 1, 1, 0,
     0, 1, 0, 0, 0, 0,
     0, 1, 0, 0, 0, 0,
     0, 1, 0, 0, 0, 0,
     0, 1, 1, 1, 1, 0]

# Creating labels (one-hot: A, B, C)
y = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]

# Converting the data into a NumPy array
x = np.array([a, b, c])
# Labels are also converted into a NumPy array
y = np.array(y)

np.random.seed(42)  # seed function to generate the same random values

n_x = 30   # number of input features (pixels)
n_h1 = 5   # neurons in the first hidden layer
n_h2 = 4   # neurons in the second hidden layer
n_y = 3    # neurons in the output layer (one per class)

w1 = np.random.randn(n_x, n_h1)
w2 = np.random.randn(n_h1, n_h2)
w3 = np.random.randn(n_h2, n_y)
b1 = np.zeros((1, n_h1))
b2 = np.zeros((1, n_h2))
b3 = np.zeros((1, n_y))

out_h1, out_h2, out_y = forward_propagation(x, w1, w2, w3, b1, b2, b3)
print("First Hidden layer output:\n", out_h1)
print("Second Hidden layer output:\n", out_h2)
print("Output layer:\n", out_y)
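With the input x of shape (3, 30), the intermediate and final outputs have shapes (3, 5), (3, 4), and (3, 3) respectively: one row per letter and one column per neuron in the corresponding layer. Because the output layer uses softmax, each row of out_y sums to 1 and can be read as the network's class probabilities for A, B, and C (with random, untrained weights these probabilities are essentially arbitrary).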
Explanation
Note: The ...
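As a quick, standalone sanity check (not part of the course solution), the sketch below rebuilds the same layer sizes with freshly generated random pixels and weights, so the exact numbers differ from the run above, but it confirms the shape flow (3, 30) -> (3, 5) -> (3, 4) -> (3, 3) and that every softmax row sums to 1:

import numpy as np

def sigmoid(z):
    """Element-wise sigmoid activation."""
    return 1 / (1 + np.exp(-z))

def softmax(x):
    """Row-wise softmax; keepdims=True normalizes each sample (row) independently."""
    return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)

np.random.seed(0)
x = np.random.randint(0, 2, size=(3, 30))   # 3 samples of 30 binary "pixels" (illustrative data)
w1, w2, w3 = np.random.randn(30, 5), np.random.randn(5, 4), np.random.randn(4, 3)
b1, b2, b3 = np.zeros((1, 5)), np.zeros((1, 4)), np.zeros((1, 3))

out_h1 = sigmoid(x @ w1 + b1)       # shape (3, 5)
out_h2 = sigmoid(out_h1 @ w2 + b2)  # shape (3, 4)
out_y = softmax(out_h2 @ w3 + b3)   # shape (3, 3)

print(out_h1.shape, out_h2.shape, out_y.shape)  # (3, 5) (3, 4) (3, 3)
print(out_y.sum(axis=1))                        # each entry is 1.0 (up to floating-point rounding)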