Solution Review: Classification Using the Iris Dataset
An explanation of how to classify Iris-setosa using the Iris dataset.
Solution
main.py (reads iris.csv)
import numpy as np
import pandas as pd

def step(weighted_sum):
    """The step activation returns 0 if the weighted sum is
    less than or equal to 0, and 1 otherwise"""
    return (weighted_sum > 0) * 1

def sigmoid(z):
    """The sigmoid activation function on the input z"""
    return 1 / (1 + np.exp(-z))

def forward_propagation(X, W, b):
    """Computes the forward propagation operation of a perceptron and
    returns the output after applying the sigmoid activation function"""
    weighted_sum = np.dot(X, W) + b  # calculate the weighted sum of X and W
    prediction = sigmoid(weighted_sum)  # apply the sigmoid activation function
    return prediction

def gradient(X, Y, Y_predicted):
    """Gradient of weights and bias"""
    Error = Y_predicted - Y  # calculate error
    dW = np.dot(X.T, Error)  # derivative of error w.r.t. weights, i.e., (output - target) * x
    db = np.sum(Error)  # derivative of error w.r.t. bias
    return dW, db  # return derivatives of weights and bias

def update_parameters(W, b, dW, db, learning_rate):
    """Updating the weights and bias values"""
    W = W - learning_rate * dW  # update weights
    b = b - learning_rate * db  # update bias
    return W, b  # return weights and bias

def train(X, Y, W, b):
    """Training the perceptron using batch updates"""
    epochs = 10
    learning_rate = 0.5
    for i in range(epochs):  # loop over the total epochs
        Y_predicted = forward_propagation(X, W, b)  # compute the forward pass
        dW, db = gradient(X, Y, Y_predicted)  # calculate the gradient
        W, b = update_parameters(W, b, dW, db, learning_rate)  # update parameters
    return W, b

# Data retrieval and preparation
dataset = pd.read_csv("iris.csv")  # read data from the csv file
X = dataset.iloc[0:100, [0, 1, 2, 3]].values  # features
Y = dataset.iloc[0:100, 4].values  # labels
Y = np.where(Y == 'Iris-setosa', 0, 1)  # assign 0 to Iris-setosa and 1 otherwise

# Initializing values
weights = np.array([0.0, 0.0, 0.0, 0.0])  # weights of the perceptron
bias = 0.0  # bias value
print("Target value\n", Y)

# Model training
W, b = train(X, Y, weights, bias)

# Predicting values
A2 = forward_propagation(X, W, b)
print("Predicted value")
Y_predicted = (A2 > 0.5) * 1
print(Y_predicted)

# Comparing predicted and target outcomes
comparison = Y_predicted == Y
equal_arrays = comparison.all()
print("Y == Y_predicted:", equal_arrays)
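Once main.py has run, the trained W and b can score a new flower. The following is a minimal sketch; the new_sample name and its measurements are hypothetical values typical of Iris-setosa, not taken from iris.csv:

# Hypothetical new flower: sepal length, sepal width, petal length, petal width (cm)
new_sample = np.array([5.1, 3.5, 1.4, 0.2])

probability = forward_propagation(new_sample, W, b)  # sigmoid output in (0, 1)
predicted_class = (probability > 0.5) * 1            # 0 = Iris-setosa, 1 = otherwise
print("Probability:", probability, "Predicted class:", predicted_class)

The same 0.5 threshold used on the training predictions converts the sigmoid probability into a class label.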
Explanation
Training the perceptron

The train function:

- It takes the features X, labels Y, weights W, and bias b.
- It initializes the learning_rate to 0.5 and epochs to 10.
- A for loop iterates epochs times, updating the weights and bias in a batch manner (traced numerically in the sketch after this list). Within each epoch:
  - It calls forward_propagation to compute the predicted values and saves the return value in Y_predicted.
  - It calls the gradient function to compute the derivatives dW and db from the error Y_predicted - Y.
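To make one batch update concrete, here is a minimal, self-contained sketch that traces a single epoch by hand. The two-sample, two-feature data below is a made-up toy batch, not rows from iris.csv, and the arithmetic in the comments mirrors the gradient and update_parameters functions above:

import numpy as np

# Hypothetical toy batch: 2 samples, 2 features
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Y = np.array([0, 1])
W = np.array([0.0, 0.0])
b = 0.0
learning_rate = 0.5

# Forward pass: sigmoid(X.W + b) = sigmoid([0, 0]) = [0.5, 0.5]
Y_predicted = 1 / (1 + np.exp(-(np.dot(X, W) + b)))

# Gradient: Error = Y_predicted - Y = [0.5, -0.5]
Error = Y_predicted - Y
dW = np.dot(X.T, Error)  # [1*0.5 + 3*(-0.5), 2*0.5 + 4*(-0.5)] = [-1.0, -1.0]
db = np.sum(Error)       # 0.5 + (-0.5) = 0.0

# Update: W = W - 0.5 * dW = [0.5, 0.5]; b is unchanged at 0.0
W = W - learning_rate * dW
b = b - learning_rate * db
print("W:", W, "b:", b)

Because all samples enter the dot products at once, one gradient call consumes the whole batch, which is exactly what makes this a batch update rather than a per-sample one.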