
Optimizing the Perceptron Output


Learn how the perceptron output is optimized so that the model finds the best possible boundary.

The perceptron trick

To find the best possible boundary, the perceptron algorithm predicts the output, compares it with the actual output, and updates its weights to learn the function that best separates the two classes.

📝 **Question: How does the model learn?**

The model learns over several iterations until it finds the best possible boundary separating the two classes. An initial boundary is drawn, and the error is computed. In each iteration, the boundary line moves in the direction that reduces the error. This process continues until the error falls below a certain threshold.

The following illustration will help you visualize this:

Quiz

1

What did you observe in the illustration above? Are we moving the point closer to the line or the line closer to the misclassified point?

A)

Line closer to the misclassified point

B)

Misclassified point closer to the line

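The learning loop described above can be sketched in code. This is a minimal illustration, assuming zero-initialized weights, a learning rate, and the step prediction rule; the function name and values here are illustrative, not from the lesson:

```python
import numpy as np

def perceptron_trick(X, y, learning_rate=0.1, epochs=100):
    # assumed setup: zero-initialized weights and bias
    weights = np.zeros(X.shape[1])
    bias = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            # step prediction: 1 if the weighted sum is non-negative
            prediction = 1 if np.dot(weights, xi) + bias >= 0 else 0
            update = learning_rate * (target - prediction)
            if update != 0:
                # move the boundary toward the misclassified point
                weights += update * xi
                bias += update
                errors += 1
        if errors == 0:  # stop once every point is classified correctly
            break
    return weights, bias
```

For linearly separable data, this loop is guaranteed to stop moving the line once no point is misclassified.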

Predict the output

Recall the perceptron equation for making a boundary line:

$w_1x_1 + w_2x_2 + b = 0$
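For intuition, which side of this boundary a point falls on can be read from the sign of $w_1x_1 + w_2x_2 + b$. A minimal sketch, with illustrative weights and a sample point (not values from the lesson):

```python
import numpy as np

# illustrative boundary w1*x1 + w2*x2 + b = 0 with w = (1, 1), b = -1
w = np.array([1.0, 1.0])
b = -1.0

point = np.array([2.0, 0.5])
side = np.dot(w, point) + b  # positive -> one class, negative -> the other
```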

In the case of a step function, the prediction is given by:

$$y' = \begin{cases} 1 & \text{if } w_1x_1 + w_2x_2 + \dots + w_nx_n + b \geq 0 \\ 0 & \text{otherwise} \end{cases}$$

In the case of a sigmoid function:

$$y' = \begin{cases} 1 & \text{if } \sigma(w_1x_1 + w_2x_2 + \dots + w_nx_n + b) \geq t \\ 0 & \text{otherwise} \end{cases}$$

where $t$ is the threshold.

Perceptron forward propagation operation using an activation function

Revise the forward propagation in the code below:

import numpy as np

def step_function(weighted_sum):
    # step activation: 1 if the weighted sum is non-negative, 0 otherwise
    return (weighted_sum >= 0) * 1

def sigmoid(x):
    # sigmoid activation: squashes the weighted sum into (0, 1)
    return 1 / (1 + np.exp(-x))

def forward_propagation(input_data, weights, bias):
    # take the dot product of input and weights, add the bias,
    # then apply the activation -- the perceptron equation
    return sigmoid(np.dot(input_data, weights) + bias)
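As a quick usage sketch, the forward pass can be thresholded following the sigmoid prediction rule above. The weights, bias, input, and threshold below are illustrative, not values from the lesson:

```python
import numpy as np

# repeated here so the snippet runs standalone
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward_propagation(input_data, weights, bias):
    return sigmoid(np.dot(input_data, weights) + bias)

weights = np.array([0.5, -0.4])  # illustrative weights
bias = 0.1
t = 0.5  # threshold for the sigmoid prediction rule

output = forward_propagation(np.array([1.0, 2.0]), weights, bias)
prediction = 1 if output >= t else 0
```

Here the weighted sum is negative, so the sigmoid output lands below the threshold and the predicted class is 0.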

Compare with actual output

We compare the predicted output with the actual output to see how well the perceptron forward propagation performed. This comparison is achieved using an error function. ...