Solution Review: Use the Sigmoid Activation Function
Learn to apply the sigmoid activation function in a feedforward neural network.
Solution
```python
import numpy as np

def sigmoid(x):
    """The sigmoid activation function"""
    return 1 / (1 + np.exp(-x))  # applying the sigmoid function

def forward_propagation(input_data, weights, bias):
    """Computes the forward propagation operation of a perceptron and
    returns the output after applying the sigmoid activation function"""
    # take the dot product of input and weights and add the bias
    return sigmoid(np.dot(input_data, weights) + bias)  # the perceptron equation

# Initializing parameters
X = np.array([2, 3])           # declaring two data points
Y = np.array([0])              # label
weights = np.array([2.0, 3.0]) # weights of perceptron
bias = 0.1                     # bias value

output = forward_propagation(X, weights.T, bias)  # predicted probability
print("Forward propagation output:", output)

Y_predicted = (output > 0.5) * 1  # threshold the sigmoid output at 0.5
print("Label:", Y_predicted)
```
Explanation

The sigmoid function: for a given input value x, it returns 1 / (1 + e^(-x)), mapping any real number into the range (0, 1). The forward_propagation function computes the perceptron equation, that is, the dot product of the inputs and weights plus the bias, and passes the result through the sigmoid. Because the sigmoid output lies between 0 and 1, it can be interpreted as a probability and thresholded at 0.5 to produce a binary label.
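As a quick sanity check, the numbers in the solution can be worked through by hand. The sketch below re-implements the sigmoid exactly as in the solution code and traces the intermediate weighted sum:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: maps any real number into (0, 1)."""
    return 1 / (1 + np.exp(-x))

# Same parameters as in the solution
X = np.array([2, 3])
weights = np.array([2.0, 3.0])
bias = 0.1

# Weighted sum: 2 * 2.0 + 3 * 3.0 + 0.1 = 13.1
z = np.dot(X, weights) + bias

# sigmoid(13.1) is extremely close to 1, so thresholding at 0.5 yields label 1
output = sigmoid(z)
label = (output > 0.5) * 1

print("z:", z)           # 13.1
print("output:", output)
print("label:", label)   # 1
```

Note that sigmoid(0) = 0.5, so the 0.5 threshold corresponds exactly to a weighted sum of zero: any positive pre-activation maps to label 1 and any negative one to label 0.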