Decision boundary for logistic regression

We have just figured out that whenever z equals zero, we are on the decision boundary. But z is given by a linear combination of the features x1 and x2. After some basic algebra, we arrive at:

$$z \space = \space 0 \space = \space b \space + \space w_1x_1 \space + \space w_2x_2$$

$$-w_2x_2 \space = \space b \space + \space w_1x_1$$

$$x_2 \space = \space -\dfrac{b}{w_2} \space - \space \dfrac{w_1}{w_2}x_1$$

Given our model (b, w1, and w2), for any value of the first feature (x1), we can compute the corresponding value of the second feature (x2) that sits exactly at the decision boundary.
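The formula above can be sketched in a few lines of code. The parameter values below are hypothetical, chosen only to illustrate the computation; any model with a nonzero w2 works the same way.

```python
import numpy as np

# Hypothetical model parameters for illustration only
b, w1, w2 = 0.5, 2.0, -3.0

def boundary_x2(x1, b, w1, w2):
    """x2 on the decision boundary for a given x1,
    from x2 = -b/w2 - (w1/w2)*x1 (requires w2 != 0)."""
    return -b / w2 - (w1 / w2) * x1

x1 = np.linspace(-1.0, 1.0, 5)
x2 = boundary_x2(x1, b, w1, w2)

# Every boundary point should satisfy z = b + w1*x1 + w2*x2 = 0
z = b + w1 * x1 + w2 * x2
print(np.allclose(z, 0.0))  # True
```

Plotting the (x1, x2) pairs produced this way draws the boundary as a straight line in feature space, which is why logistic regression is called a linear classifier.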
