Invasion of the Sigmoids
Explore how logistic regression adapts linear regression to binary classification by wrapping the weighted sum in a sigmoid function, which constrains outputs to the range between 0 and 1. Understand how to implement the sigmoid in Python, what forward propagation means, and why the log loss is the right loss function for gradient descent. This lesson equips you with the foundational knowledge to build and train binary classifiers effectively.
Overview of sigmoids
Even though linear regression is not a natural fit for binary classification, that does not mean that we have to scrap our linear regression code and start from scratch. Instead, we can adapt our existing algorithm to this new problem using a technique that statisticians call logistic regression.
Let’s start by looking back at $\hat{y}$, the weighted sum of the inputs that we introduced in the lesson: Adding More Dimensions.
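As a quick reminder, the weighted sum multiplies each input $x_i$ by its weight $w_i$ and adds the results (a bias term, if present, can be folded in as an extra input fixed at 1):

$$\hat{y} = w_1 x_1 + w_2 x_2 + \dots + w_n x_n = x \cdot w$$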
In linear regression, $\hat{y}$ could take any value. Binary classification, however, imposes a tight constraint: $\hat{y}$ must not drop below $0$ nor rise above $1$. Here’s an idea: maybe we can find a function that wraps around the weighted sum and constrains its output to the range between $0$ and $1$...
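To make the idea concrete, here is a minimal sketch of such a wrapping function, assuming the inputs `X` and weights `w` are NumPy arrays; the function names `sigmoid` and `forward` are illustrative, and the lesson develops the details below:

```python
import numpy as np

def sigmoid(z):
    # Map any real number into the open interval (0, 1)
    return 1 / (1 + np.exp(-z))

def forward(X, w):
    # Wrap the weighted sum of inputs and weights in the sigmoid,
    # so every prediction lands between 0 and 1
    weighted_sum = np.matmul(X, w)
    return sigmoid(weighted_sum)
```

No matter how large or how negative the weighted sum gets, the result of `forward()` always stays strictly between 0 and 1, which is exactly the constraint that binary classification demands.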