Implementation of Logistic Regression
This lesson will provide an overview of logistic regression and the steps involved in its implementation.
Quick overview of logistic regression
Machine learning generally involves predicting a quantitative outcome or a qualitative class. The former is commonly referred to as a regression problem. In the case of linear regression, this involves predicting a numeric outcome based on the input of continuous variables.
When predicting a qualitative outcome (class), the task is considered a classification problem. Examples of classification problems include predicting what products a user will buy or if a target user will click on an online advertisement.
Not all algorithms fit cleanly into this simple dichotomy, though, and logistic regression is a notable example. Logistic regression belongs to the regression family because it models outcomes through quantitative relationships between variables, and like linear regression it accepts both continuous and discrete input variables. Unlike linear regression, however, its output is qualitative: it predicts a discrete class such as “Yes/No” or “Customer/Non-customer”.
In practice, the logistic regression algorithm analyzes relationships between variables. It assigns probabilities to discrete outcomes using the Sigmoid function, which converts numerical results into an expression of probability between 0 and 1.
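As a minimal sketch of this idea, the snippet below implements the Sigmoid function with NumPy and shows how it squeezes raw scores (the weighted sums a model would compute from its inputs) into the 0 to 1 probability range. The example scores are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    """Map a real-valued score z to a probability between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative raw scores; in a fitted model these would be weighted
# sums of the input features.
scores = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(sigmoid(scores))  # approx [0.018, 0.269, 0.5, 0.731, 0.982]
```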
A value of 0 represents no chance of occurring, whereas 1 represents a certain chance of occurring. For binary predictions, you can assign two discrete classes with a cut-off point of 0.5. Anything above 0.5 is classified as class A, and anything below 0.5 is classified as class B.
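To see the 0.5 cut-off in action, here is a hedged end-to-end sketch using scikit-learn's LogisticRegression on synthetic data (the dataset and parameters are assumptions for illustration, not part of the lesson). predict_proba returns the estimated probabilities, and predict applies the default 0.5 threshold to assign each observation to a class.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for, e.g., "buy"/"don't buy".
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)

# Probability of the positive class for each test observation.
probs = model.predict_proba(X_test)[:, 1]

# predict applies the default 0.5 cut-off to those probabilities.
labels = model.predict(X_test)
print(probs[:5], labels[:5])
```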