Why Is Logistic Regression Considered a Linear Model?
Learn why logistic regression is considered a linear model.
Logistic regression as a linear model
We mentioned previously, while exploring whether the relationship between features and response resembled a linear one, that logistic regression is considered a linear model. Recall that we plotted the groupby/mean of the EDUCATION feature in the “Data Exploration” chapter, as well as of the PAY_1 feature in this chapter, to see whether the default rates across values of these features exhibited a linear trend. While this is a good way to get a quick approximation of how “linear or not” these features may be, here we formalize the notion of why logistic regression is a linear model.
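The groupby/mean trick described above can be sketched in a few lines. This is a minimal illustration on synthetic data, not the case-study dataset: the DataFrame, its column names, and the coefficients used to simulate defaults are all assumptions for demonstration.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the case-study data: a PAY_1-like feature and a
# binary response column (hypothetical names and values).
rng = np.random.default_rng(1)
pay_1 = rng.integers(-1, 3, size=1000)
default = (rng.random(1000) < 0.2 + 0.1 * pay_1).astype(int)
df = pd.DataFrame({'PAY_1': pay_1, 'default': default})

# Group by the feature and take the mean of the binary response:
# the mean of a 0/1 column is the default rate for that feature value.
default_rate_by_pay_1 = df.groupby('PAY_1')['default'].mean()
print(default_rate_by_pay_1)
```

Plotting this Series against the feature values is the quick visual check for a roughly linear trend.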
What is a linear model?
A model is considered linear if the transformation of features that is used to calculate the prediction is a linear combination of the features. In a linear combination, each feature may be multiplied by a numerical constant, these terms may be added together, and one additional constant may be added. For example, in a simple model with two features, X₁ and X₂, a linear combination would take the following form:

θ₀ + θ₁X₁ + θ₂X₂
The constants θᵢ can be any number, positive, negative, or zero, for i = 0, 1, and 2 (although if a coefficient is 0, this removes the corresponding feature from the linear combination). A familiar example of a linear transformation of one variable is a straight line with the equation y = mx + b. In this case, θ₀ = b and θ₁ = m. θ₀ is called the intercept of a linear combination, which should be familiar from algebra.
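A linear combination of two features is easy to express directly. The coefficient values below are hypothetical, chosen only to make the arithmetic easy to follow:

```python
# Hypothetical coefficients: theta_0 is the intercept; theta_1 and theta_2
# each multiply one feature.
theta_0, theta_1, theta_2 = 0.5, 2.0, -1.0

def linear_combination(x1, x2):
    """Linear combination of two features: theta_0 + theta_1*x1 + theta_2*x2."""
    return theta_0 + theta_1 * x1 + theta_2 * x2

print(linear_combination(3.0, 1.0))  # 0.5 + 2.0*3.0 - 1.0*1.0 = 5.5
```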
What kinds of things are “not allowed” in linear transformations? Any other mathematical expressions besides what was just described, such as the following:
- Multiplying a feature by itself; for example, X₁² or X₁³. These are called polynomial terms.
- Multiplying features together; for example, X₁X₂. These are called interactions.
- Applying non-linear transformations to features; for example, log and square root.
- Other complex mathematical functions.
- “If then” types of statements; for example, “if X₁ > a, then y = b.”
However, while these transformations are not part of the basic formulation of a linear combination, they could be added to a linear model by engineering features, for example, defining a new feature, X₃ = X₁².
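Engineering such features is a one-line operation in pandas. This sketch uses a small made-up DataFrame; the column names and values are assumptions for illustration:

```python
import pandas as pd

# Hypothetical two-feature dataset.
df = pd.DataFrame({'X1': [1.0, 2.0, 3.0], 'X2': [0.5, 1.5, 2.5]})

df['X3'] = df['X1'] ** 2         # engineered polynomial term: X1 squared
df['X4'] = df['X1'] * df['X2']   # engineered interaction term: X1 times X2
print(df)
```

The model itself still only combines columns linearly; the non-linearity lives in how the new columns were constructed.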
Sigmoid and logit functions
Earlier, we learned that the predictions of logistic regression, which take the form of probabilities, are made using the sigmoid function. Taking another look at it here, we see that this function is clearly non-linear:

p = 1 / (1 + e^(-X))
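The sigmoid's non-linearity is visible even without a plot: equal steps in the input do not produce equal steps in the output probability. A minimal sketch:

```python
import numpy as np

def sigmoid(X):
    """Sigmoid function: maps any real number to a probability in (0, 1)."""
    return 1 / (1 + np.exp(-X))

# Equal steps of 2 in X give unequal steps in p, which a linear
# function could never do.
for X in [-4, -2, 0, 2, 4]:
    print(X, round(float(sigmoid(X)), 4))
```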
Why, then, is logistic regression considered a linear model? It turns out that the answer to this question lies in a different formulation of the sigmoid equation, called the logit function. We can derive the logit function by solving the sigmoid function for X; in other words, finding the inverse of the sigmoid function. First, we set the sigmoid equal to p, which we interpret as the probability of observing the positive class, then solve for X as shown in the following:

p = 1 / (1 + e^(-X))
p(1 + e^(-X)) = 1
e^(-X) = (1 - p) / p
X = ln(p / (1 - p))
Here, we’ve used some laws of exponents and logs to solve for X. You may also see the logit expressed as follows:

X = ln(p / q), where q = 1 - p
In this expression, the probability of failure, q, is expressed in terms of the probability of success, p, as q = 1 - p, because probabilities sum to 1. Even though in our case credit default would probably be considered a failure in the sense of real-world outcomes, the positive outcome (response variable = 1 in a binary problem) is conventionally considered a “success” in mathematical terminology. The logit function is also called the log odds, because it is the natural logarithm of the odds ratio, p/q. Odds ratios may be familiar from the world of gambling, via phrases such as “the odds are 2 to 1 that team a will defeat team b.”
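The inverse relationship between the sigmoid and the logit can be checked numerically. This is a small sketch; the function names are our own:

```python
import numpy as np

def sigmoid(X):
    return 1 / (1 + np.exp(-X))

def logit(p):
    """Log odds: the natural log of the odds ratio p / (1 - p)."""
    return np.log(p / (1 - p))

# logit inverts sigmoid: applying one after the other recovers X.
for X in [-3.0, 0.0, 1.5]:
    p = sigmoid(X)
    print(X, float(logit(p)))
```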
Logistic regression is a linear model
In general, what we’ve called capital X in these manipulations can stand for a linear combination of all the features. For example, in our simple case of two features, this would be X = θ₀ + θ₁X₁ + θ₂X₂. Logistic regression is considered a linear model because the features included in X are, in fact, only subject to a linear combination when the response variable is considered to be the log odds. This is an alternative way of formulating the problem, as compared to the sigmoid equation. Putting the pieces together, the features look like this in the sigmoid equation version of logistic regression:

p = 1 / (1 + e^(-(θ₀ + θ₁X₁ + θ₂X₂)))

But they look like this in the log odds version, which is why logistic regression is called a linear model:

ln(p / (1 - p)) = θ₀ + θ₁X₁ + θ₂X₂
Because of this way of looking at logistic regression, ideally, the features of a logistic regression model would be linear in the log odds of the response variable. We will see what is meant by this in the next lesson.
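This equivalence between the two views can be verified directly: take the sigmoid of a linear combination, then take the log odds of the result, and the linear combination comes back. The coefficients below are hypothetical:

```python
import numpy as np

# Hypothetical coefficients for a two-feature logistic regression:
# [intercept, theta_1, theta_2].
theta = np.array([0.5, 2.0, -1.0])

def predict_proba(x1, x2):
    """Sigmoid version: probability from a linear combination of features."""
    z = theta[0] + theta[1] * x1 + theta[2] * x2
    return 1 / (1 + np.exp(-z))

x1, x2 = 3.0, 1.0
p = predict_proba(x1, x2)

# Log odds version: linear in the features, matching theta_0 + theta_1*x1 + theta_2*x2.
log_odds = np.log(p / (1 - p))
print(float(log_odds))
print(theta[0] + theta[1] * x1 + theta[2] * x2)
```

The two printed values agree (up to floating-point precision), which is exactly the sense in which logistic regression is linear.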
Logistic regression is part of a broader class of statistical models called Generalized Linear Models (GLMs). GLMs are connected to the fundamental concept of ordinary linear regression, which may have one feature (that is, the line of best fit, y = mx + b, for a single feature, x) or more than one in multiple linear regression. The mathematical connection between GLMs and linear regression is the link function. The link function of logistic regression is the logit function we just learned about.