Applied Categorical & Nonnormal Data Analysis

Logistic Regression Models

Updated for Stata 11

OLS vs Logistic Scatterplots

Logistic regression models (also known as logit models) provide one approach to analyzing binary response variables. The goal in logistic regression is to model Pr(y=1 | x) = F(xβ). To do this we will make use of the logit transformation.

The log odds are modeled as a linear function of the predictors, g(x) = xβ. In the case of simple logistic regression, i.e., with a single predictor, g(x) = β0 + β1x. Thus, π(x) can be written

π(x) = exp(g(x)) / [1 + exp(g(x))] = exp(β0 + β1x) / [1 + exp(β0 + β1x)].

This can be shown to be true since the odds are just exp(g(x)),

π(x) / [1 − π(x)] = exp(β0 + β1x).
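These identities can be checked numerically with a short Python sketch (the coefficient values b0, b1 and the point x below are arbitrary illustrations, not values from the text):

```python
import math

def logit(p):
    # the logit transformation: log odds of a probability p
    return math.log(p / (1 - p))

def inv_logit(g):
    # pi(x) = exp(g) / (1 + exp(g))
    return math.exp(g) / (1 + math.exp(g))

# arbitrary illustrative coefficients and predictor value
b0, b1, x = -1.5, 0.8, 2.0

g = b0 + b1 * x       # linear predictor, the log odds
p = inv_logit(g)      # predicted probability pi(x)
odds = p / (1 - p)    # should equal exp(g), and logit(p) should recover g
```

Here `odds` equals exp(g) and applying `logit` to the probability recovers the linear predictor, which is the equivalence stated above.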

The coefficients for logistic regression are estimated using maximum likelihood. Unlike least squares regression, in which the coefficients can be estimated in a single pass, the coefficients for logistic regression are estimated through an iterative procedure. This is because the OLS regression solutions are linear in the parameters, while the logistic regression solutions are nonlinear in β0 and β1. The goal is to find the coefficients that make the observed data most likely. This is done by maximizing the likelihood function,

L(β) = Πi π(xi)^yi [1 − π(xi)]^(1 − yi).

It is much easier mathematically, however, to work with the log of the likelihood function,

ln L(β) = Σi { yi ln π(xi) + (1 − yi) ln[1 − π(xi)] }.

Remember, the logit model is linear in the log odds, β0 + β1x, for the case with one predictor variable.
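The iterative estimation described above can be sketched in pure Python as a Newton-Raphson maximization of the log likelihood for the one-predictor case. This is a minimal illustration of the idea, not Stata's actual logit routine, and the toy data in the usage note are made up:

```python
import math

def fit_simple_logit(x, y, iters=25):
    """Newton-Raphson ML estimation for simple logistic regression.

    Maximizes ln L = sum_i [ y_i*g_i - ln(1 + exp(g_i)) ],
    where g_i = b0 + b1*x_i.
    """
    b0 = b1 = 0.0
    for _ in range(iters):
        # score (gradient) and Hessian of the log likelihood
        s0 = s1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))   # pi(x_i)
            w = p * (1 - p)                            # weight pi(1 - pi)
            s0 += yi - p
            s1 += (yi - p) * xi
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        # Newton step: beta <- beta + H^{-1} * score (2x2 inverse by hand)
        det = h00 * h11 - h01 * h01
        b0 += (h11 * s0 - h01 * s1) / det
        b1 += (h00 * s1 - h01 * s0) / det
    return b0, b1
```

For example, `fit_simple_logit([0, 1, 2, 3, 4, 5], [0, 0, 1, 0, 1, 1])` returns coefficients at which the score equations Σ(yi − πi) = 0 and Σ(yi − πi)xi = 0 hold, which is the defining condition of the maximum likelihood solution.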

Thus, the odds would be exp(g(x)) = exp(β0 + β1x),
which can be rewritten as exp(β0)exp(β1x).

If we increase x by one we get exp(β0 + β1(x+1)) = exp(β0 + β1x+β1)
which, in turn, can be rewritten as exp(β0)exp(β1x)exp(β1).

Next, to compare the odds before and after adding one to x, we compute the odds ratio,

OR = exp(β0)exp(β1x)exp(β1) / [exp(β0)exp(β1x)] = exp(β1),

that is, the odds ratio for a one unit change is just the exponentiated log odds coefficient.
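The fact that this odds ratio does not depend on the starting value of x can be verified numerically (β0 = −2 and β1 = 0.7 below are arbitrary illustrative values):

```python
import math

def odds(b0, b1, x):
    # odds at x, computed the long way through pi(x)
    p = math.exp(b0 + b1 * x) / (1 + math.exp(b0 + b1 * x))
    return p / (1 - p)

b0, b1 = -2.0, 0.7  # arbitrary illustrative coefficients

# odds ratio for a one-unit change in x, at several starting points
ratios = [odds(b0, b1, x + 1) / odds(b0, b1, x) for x in (0.0, 1.3, 5.0)]
# every ratio equals exp(b1), regardless of where x starts
```

Whatever the starting x, the ratio is exp(β1), which is why the exponentiated coefficient is reported as a single odds ratio.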

Before we begin estimating some logit models, let's play with the grlog command (findit grlog) to see how changes in the constant and the logistic regression coefficient affect the predicted probabilities. Now let's begin with some very simple examples.

Intercept Only Example