In statistics, an **odds ratio** tells us the ratio of the odds of an event occurring in a treatment group to the odds of an event occurring in a control group.

Odds ratios appear most often in logistic regression, which is a method we use to fit a regression model that has one or more predictor variables and a binary response variable.

An **adjusted odds ratio** is an odds ratio that has been adjusted to account for other predictor variables in a model.

It’s particularly useful for helping us understand how a predictor variable affects the odds of an event occurring, *after* adjusting for the effect of other predictor variables.

The following example illustrates the difference between an odds ratio and an adjusted odds ratio.

**Example: Calculating Adjusted Odds Ratios**

Suppose we are interested in understanding whether a mother’s age affects the probability of having a baby with a low birthweight.

To explore this, we can perform logistic regression using age as a predictor variable and low birthweight (yes or no) as a response variable.

Suppose we collect data for 300 mothers and fit a logistic regression model, obtaining the following coefficient estimate for age:

| Predictor | Coefficient Estimate |
| --- | --- |
| Age | 0.173 |

To obtain the odds ratio for age, we simply exponentiate the coefficient estimate from the table: e^0.173 = **1.189**.

This tells us that each one-year increase in age multiplies the odds of a baby having low birthweight by **1.189**. In other words, the odds of having a baby with low birthweight increase by **18.9%** for each additional year of age.
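As a quick check, this calculation can be reproduced in Python. The coefficient 0.173 is taken from the example above; everything else is just exponentiation:

```python
import math

# Coefficient estimate for age from the fitted logistic regression model
coef_age = 0.173

# Exponentiating a logistic regression coefficient gives the odds ratio
odds_ratio = math.exp(coef_age)

print(round(odds_ratio, 3))  # 1.189
print(f"{(odds_ratio - 1) * 100:.1f}% increase in odds per year")  # 18.9% increase in odds per year
```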

This odds ratio is known as a “crude” or “unadjusted” odds ratio because it has not been adjusted for any other predictor variables; age is the *only* predictor variable in the model.

But suppose we were interested in understanding whether a mother’s age *and* her smoking habits affect the probability of having a baby with a low birthweight.

To explore this, we can perform logistic regression using age and smoking (yes or no) as predictor variables and low birthweight as a response variable.

Suppose we collect data for 300 mothers and fit a logistic regression model, obtaining the following coefficient estimates:

| Predictor | Coefficient Estimate |
| --- | --- |
| Age | 0.045 |
| Smoking | 0.485 |

Here is how to interpret the results:

**Age:** The adjusted odds ratio for age is calculated as e^0.045 = **1.046**. This means the odds of having a baby with low birthweight increase by 4.6% for each additional year of age, assuming the variable *smoking* is held constant.

For example, suppose mother A and mother B are both smokers. If mother A is one year older than mother B, then the odds that mother A has a low birthweight baby are 1.046 times the odds that mother B has a low birthweight baby.

**Smoking:** The adjusted odds ratio for smoking is calculated as e^0.485 = **1.624**. This means the odds of having a baby with low birthweight increase by 62.4% if the mother smokes (compared to not smoking), assuming the variable *age* is held constant.

For example, suppose mother A and mother B are both 30 years old. If mother A smokes during pregnancy and mother B does not, then the odds that mother A has a low birthweight baby are 62.4% higher than the odds that mother B has a low birthweight baby.

Note that the adjusted odds ratio for age is lower than the unadjusted odds ratio from the previous example. This happens because smoking is associated with both age and low birthweight in this data, so part of the effect that the first model attributed to age is actually explained by smoking. In general, an adjusted odds ratio can be either smaller or larger than the corresponding crude odds ratio, depending on how the predictor variables are related to each other and to the response.
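The crude-versus-adjusted contrast can be seen directly by simulation. The sketch below generates hypothetical data in which smoking is more common among older mothers (all numbers are made up, not from the example above), fits both the one-predictor and two-predictor models with a small Newton-Raphson routine written in NumPy rather than a statistics library, and exponentiates the age coefficient from each:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Hypothetical data: smoking is more common among older mothers,
# which makes smoking a confounder for the age effect.
age = rng.uniform(18, 40, n)
p_smoke = 1 / (1 + np.exp(-(-4.0 + 0.15 * age)))
smoker = rng.binomial(1, p_smoke).astype(float)

# True model: log-odds of low birthweight = -3 + 0.045*age + 0.485*smoker
p_low = 1 / (1 + np.exp(-(-3.0 + 0.045 * age + 0.485 * smoker)))
y = rng.binomial(1, p_low)

def fit_logistic(X, y, iters=25):
    """Fit logistic regression by Newton-Raphson; returns coefficients."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))          # fitted probabilities
        w = mu * (1 - mu)                         # IRLS weights
        beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (y - mu))
    return beta

ones = np.ones(n)
crude = fit_logistic(np.column_stack([ones, age]), y)            # age only
adjusted = fit_logistic(np.column_stack([ones, age, smoker]), y)  # age + smoking

print("crude OR (age):      ", round(np.exp(crude[1]), 3))
print("adjusted OR (age):   ", round(np.exp(adjusted[1]), 3))
print("adjusted OR (smoking):", round(np.exp(adjusted[2]), 3))
```

Because the simulated data is random, the exact estimates vary with the seed, but the exponentiated age coefficient generally differs between the two models once smoking is accounted for.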

**Summary: Odds Ratio vs. Adjusted Odds Ratio**

An **odds ratio** (sometimes called a “crude” odds ratio) is useful for telling us how changes in one predictor variable affect the odds of some response variable occurring.

An **adjusted odds ratio** is useful for telling us how changes in one predictor variable affect the odds of a response variable occurring, *after* controlling for other predictor variables in a model.

**Additional Resources**

Introduction to Logistic Regression

How to Perform Logistic Regression in R

How to Perform Logistic Regression in Python