# Multiple R vs. R-Squared: What’s the Difference?

When you fit a regression model using most statistical software, you’ll often notice the following two values in the output:

Multiple R: The multiple correlation coefficient, which measures the strength of the linear association between the response variable and the set of predictor variables. Equivalently, it is the correlation between the observed values of the response variable and the values fitted by the model.

R-Squared: This is calculated as (Multiple R)² and it represents the proportion of the variance in the response variable of a regression model that can be explained by the predictor variables. This value ranges from 0 to 1.

In practice, we’re often interested in the R-squared value because it tells us how useful the predictor variables are at predicting the value of the response variable.

However, each time we add a new predictor variable to the model, the R-squared value is guaranteed to increase (or, at worst, stay the same), even if the new predictor variable isn't useful.
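This behavior is easy to verify directly. The sketch below (using hypothetical, randomly generated data, not the article's dataset) fits a model with one real predictor, then refits after adding a predictor that is pure noise, and shows that R-squared does not decrease:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y depends on x1 only.
n = 30
x1 = rng.normal(size=n)
y = 3 * x1 + rng.normal(size=n)
noise = rng.normal(size=n)  # a deliberately useless predictor

def r_squared(X, y):
    """R-squared of an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    fitted = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    ss_res = np.sum((y - fitted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

r2_one = r_squared(x1.reshape(-1, 1), y)
r2_two = r_squared(np.column_stack([x1, noise]), y)
print(r2_two >= r2_one)  # adding any predictor never lowers R-squared
```

Because least squares can always ignore the extra column (by giving it a coefficient of zero), the residual sum of squares can only shrink or stay the same when a predictor is added.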

The adjusted R-squared is a modified version of R-squared that adjusts for the number of predictors in a regression model. It is calculated as:

Adjusted R² = 1 – [(1 – R²) × (n – 1) / (n – k – 1)]

where:

• R²: The R² of the model
• n: The number of observations
• k: The number of predictor variables

Since R-squared never decreases as you add more predictors to a model, adjusted R-squared can serve as a metric that tells you how useful a model is, adjusted for the number of predictors in the model.
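The adjusted R-squared formula above can be written as a short helper function. This is a minimal sketch; the function name is our own choice:

```python
def adjusted_r_squared(r_squared, n, k):
    """Adjust R-squared for the number of predictor variables.

    r_squared: the R-squared of the model
    n: the number of observations
    k: the number of predictor variables
    """
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Using the values from the worked example in this article:
# R² = 0.956, n = 12 students, k = 2 predictors
print(round(adjusted_r_squared(0.956, 12, 2), 3))  # → 0.946
```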

To gain a better understanding of each of these terms, consider the following example.

### Example: Multiple R, R-Squared, & Adjusted R-Squared

Suppose we have a dataset that contains the following three variables for 12 different students: Study Hours, Current Grade, and Exam Score.

Suppose we fit a multiple linear regression model using Study Hours and Current Grade as the predictor variables and Exam Score as the response variable, and get the following output:

We can observe the values for the following three metrics:

Multiple R: 0.978. This represents the multiple correlation between the response variable and the two predictor variables.

R Square: 0.956. This is calculated as (Multiple R)² = (0.978)² = 0.956. This tells us that 95.6% of the variation in exam scores can be explained by the number of hours spent studying by the student and their current grade in the course.

Adjusted R-Square: 0.946. This is calculated as:

Adjusted R² = 1 – [(1 – R²) × (n – 1) / (n – k – 1)] = 1 – [(1 – 0.956) × (12 – 1) / (12 – 2 – 1)] = 0.946.

This represents the R-squared value, adjusted for the number of predictor variables in the model.

This metric would be useful if we, say, fit another regression model with 10 predictors and found that the Adjusted R-squared of that model was 0.88. This would indicate that the regression model with just two predictors is better because it has a higher adjusted R-squared value.
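All three metrics from the worked example can be reproduced in a few lines of NumPy. The data below is hypothetical (the article's actual 12-student dataset is not reproduced here), so the printed values will not match the 0.978/0.956/0.946 figures above; the point is the sequence of calculations:

```python
import numpy as np

# Hypothetical 12-student data (not the article's dataset).
hours = np.array([1, 2, 2, 3, 3, 4, 5, 5, 6, 7, 7, 8], dtype=float)
grade = np.array([70, 72, 75, 74, 78, 80, 82, 85, 86, 88, 90, 92], dtype=float)
score = np.array([65, 68, 72, 74, 77, 80, 84, 86, 88, 90, 93, 95], dtype=float)

# Fit score = b0 + b1*hours + b2*grade by ordinary least squares.
X = np.column_stack([np.ones_like(hours), hours, grade])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
fitted = X @ coef

# Multiple R: the correlation between observed and fitted values.
multiple_r = np.corrcoef(score, fitted)[0, 1]

# R-squared: (Multiple R) squared.
r_squared = multiple_r ** 2

# Adjusted R-squared: penalize for the number of predictors.
n, k = len(score), 2
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

print(multiple_r, r_squared, adj_r_squared)
```

Note that adjusted R-squared is always at most R-squared, and the gap widens as more predictors are added relative to the number of observations.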

## 3 Replies to “Multiple R vs. R-Squared: What’s the Difference?”

1. Nguyễn thị nga says:

| Regression Statistics | |
|---|---|
| Multiple R | 0.232364 |
| R Square | 0.053993 |
| Standard Error | 80.87655 |
| Observations | 10 |

ANOVA

| | df | SS | MS | F | Significance F |
|---|---|---|---|---|---|
| Regression | 2 | 2613.291 | 1306.645 | 0.199762 | 0.823436 |
| Residual | 7 | 45787.11 | 6541.016 | | |
| Total | 9 | 48400.4 | | | |

| | Coefficients | Standard Error | t Stat | P-value | Lower 95% | Upper 95% | Lower 95.0% | Upper 95.0% |
|---|---|---|---|---|---|---|---|---|
| Intercept | -8.21116 | 265.3949 | -0.03094 | 0.976182 | -635.77 | 619.3479 | -635.77 | 619.3479 |
| giá | 0.237483 | 1.017637 | 0.233367 | 0.822154 | -2.16885 | 2.643812 | -2.16885 | 2.643812 |
| D | 35.47569 | 56.12555 | 0.632077 | 0.547417 | -97.2402 | 168.1915 | -97.2402 | 168.1915 |

TASKS:
• Write the model (population and random forms)
• Describe the model
• Calculate the quantities required in the previous assignment

2. TheStatistician says:

Please, how can I calculate the multiple R? I need the formula.

3. Anshu Raj says:

How did you calculate the model summary table? Which library or tool did you use?