A one-way ANOVA is used to determine whether or not there is a statistically significant difference between the means of three or more independent groups.
When reporting the results of a one-way ANOVA, we always use the following general structure:
- A brief description of the independent and dependent variables.
- The overall F-value of the ANOVA and the corresponding p-value.
- The results of the post-hoc comparisons (if the p-value was statistically significant).
Here’s the exact wording we can use:
A one-way ANOVA was performed to compare the effect of [independent variable] on [dependent variable].
A one-way ANOVA revealed that there [was or was not] a statistically significant difference in [dependent variable] between at least two groups (F(between groups df, within groups df) = [F-value], p = [p-value]).
Tukey’s HSD Test for multiple comparisons found that the mean value of [dependent variable] was significantly different between [group name] and [group name] (p = [p-value], 95% C.I. = [lower, upper]).
There was no statistically significant difference between [group name] and [group name] (p = [p-value]).
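The F-value and p-value that fill in these templates come directly from standard software output. As a minimal sketch of how to obtain them in Python, `scipy.stats.f_oneway` runs a one-way ANOVA on three independent groups (the score lists below are made-up illustrative data, not values from this article):

```python
from scipy.stats import f_oneway

# Illustrative (made-up) scores for three independent groups of 10
group1 = [85, 86, 88, 75, 78, 94, 98, 79, 71, 80]
group2 = [91, 92, 93, 85, 87, 84, 82, 88, 95, 96]
group3 = [79, 78, 88, 94, 92, 85, 83, 85, 82, 81]

# f_oneway returns the overall F-statistic and its p-value
f_stat, p_value = f_oneway(group1, group2, group3)

# Between-groups df = 3 - 1 = 2; within-groups df = 30 - 3 = 27
print(f"F(2, 27) = {f_stat:.3f}, p = {p_value:.3f}")
```

The two degrees-of-freedom values in the report are the number of groups minus one and the total sample size minus the number of groups.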
The following example shows how to report the results of a one-way ANOVA in practice.
Example: Reporting the Results of a One-Way ANOVA
Suppose a researcher recruits 30 students to participate in a study. The students are randomly assigned to use one of three studying techniques for the next month to prepare for an exam. At the end of the month, all of the students take the same test.
The researcher then performs a one-way ANOVA to determine if there is a difference in mean exam scores between the three groups.
The following table shows the results of the one-way ANOVA along with the Tukey post-hoc multiple comparisons table:
Here is how to report the results of the one-way ANOVA:
A one-way ANOVA was performed to compare the effect of three different studying techniques on exam scores.
A one-way ANOVA revealed that there was a statistically significant difference in mean exam score between at least two groups (F(2, 27) = 4.545, p = 0.020).
Tukey’s HSD Test for multiple comparisons found that the mean value of exam score was significantly different between technique 1 and technique 2 (p = 0.024, 95% C.I. = [-14.48, -0.92]).
There was no statistically significant difference in mean exam scores between technique 1 and technique 3 (p = 0.883) or between technique 2 and technique 3 (p = 0.067).
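For the post-hoc pairwise p-values and confidence intervals reported above, recent versions of SciPy provide `scipy.stats.tukey_hsd`. A minimal sketch with made-up group data (not the study's actual scores, so the numbers printed will not match the example results):

```python
from scipy.stats import tukey_hsd

# Illustrative (made-up) scores for three independent groups
group1 = [85, 86, 88, 75, 78, 94, 98, 79, 71, 80]
group2 = [91, 92, 93, 85, 87, 84, 82, 88, 95, 96]
group3 = [79, 78, 88, 94, 92, 85, 83, 85, 82, 81]

# tukey_hsd returns a result object with a matrix of pairwise p-values
res = tukey_hsd(group1, group2, group3)
ci = res.confidence_interval(confidence_level=0.95)

# Report each pair in the same format used in the write-up
for i, j in [(0, 1), (0, 2), (1, 2)]:
    print(f"technique {i + 1} vs technique {j + 1}: "
          f"p = {res.pvalue[i, j]:.3f}, "
          f"95% C.I. = [{ci.low[i, j]:.2f}, {ci.high[i, j]:.2f}]")
```

The confidence interval for each pair is for the difference in group means, which is why a significant pair is one whose interval excludes zero.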
Things to Keep in Mind
Here are a few things to keep in mind when reporting the results of a one-way ANOVA:
Use a descriptive statistics table.
It can also be helpful to present a descriptive statistics table showing the mean and standard deviation of the values in each treatment group, to give the reader a more complete picture of the data.
For example, SPSS produces the following descriptive statistics table that shows the mean and standard deviation of exam scores for students in each of the three study technique groups:
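If your software does not produce such a table automatically, the per-group summary statistics are straightforward to compute. A minimal sketch with NumPy, using made-up group data (not the article's):

```python
import numpy as np

# Illustrative (made-up) exam scores for each studying-technique group
groups = {
    "technique 1": [85, 86, 88, 75, 78, 94, 98, 79, 71, 80],
    "technique 2": [91, 92, 93, 85, 87, 84, 82, 88, 95, 96],
    "technique 3": [79, 78, 88, 94, 92, 85, 83, 85, 82, 81],
}

# Use the sample standard deviation (ddof=1), which is what
# statistical packages such as SPSS report by default
for name, scores in groups.items():
    mean = np.mean(scores)
    sd = np.std(scores, ddof=1)
    print(f"{name}: n = {len(scores)}, mean = {mean:.2f}, sd = {sd:.2f}")
```

Note the `ddof=1` argument: NumPy defaults to the population standard deviation, which would understate the spread reported by most statistical software.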
Only report post-hoc results if necessary.
If the overall p-value of the ANOVA is not statistically significant, you will not conduct post-hoc multiple comparisons between groups, so there are no post-hoc results to include in the final report.
If you do conduct post-hoc tests, Tukey's HSD test is the most commonly used, but occasionally the Scheffé or Bonferroni test is used instead.
Round p-values when necessary.
As a general rule of thumb, round the overall F-value and any p-values in ANOVA results to either two or three decimal places for brevity.
No matter how many decimal places you choose to use, be sure to be consistent throughout the report.
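One easy way to stay consistent is to format every reported statistic through the same helper. A small sketch (the `fmt_stat` helper below is our own hypothetical convenience function, not part of any library):

```python
# Hypothetical helper: format every statistic in the report the same way
def fmt_stat(value, decimals=3):
    """Round a statistic to a fixed number of decimal places as a string."""
    return f"{value:.{decimals}f}"

# Using three decimal places consistently throughout a report
print(fmt_stat(4.5453))  # -> 4.545
print(fmt_stat(0.0199))  # -> 0.020
print(fmt_stat(0.8834))  # -> 0.883
```

Routing every number through one formatter makes it impossible to accidentally mix two-decimal F-values with three-decimal p-values in the same report.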