# Symmetric Distribution: Definition + Examples

In statistics, a symmetric distribution is a distribution in which the left and right sides mirror each other.

The most well-known symmetric distribution is the normal distribution, which has a distinctive bell shape.

If you were to draw a line down the center of the distribution, the left and right sides of the distribution would perfectly mirror each other.

In statistics, skewness is a way to describe the symmetry of a distribution. This value can be negative, zero, or positive.

For symmetric distributions, the skewness is zero.

This is in contrast to left-skewed distributions, which have negative skewness.

This is also in contrast to right-skewed distributions, which have positive skewness.
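As a quick illustration (a sketch assuming NumPy and SciPy are installed), we can compute the sample skewness of a symmetric sample, a right-skewed sample, and its left-skewed mirror image:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)

# Symmetric sample: skewness near zero
symmetric = rng.normal(loc=0, scale=1, size=100_000)

# Right-skewed sample: positive skewness (exponential, theoretical skewness = 2)
right_skewed = rng.exponential(scale=1, size=100_000)

# Left-skewed sample: negative skewness (mirror image of the above)
left_skewed = -right_skewed

print(skew(symmetric))      # close to 0
print(skew(right_skewed))   # clearly positive
print(skew(left_skewed))    # clearly negative
```

Negating a right-skewed sample flips it left-to-right, which is why its skewness simply changes sign.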

### Properties of Symmetric Distributions

In a symmetrical distribution, the mean, median, and mode are all equal.

Recall the following definitions for each:

• Mean: The average value.
• Median: The middle value.
• Mode: The value that occurs most often.

Because the two halves of a symmetrical distribution mirror each other, all three of these measures land at the same central value.
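We can see this with a simulated sample (a sketch assuming NumPy is installed; the values are rounded to whole numbers so the mode is well defined for a finite sample):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(42)

# Draw from a symmetric (normal) distribution centered at 50 and round
# to whole numbers so the mode is well defined.
sample = np.round(rng.normal(loc=50, scale=2, size=100_000))

mean = sample.mean()
median = np.median(sample)
mode = Counter(sample).most_common(1)[0][0]

print(mean, median, mode)  # all three land at (or very near) 50
```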

Each of the distributions shown up to this point has been unimodal – a distribution with only one “peak.” However, a distribution can also be bimodal and still be symmetrical.

A bimodal distribution is a distribution that has two peaks.

Notice that if we drew a line down the center of this distribution, the left and right sides would still mirror each other.

For these distributions, the mean and the median are equal. However, there are two modes – one at each of the two peaks.
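A simple way to build such a distribution (a sketch assuming NumPy is installed) is an equal mixture of two normal distributions placed symmetrically about zero:

```python
import numpy as np

rng = np.random.default_rng(1)

# A symmetric bimodal distribution: an equal mixture of
# N(-2, 1) and N(2, 1), with peaks at -2 and +2.
sample = np.concatenate([
    rng.normal(-2, 1, 50_000),
    rng.normal(2, 1, 50_000),
])

# The mean and median both sit at the center of symmetry (0),
# but neither coincides with the two peaks.
print(sample.mean(), np.median(sample))
```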

### Other Examples of Symmetric Distributions

Along with the normal distribution, the following distributions are also symmetrical:

The t-Distribution

The Uniform Distribution

The Cauchy Distribution

If you drew a line down the center of any of these distributions, the left and right sides of each distribution would perfectly mirror each other.
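We can verify this mirror property numerically (a sketch assuming SciPy is installed): for each distribution, the density at a point to the right of the center equals the density at the matching point to the left, i.e. f(x) = f(−x) when the center is 0.

```python
import numpy as np
from scipy import stats

# Points to the right of each distribution's center (0)
x = np.linspace(0.1, 5, 50)

# Three symmetric distributions, each centered at 0
distributions = [
    stats.t(df=5),         # t-distribution with 5 degrees of freedom
    stats.uniform(-1, 2),  # uniform on [-1, 1]
    stats.cauchy(),        # standard Cauchy
]

# Symmetry about 0 means f(x) == f(-x) for every x
for dist in distributions:
    assert np.allclose(dist.pdf(x), dist.pdf(-x))

print("all three pdfs are symmetric about 0")
```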

### Symmetric Distributions & The Central Limit Theorem

One of the most important theorems in all of statistics is the central limit theorem, which states that the sampling distribution of a sample mean is approximately normal if the sample size is large enough, even if the population distribution is not normal.

In order to apply the central limit theorem, a sample size must be sufficiently large. It turns out that the exact number for “sufficiently large” depends on the underlying shape of the population distribution.

In particular:

• If the population distribution is symmetric, sometimes a sample size as small as 15 is sufficient.
• If the population distribution is skewed, generally a sample size of at least 30 is needed.
• If the population distribution is extremely skewed, then a sample size of 40 or higher may be necessary.

Thus, the benefit of symmetric distributions is that we require smaller sample sizes to apply the central limit theorem when calculating confidence intervals or performing hypothesis tests.
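To see why sample size matters, we can simulate the sampling distribution of the mean from a right-skewed population (a sketch assuming NumPy and SciPy are installed) and watch its skewness shrink as the sample size grows:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(7)

# A right-skewed population: the exponential distribution
# (theoretical skewness = 2)
population = rng.exponential(scale=1, size=1_000_000)

def sampling_skewness(n, reps=5_000):
    """Skewness of the sampling distribution of the mean for sample size n."""
    means = rng.choice(population, size=(reps, n)).mean(axis=1)
    return skew(means)

# As n grows, the sampling distribution of the mean loses its skew
# and looks increasingly normal.
print(sampling_skewness(5))
print(sampling_skewness(30))
```

For a symmetric population, the sampling distribution of the mean starts out with zero skewness, which is why smaller samples already suffice.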