Variance and Standard Deviation

The deviations of scores about the mean of a distribution are the basis for most of the statistical tests we will learn. Since we are measuring how much a set of scores is dispersed about the mean, we are measuring variability. We can calculate the deviations about the mean and express them as a variance or a standard deviation. It is very important to have a firm grasp of this concept, because it will be central throughout the course.

Both variance and standard deviation measure variability within a distribution. The standard deviation is a number that indicates how much, on average, each value in the distribution deviates from the mean (or center) of the distribution. Keep in mind that variance measures the same thing as standard deviation (the dispersion of scores in a distribution). Variance, however, is the average squared deviation about the mean. Thus, the variance is the square of the standard deviation.
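
In symbols, for a population of N scores with mean \mu, the definitional formulas are

\[ \sigma^2 = \frac{\sum (X - \mu)^2}{N} \qquad \sigma = \sqrt{\frac{\sum (X - \mu)^2}{N}} \]

so the variance (\sigma^2) is literally the mean of the squared deviations, and the standard deviation (\sigma) is its square root.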

Example

Here is an example of the variance formula in action.
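
Below is a short Python sketch of the definitional formula in action. The data set here is a hypothetical one chosen for illustration; any set of scores would work the same way.

    # Hypothetical set of scores for illustration
    scores = [2, 4, 4, 4, 5, 5, 7, 9]

    N = len(scores)
    mean = sum(scores) / N                               # mean = 5.0

    # Definitional formula: average of the squared deviations about the mean
    variance = sum((x - mean) ** 2 for x in scores) / N  # 32 / 8 = 4.0
    std_dev = variance ** 0.5                            # sqrt(4.0) = 2.0

    print(mean, variance, std_dev)  # 5.0 4.0 2.0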

The formula above is the best way to understand variance and standard deviation as measures of variability about the mean. However, it is rather cumbersome to use in actual calculations, so it is recommended that you use the computational version of the formula when you do your calculations. Here is an example using the same data set with the computational variance formula.
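
The computational formula rearranges the algebra so that only the sum of the scores and the sum of the squared scores are needed:

\[ \sigma^2 = \frac{\sum X^2 - \frac{(\sum X)^2}{N}}{N} \]

A sketch with the same hypothetical data gives an identical result:

    scores = [2, 4, 4, 4, 5, 5, 7, 9]       # same hypothetical data as above
    N = len(scores)

    sum_x = sum(scores)                      # sum of X = 40
    sum_x_sq = sum(x ** 2 for x in scores)   # sum of X squared = 232

    # Computational formula: no per-score deviations required
    variance = (sum_x_sq - sum_x ** 2 / N) / N
    print(variance)  # 4.0, matching the definitional formula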

Note that these formulas are used when we are dealing with an entire population (a rare event in practice). In most cases you will want to divide the numerator of the equation by N-1 instead of by N. In this way, a sample can be used to estimate the variance or standard deviation of the population.
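
Python's standard library makes this distinction explicit. The functions below come from the built-in statistics module, again using the hypothetical scores from earlier:

    import statistics

    scores = [2, 4, 4, 4, 5, 5, 7, 9]    # hypothetical data

    # Population formulas: divide by N
    print(statistics.pvariance(scores))  # 4.0
    print(statistics.pstdev(scores))     # 2.0

    # Sample formulas: divide by N - 1 to estimate the population values
    print(statistics.variance(scores))   # 4.571...
    print(statistics.stdev(scores))      # 2.138...

Notice that the sample versions come out slightly larger: dividing by N - 1 inflates the estimate to compensate for the fact that a sample tends to underestimate the spread of the population it was drawn from.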