Standard deviation

The standard deviation is a measure of the typical deviation from the mean within a batch. It is the most frequently used measure of variability, describing how tightly the observed values cluster around the mean. The more spread out a batch is, the larger its standard deviation.

 

If you were to sum the deviations from the mean for every value in a batch, the total would always be zero, because the positive and negative deviations cancel out. Summing the squared deviations avoids that cancellation, and averaging them gives the variance. The standard deviation is the square root of the variance, which converts the measure back into the original units and makes it much easier to interpret.
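
As a rough sketch in Python (the batch of values below is made up purely for illustration), the steps look like this:

import math

# Illustrative data: the values and results are hypothetical.
values = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(values) / len(values)

# The raw deviations from the mean always cancel out to zero.
deviations = [x - mean for x in values]
print(sum(deviations))  # 0.0 (up to floating-point rounding)

# Squared deviations do not cancel; their average is the variance.
# Dividing by n gives the population variance; n - 1 gives the sample variance.
variance = sum(d ** 2 for d in deviations) / len(values)

# The square root of the variance is the standard deviation,
# back in the same units as the original data.
std_dev = math.sqrt(variance)
print(variance, std_dev)  # 4.0 and 2.0 for this batch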

 

Once you have the standard deviation of your sample, you can count how many observations fall within one standard deviation of the mean, how many fall within two standard deviations, and so on. The standard deviation is easy to calculate in Excel (the STDEV.S function returns the sample standard deviation).
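
Here is a short sketch of that kind of count in Python, using a hypothetical sample chosen only for illustration:

import statistics

# Hypothetical sample data for illustration only.
sample = [12, 15, 14, 10, 8, 12, 11, 13, 16, 9]
mean = statistics.mean(sample)
sd = statistics.stdev(sample)  # sample standard deviation (n - 1 denominator)

# Count how many observations fall within one and two standard deviations.
within_one = sum(1 for x in sample if abs(x - mean) <= sd)
within_two = sum(1 for x in sample if abs(x - mean) <= 2 * sd)

print(f"mean = {mean:.2f}, sd = {sd:.2f}")
print(f"{within_one} of {len(sample)} values within 1 standard deviation")
print(f"{within_two} of {len(sample)} values within 2 standard deviations")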

 

In a perfect normal distribution, you would expect about 68% of observations to fall within one standard deviation of the mean, 95% within two standard deviations, and 99.7% within three. For example, with 49 balls in the National Lottery, the mean ball number is 25; assuming the results followed a normal distribution, you would expect about 68% of the balls drawn to fall within one standard deviation of 25.
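
As a quick check of those percentages, here is a Python sketch using the standard library's NormalDist. The mean of 25 follows the lottery example above, while the spread (sigma) is an assumed figure chosen purely for illustration:

from statistics import NormalDist

# mu = 25 follows the lottery example; sigma = 14 is an assumed spread.
dist = NormalDist(mu=25, sigma=14)

for k in (1, 2, 3):
    lower = dist.mean - k * dist.stdev
    upper = dist.mean + k * dist.stdev
    share = dist.cdf(upper) - dist.cdf(lower)
    print(f"within {k} standard deviation(s): {share:.1%}")

# Prints roughly 68.3%, 95.4% and 99.7%, whatever mean and sigma are used.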