A Guide to Understanding the Concepts & Calculations
Variance is defined as "The average of the squared differences from the Mean". Variance is needed to compute the standard deviation.
Standard deviation is defined as "The square root of the variance".
Standard deviation and variance tell you how much a dataset deviates from the mean value.
A low standard deviation or variance indicates that the data points tend to be close to the mean (average), while a high standard deviation or variance indicates that the data points are spread out over a wider range of values.
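As an illustration (with hypothetical numbers, not data from this article), here are two small datasets with the same mean but very different spread:

```python
import statistics

# Two hypothetical datasets sharing a mean of 50
tight  = [48, 49, 50, 50, 51, 52]   # values cluster near the mean
spread = [20, 35, 50, 50, 65, 80]   # values range widely around the mean

for name, data in (("tight", tight), ("spread", spread)):
    print(name, statistics.mean(data), round(statistics.pstdev(data), 2))
```

Both datasets have a mean of 50, but the tight one yields a much smaller standard deviation than the spread one.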
[Interactive calculator: paste a data set (values separated by commas, spaces, or line breaks; enter 10,000 as 10000) to compute the population and sample standard deviation and variance, the mean, the number of values, the standard error of the mean, and the empirical-rule 68%/95%/99.7% ranges.]
How to Calculate Standard Deviation
Below is an example of 6 test scores from a class to walk through the calculation:
The above example can be condensed to the following formulas:
Population Standard Deviation (all elements from a data set, e.g. 20 out of 20 students in a class)
The population standard deviation is used when the entire population can be accounted for. It is calculated by taking the square root of the variance of the data set. The following equation can be used in this scenario:
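In symbols, with σ the population standard deviation, μ the population mean, and N the number of values in the population:

\begin{align*}
\sigma = \sqrt{\frac{\sum (x_i - \mu)^2}{N}}
\end{align*}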
Sample Standard Deviation (one or more elements from a data set, but not 100% of elements, e.g. 100 out of 300 students taking a computer class)
Sometimes it is not possible to capture all data from a population. In that case the equation above is modified to calculate the sample standard deviation: the sum of squared differences from the mean is divided by n − 1 instead of the sample size n. The "n − 1" is referred to as the degrees of freedom. The version below is used in most basic statistics courses. While it is a better estimate than simply applying the population formula to a sample, it still has significant bias for small sample sizes (fewer than about 10 values):
\begin{align*}
s = \sqrt{\frac{\sum (x_i - \bar{x})^2}{n - 1}}
\end{align*}
Where,
s = Sample standard deviation
∑ = Sum over all values
xi = An individual value
x̄ = Sample mean
n = Number of values in the sample data set
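Both formulas can be sketched in a few lines of Python. The scores below are hypothetical (the class's actual six scores are not reproduced here):

```python
import math
import statistics

# Six hypothetical test scores from a class
scores = [70, 80, 85, 90, 95, 100]

n = len(scores)
mean = sum(scores) / n                          # x̄
squared_diffs = [(x - mean) ** 2 for x in scores]

pop_variance = sum(squared_diffs) / n           # population: divide by n
sample_variance = sum(squared_diffs) / (n - 1)  # sample: divide by n - 1

pop_sd = math.sqrt(pop_variance)
sample_sd = math.sqrt(sample_variance)

# Cross-check the manual formulas against the standard library
assert math.isclose(pop_sd, statistics.pstdev(scores))
assert math.isclose(sample_sd, statistics.stdev(scores))

print(f"population sd = {pop_sd:.2f}, sample sd = {sample_sd:.2f}")
```

Note that dividing by n − 1 always produces a slightly larger estimate than dividing by n, compensating for the fact that a sample's spread tends to understate the population's.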
How to Interpret Standard Deviation
Standard deviation is a measure used to quantify the amount of variation or dispersion in a set of data values. It describes the data's spread: how widely the values are distributed about the mean. A smaller standard deviation indicates that more of the data is clustered about the mean; a larger one indicates the data are more spread out.
Many datasets are approximately normally distributed. This is important because normally distributed data follows a bell-shaped curve, and that curve gives us further insight.
The above graph shows the rules for normally distributed data: 68.2% of responses fall within 1 standard deviation of the mean, 95.4% within 2 standard deviations, and 99.7% within 3 standard deviations.
Example: If a question in your survey asks for annual income, the mean could be $35,000 with a standard deviation of $5,000. From the empirical rule, we could assume that 68% of total responses fall somewhere between $30,000 and $40,000. We could also assume 95% of the data falls between $25,000 and $45,000.
With this example, knowing your audience's income range can set you up for a successful marketing campaign. You would now be able to create a campaign specific to your audience!
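The two ranges in the income example follow directly from mean ± k·σ; a quick sketch using the article's figures:

```python
# Income example from the text: mean $35,000, standard deviation $5,000
mean, sd = 35_000, 5_000

for k, pct in [(1, "68%"), (2, "95%"), (3, "99.7%")]:
    low, high = mean - k * sd, mean + k * sd
    print(f"~{pct} of responses fall between ${low:,} and ${high:,}")
```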
How to Interpret Variance
Variance also measures the amount of variation or dispersion of a set of data values from the mean.
As mentioned, variance takes the average of all the squared differences from the mean, and standard deviation takes the square root of that number. Thus, the only difference between variance and standard deviation is the units. For example, if we took the times of 50 people running a 100-meter race, we would capture their times in seconds. When we compute the variance, we get units of seconds squared. Seconds squared aren't very useful, so to get back to regular seconds we take the square root of the variance.
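A short sketch of that unit relationship, using made-up race times rather than real data:

```python
import math
import random

# 50 hypothetical 100-meter race times, in seconds (not real data)
random.seed(42)
times = [round(random.uniform(11.0, 16.0), 2) for _ in range(50)]

n = len(times)
mean = sum(times) / n
variance = sum((t - mean) ** 2 for t in times) / (n - 1)  # units: seconds squared
std_dev = math.sqrt(variance)                             # units: seconds again

print(f"variance = {variance:.2f} s^2, standard deviation = {std_dev:.2f} s")
```

Squaring the standard deviation recovers the variance exactly, confirming they describe the same spread in different units.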
Single text boxes with number, dollar, or percent validation: useful to gather income, age, or other numbers that require analysis.
Continuous Sum, which gives a deviation for each label: useful to gather budget data, time allocated to projects, or other numerical allocation questions requiring analysis.