Standard deviation
Standard deviation definition
Earlier, when we calculated the range, the answers were expressed in the same units as the data. For the variance, however, the units are the squares of the units of the data, for example “squared dollars” or “dollars squared”. Squared dollars are not intuitively clear or easily interpreted. For this reason, we make a significant change to the variance to compute a useful measure of deviation, one that does not give us a problem with units of measure and is thus less confusing. This measure is called the standard deviation, and it is the square root of the variance. The square root of 100 dollars squared is 10 dollars, because we take the square root of both the value and the units in which it is measured. The standard deviation, then, is in the same units as the original data.
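As a quick illustration, the following sketch (using a small hypothetical list of dollar amounts) computes the variance in squared dollars and then takes its square root to recover a spread measured in dollars:

```python
import math

# Hypothetical data: account balances in dollars.
balances = [40.0, 50.0, 60.0, 70.0, 80.0]

mean = sum(balances) / len(balances)

# Population variance: the average squared deviation, in "squared dollars".
variance = sum((x - mean) ** 2 for x in balances) / len(balances)

# Standard deviation: the square root of the variance, back in dollars.
std_dev = math.sqrt(variance)

print(f"variance = {variance} dollars^2")  # 200.0 dollars^2
print(f"std dev  = {std_dev:.2f} dollars") # 14.14 dollars
```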
Let us now define the standard deviation: the standard deviation is a numerical measure of the average variability of a data set around its mean. The standard deviation of a population is denoted by σ (the Greek lower-case letter sigma), and the standard deviation of a sample is denoted by s. The mean deviation has the limitation that it ignores the sign of x − x̄ for a general observation x. The standard deviation overcomes this limitation by squaring the deviation: (x − x̄)² is positive whether x − x̄ is negative or positive. Note that a population is a data set comprising 100% of the items under study, while a sample is a data set comprising items drawn from the population so that, by studying the sample, inferences may be drawn about the population.
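To make the population/sample distinction concrete, here is a short sketch using Python's statistics module with a hypothetical data set: pstdev divides the sum of squared deviations by N to give the population value σ, while stdev divides by n − 1 to give the sample value s.

```python
import statistics

# Hypothetical data: eight observations.
data = [2, 4, 4, 4, 5, 5, 7, 9]

# Population standard deviation (sigma): divides the sum of squared
# deviations from the mean by N.
sigma = statistics.pstdev(data)

# Sample standard deviation (s): divides by n - 1 instead, to correct
# the bias that arises when estimating the population value from a sample.
s = statistics.stdev(data)

print(f"population sigma = {sigma}")   # 2.0
print(f"sample s         = {s:.4f}")   # 2.1381
```

Note that s is always slightly larger than σ for the same data, since dividing by the smaller n − 1 inflates the estimate to compensate for the sample mean being fitted to the data.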