So what can the standard deviation tell you about a set of data?
Simple.
It tells you how spread out the values of your variable are around the mean.
For example, let's take the sample X = {1, 2, 3, 4, 5, 6, 7}; it yields a mean of 4 and a standard deviation of 2.16025.
Now consider a second sample Y = {1, 2, 3, 4, 5, 6, 100}. This set of values gives a mean of 17.28571 and a standard deviation of 36.51353.
Even though the mean does not increase "that much", the standard deviation is now much higher. That means there is a value (100) very far from the mean.
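You can reproduce both computations with Python's standard `statistics` module (a quick sketch; note the figures above are *sample* standard deviations, i.e. computed with an n − 1 divisor, which is what `statistics.stdev` uses):

```python
import statistics

X = [1, 2, 3, 4, 5, 6, 7]
Y = [1, 2, 3, 4, 5, 6, 100]

# statistics.stdev uses the sample (n - 1) divisor; use statistics.pstdev
# if you want the population standard deviation instead.
for name, data in [("X", X), ("Y", Y)]:
    print(f"{name}: mean = {statistics.mean(data):.5f}, "
          f"stdev = {statistics.stdev(data):.5f}")
```

Swapping a single value (7 → 100) barely changes the middle of the data but inflates the standard deviation by a factor of roughly 17, because each deviation from the mean enters the formula squared.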
If you consider the median instead of the mean, the example is even more self-explanatory.
While the median is equal to 4 in both cases, the standard deviation is much higher in the second case.
It also shows that the median is much more robust to outliers than the mean.
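A quick check of that robustness claim, again with the standard library: the outlier 100 replaces the largest value, so the middle value of the sorted data is unchanged.

```python
import statistics

X = [1, 2, 3, 4, 5, 6, 7]
Y = [1, 2, 3, 4, 5, 6, 100]

# The median depends only on the middle of the sorted data,
# so the extreme value 100 does not move it at all.
print(statistics.median(X))  # 4
print(statistics.median(Y))  # 4
```

This is why the median is often preferred as a measure of central tendency for skewed data or data with outliers, while the standard deviation (paired with the mean) remains useful as a measure of spread.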