Standard Deviation versus Absolute Mean Deviation

One of the first things any student of statistics learns is a pair of popular descriptive measures: the mean and the Standard Deviation.



Has the way the Standard Deviation is calculated ever made you wonder why the distances from the mean are squared to remove negatives, instead of simply averaging their absolute values?  Well, you are certainly not alone.



As it turns out, squaring the distances from the mean and then taking the square root of their average to arrive at the Standard Deviation of a distribution is more a result of convention than anything else.  In fact, there is a measure called the Absolute Mean Deviation (more commonly known as the Mean Absolute Deviation) that does not square the distances from the mean to eliminate negative values.  Instead, it takes the absolute values of the differences from the mean and averages them to measure deviation from the mean.
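The two calculations described above can be sketched in a few lines of plain Python; the sample data here is just an illustration:

```python
import math

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)

# Standard Deviation: square the distances from the mean,
# average them, then take the square root.
std_dev = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

# Absolute Mean Deviation: average the absolute distances instead.
abs_mean_dev = sum(abs(x - mean) for x in data) / len(data)

print(mean, std_dev, abs_mean_dev)  # 5.0 2.0 1.5
```

Note that the two measures disagree (2.0 versus 1.5) because squaring gives extra weight to points far from the mean, which is exactly the behavioral difference the article below explores.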



The convention, of course, is to use the Standard Deviation in most cases, which makes it a far more popular descriptive statistic than the Absolute Mean Deviation.  Here is an interesting article that discusses the difference between the two approaches and identifies situations where using the Absolute Mean Deviation may be advantageous.
