I just want to know why daily volatility is calculated as the standard deviation of the percent returns and not as the average of the percent returns itself. Isn't volatility a measure of how much a stock could move in a single day, which technically should be the average of how much the stock has moved over whatever lookback period we are taking? By taking the standard deviation of the percent returns, aren't we calculating how much the volatility itself could change?
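Just to be clear about the two calculations I'm comparing, here is a minimal sketch (assuming the usual textbook definition of daily volatility as the sample standard deviation of daily percent returns; the return figures are made up purely for illustration):

```python
import statistics

# Hypothetical daily percent returns over a 5-day lookback (made-up numbers)
daily_returns = [0.8, -1.2, 0.5, 2.0, -0.3]  # in percent

mean_return = statistics.mean(daily_returns)        # average move: what I expected "volatility" to mean
daily_volatility = statistics.stdev(daily_returns)  # sample standard deviation: what is actually used

print(f"Average daily return: {mean_return:.2f}%")
print(f"Daily volatility (std dev of returns): {daily_volatility:.2f}%")
```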
The reason I find it odd to use the standard deviation of percent returns is that it doesn't make sense when the percent returns themselves don't change over a period of time.
For example, let’s take a security that steadily increases by 1% every day and let’s say it does that for 10 days.
Then by my method, the volatility of the underlying should be 1%, because that's what it changes by every day, right? But by the exchange's method, the volatility is 0%, because the percent change itself has not changed during this period. Therefore, the margin charged for this scrip should be 0%, because anything multiplied by 0 is 0. Whether you take 6 sigma or 10 sigma, it doesn't matter; the result will be 0. So what's the deal exactly? I think every trader should know what margins they are paying. That's what made me curious.
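To make the arithmetic concrete, here is a small sketch of the constant 1%-a-day case (assuming daily volatility is taken as the sample standard deviation of daily percent returns, and margin as some sigma multiple of that volatility; the 6-sigma multiplier is just the figure from my example, not the exchange's actual rule):

```python
import statistics

# A security that rises exactly 1% every day for 10 days
daily_returns = [1.0] * 10  # percent returns

mean_return = statistics.mean(daily_returns)        # 1.0 -> what I think of as the "volatility"
daily_volatility = statistics.stdev(daily_returns)  # 0.0 -> the standard-deviation-based number

# Margin as a sigma multiple of volatility (6-sigma here, as in my example)
margin_pct = 6 * daily_volatility                   # 0.0, regardless of the multiplier

print(f"Average daily return: {mean_return:.2f}%")
print(f"Std dev of returns:   {daily_volatility:.2f}%")
print(f"6-sigma margin:       {margin_pct:.2f}%")
```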
So, could someone elucidate where exactly I am wrong?