I'm using the MSGARCH package in R. I compute residuals as return_data / Volatility(fit.model). The standard deviation of these residuals is close to 1 overall, but the standard deviation within each regime differs greatly: about 0.7 in one regime and 1.6 in the other.
Is this reasonable? I would expect the residuals to have unit variance within each regime, since each regime is filtered by its own GARCH model.
I can reproduce this result with any data I have tried, so it seems to be a question about how the Volatility function works rather than about my data.
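Here is a minimal sketch of what I'm doing, on simulated data (the regime assignment via State()$Viterbi is my reading of the docs; please correct me if that's not the right way to get the most-likely state path):

```r
library(MSGARCH)

set.seed(123)
# Toy returns with two volatility levels, just to reproduce the behavior
y <- rnorm(2000) * rep(c(0.5, 1.5), each = 1000)

# Two-regime sGARCH specification and ML fit
spec <- CreateSpec(variance.spec = list(model = c("sGARCH", "sGARCH")))
fit  <- FitML(spec = spec, data = y)

# Standardize returns by the fitted conditional volatility
res <- y / Volatility(fit)
sd(res)  # close to 1 overall

# Per-regime sd, using the most-likely state path
# (assuming State() returns the Viterbi sequence as I understand it)
states <- State(fit)$Viterbi
tapply(res, states, sd)  # in my runs these differ noticeably from 1
```

Whatever data I feed in, the overall sd of res is near 1 while the per-regime sds are spread around it, just like the 0.7 / 1.6 split above.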