
I am working on long-term portfolio allocations. With a longer investment horizon of 5-10 years, the 20 years of data I have are not enough, so I decided to do some bootstrapping to generate simulated time series and test the behavior of my allocation on them.

My question is: what is a good block size to choose, and how should I set it? If I rebalance each quarter, would that help decide? In the absence of return autocorrelation but the presence of variance autocorrelation (up to one month), I thought a block size of 21 days was reasonable.
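
Here is a minimal sketch of the kind of simulation I have in mind, using the circular block bootstrap from the arch package; `returns` (a DataFrame of daily asset returns) and `weights` (my target allocation) are placeholders for my actual data:

```python
import numpy as np
from arch.bootstrap import CircularBlockBootstrap

data = returns.to_numpy()              # T x N matrix of daily returns
bs = CircularBlockBootstrap(21, data)  # 21-day blocks

terminal_wealth = []
for pos, _ in bs.bootstrap(1000):      # 1000 simulated paths
    sim = pos[0]                       # one resampled T x N return path
    port = sim @ weights               # buy-and-hold proxy; the real test
    terminal_wealth.append(np.prod(1 + port))  # rebalances to `weights` quarterly
```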

Thanks for your help!

  • You can try choosing it based on the autocorrelation. There are also some methods: arch.readthedocs.io/en/stable/bootstrap/generated/… Commented Jan 19 at 17:20
  • I agree that "a block size of 21 days is reasonable". Commented Jan 20 at 19:38

1 Answer


You have part of the answer when you write that, given the "presence of variance autocorrelation (up to one month)", a block size of 21 days is reasonable. You need to preserve the autocorrelations that matter to you. Unfortunately, you do not always know which variables you should be monitoring the autocorrelations of.
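
As a rough diagnostic, and assuming `r` is a pandas Series of daily returns (a placeholder name), you can compare the autocorrelation of returns and of squared returns to see how long the variance memory actually is:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

lags = 60
acf_ret = acf(r, nlags=lags, fft=True)      # raw returns: usually negligible past lag 1
acf_var = acf(r**2, nlags=lags, fft=True)   # squared returns: proxy for variance memory

# last lag where the variance autocorrelation is still outside a 2-std-error band
band = 2 / np.sqrt(len(r))
memory = max((k for k in range(1, lags + 1) if abs(acf_var[k]) > band), default=0)
print(f"variance memory ~ {memory} days, so a block of at least {memory} days keeps it")
```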

For instance, have a look at momentum (Asness, Clifford S., Tobias J. Moskowitz, and Lasse Heje Pedersen. "Value and Momentum Everywhere." The Journal of Finance 68, no. 3 (2013): 929-985) and mean reversion (Yeo, Joongyeub, and George Papanicolaou. "Risk Control of Mean-Reversion Time in Statistical Arbitrage." Risk and Decision Analysis 6, no. 4 (2017): 263-290) in the cross section of returns. Wherever there is predictability, there is a memory whose time scale you should, in principle, not break with your blocks. While mean reversion is short term (less than 5 days), momentum is slower (around 12 months).

Of course you cannot take blocks that are too long if you want to have more than one block! So at the end of the day you have to choose, and know, which memory scale you decide to break. If you break the equity momentum time scale, you will miss the memory that supports it, and you need to think about the consequences for your portfolios.

Another point is whether or not the blocks overlap: if you make monthly blocks and you have 20 years of data, starting a block on the first day of each month gives you $20\times 12=240$ basic experiments. That is not a lot, so you may think about starting a block every week instead. In a sense you then have roughly four times as many experiments (on the order of $1000$), but of course the variance of your estimates will not drop by a factor of $\sqrt{4}=2$, because the added blocks are not independent (two consecutive blocks share about 3/4 of their dates).
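
A back-of-the-envelope count, assuming roughly 252 trading days per year and 21-day (one-month) blocks:

```python
n_days, block, step = 20 * 252, 21, 5       # 20 years of data, monthly blocks, weekly starts

non_overlapping = n_days // block           # one block per month
overlapping = (n_days - block) // step + 1  # one block every 5 trading days

shared = (block - step) / block             # fraction of dates two consecutive blocks share
print(non_overlapping, overlapping, round(shared, 2))   # 240, 1004, 0.76
```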

Nevertheless, you also need to be sure that starting every block on the first day of a month does not introduce any fragility: if there is an end-of-month/start-of-month seasonality, you may break it.

To conclude, choosing the size of the blocks requires you to:

  • not try to avoid every "break of autocorrelation", but be conscious of the ones you break; in a way, choosing the block size is a model choice,
  • use overlapping blocks, but not believe that they create as many new "independent trials".
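
In practice, one way to combine these points is to let an automatic rule suggest the block size and then resample. A minimal sketch with the arch package mentioned in the comments, assuming `r` is a daily return Series (a placeholder name):

```python
from arch.bootstrap import optimal_block_length, StationaryBootstrap

# Politis-White automatic block length, computed on squared returns since
# the variance memory is the one we want to preserve here
opt = optimal_block_length(r ** 2)
block = int(opt["stationary"].iloc[0])

bs = StationaryBootstrap(block, r)                 # blocks of random length, mean `block`
paths = [pos[0] for pos, _ in bs.bootstrap(500)]   # 500 resampled return paths
```

The random block start points of this kind of bootstrap also avoid anchoring every block to the first day of the month, which touches on the seasonality concern above.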
