Problem with 10+ years backtests / notebook?

Is anybody else having problems with long backtests? I have an algo that I ran from 2008 to present using the new backtest analysis, and it crashes under Chrome. If I run it under the old backtest construct, it finishes the job. However, when I run the full tearsheet round trips notebook, it initially starts processing but then just hangs after a while and never finishes.

4 responses

Yes, notebooks have been an issue ever since an upper bound was placed on server-side memory. In recent months I have also seen long backtests raising exceptions for unspecified reasons.

For notebooks, take a look at the memory meter at the top: if it climbs close to 100% and then suddenly drops below 5%, you have hit the server memory limit. There's nothing you can do except use a shorter duration in the notebook.

@Leo M,

Thanks for your feedback. For tearsheet notebooks, it behaves exactly as you described, so I must have hit the server memory limit. That sucks! You and I both agree on the merits of long backtests for increasing confidence in an algo's performance consistency, so this limitation is somewhat disappointing. Yes, I've tried splitting the backtest into smaller durations and it works, but I would have preferred to analyze the full duration in one pass.
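For anyone else hitting this, the splitting workaround described above can be sketched roughly like this. This is just an illustration, not Quantopian's own code: the synthetic `returns` series and the `split_by_year` helper are hypothetical, and each chunk would then be passed to pyfolio's `pf.create_full_tear_sheet` separately instead of the whole series at once:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for a long backtest's daily returns (2008-2013).
rng = np.random.default_rng(0)
dates = pd.date_range("2008-01-01", "2013-12-31", freq="B")
returns = pd.Series(rng.normal(0.0003, 0.01, len(dates)), index=dates)

def split_by_year(returns):
    """Split a daily returns series into per-year chunks, so each
    tearsheet run covers a shorter duration and stays under the
    notebook's server-side memory limit."""
    return {year: grp for year, grp in returns.groupby(returns.index.year)}

chunks = split_by_year(returns)

# Each chunk can then be analyzed on its own, e.g. with pyfolio:
#   import pyfolio as pf
#   pf.create_full_tear_sheet(chunks[2010])
for year, chunk in chunks.items():
    # Every chunk should span exactly one calendar year.
    assert chunk.index.year.min() == chunk.index.year.max() == year
```

The obvious downside, as noted above, is that per-chunk tearsheets don't show cumulative full-period statistics, so this is a workaround rather than a fix.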

BTW, I put in a ticket for support regarding this issue but haven't gotten a response yet.

@James

Agree with everything you said.

There is no substitute for including trading costs and simulating the full period.

I've been having similar issues even with relatively short backtests (< 1 year). I hope the recent implementation of memory usage limits is improved so that generating a tearsheet on a full historical backtest is possible. In my opinion, this is essential for effectively assessing an algorithm's efficacy.

Hopefully, we hear something soon.