Size limit to context object?

tl;dr My handle_data and scheduled functions aren't being called, and I think that might be because of an implicit limit to the size of the context object.

My value algorithm involves looking at historical data for a number of parameters at each time step. I had set up a pipeline to do so, and was running the pipeline in before_trading_start, but that function started timing out quite reliably. So instead I tried breaking my pipeline up into smaller components and running one step per call to before_trading_start, caching all the intermediate data in the context object, which makes it rather large. Now I find Quantopian is not running my scheduled functions or handle_data. Is there a size limit in bytes on the context object? If so, that would explain the behavior. And if so, it would be nice if this behavior were documented, and we got an exception rather than a silent continuation.
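The splitting-and-caching pattern described above can be sketched in plain Python. Everything here (the `Context` class, `STEPS`, `run_daily_step`, and the toy computations) is hypothetical and stands in for the Quantopian API; the point is just the pattern of running one step per daily call and stashing intermediate results on the context:

```python
# Hypothetical sketch: split a long computation into steps, run at most
# one step per simulated before_trading_start call, and cache the
# intermediate results as attributes on a context object.

class Context:
    """Stand-in for Quantopian's algorithm context object."""
    pass

def step_load(context):
    context.raw = list(range(10))            # pretend: fetch history

def step_transform(context):
    context.scaled = [x * 2 for x in context.raw]

def step_rank(context):
    context.ranked = sorted(context.scaled, reverse=True)

STEPS = [step_load, step_transform, step_rank]

def run_daily_step(context):
    """Run at most one pipeline step per simulated trading day."""
    i = getattr(context, "step_index", 0)
    if i < len(STEPS):
        STEPS[i](context)
        context.step_index = i + 1
    return i >= len(STEPS)                   # True once all steps are done

context = Context()
for day in range(5):        # simulate several before_trading_start calls
    done = run_daily_step(context)

print(context.ranked[:3])   # -> [18, 16, 14]
```

Each daily call stays cheap (one step), at the cost of the context carrying every intermediate result, which is exactly what makes its size a concern.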

Perhaps there needs to be another mechanism for caching data of this sort, rather than storing it in the context object itself?

Thanks,

Sunil

1 response

I think I misdiagnosed the problem in my original message. It appears that if a backtest runs without either set_universe or update_universe being called before the first call to handle_data or a scheduled event handler, no event handler is ever called, even if update_universe is called later. Now I call update_universe with a dummy stock (AAPL, to be precise) until I have completed my first computation. After that, everything seems to work fine.
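The workaround above can be sketched as follows. The `update_universe` stub and the multi-day `compute_real_universe` helper are hypothetical stand-ins for the platform API and for a slow selection computation; the pattern is simply to register a dummy security on every day until the real selection is ready, so the universe is never empty:

```python
# Hypothetical sketch of the dummy-universe workaround: keep the
# universe non-empty with a placeholder stock until the real
# selection has been computed.

universe = set()

def update_universe(symbols):
    """Stub for the platform call that sets the trading universe."""
    universe.clear()
    universe.update(symbols)

def compute_real_universe(day):
    """Pretend the real selection takes a few days to finish."""
    return ["MSFT", "IBM"] if day >= 3 else None

def before_trading_start(day):
    selection = compute_real_universe(day)
    if selection is None:
        update_universe(["AAPL"])   # dummy stock keeps handlers firing
    else:
        update_universe(selection)

for day in range(5):
    before_trading_start(day)
    assert universe, "universe must never be empty"

print(sorted(universe))             # -> ['IBM', 'MSFT']
```

Once the real computation completes, the dummy symbol is simply replaced by the actual selection on the next call.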

Sunil