loading static data without fetch_csv

I understand that there is a fetch_csv method, but I was hoping for something a little nicer, and something that would not disqualify me from the Quantopian contest.

I have some precomputed lookup tables that I would like to load and test. They were built on data from before 2013, so running them in a backtest from 2013 to 2016 should not introduce any data leakage. The tables are not really time series, so I don't think they fit well into the fetch_csv model. They are also fairly big (20 MB each as CSV, 7 MB each as a pickle file), but they only need to be loaded once and updated maybe once every few years.

I cannot build the data table using the Q platform because building the table requires all historical data and takes a few hours to run.

I would like to load a few user-defined CSV files that are about 20 MB each. How would I do that without fetch_csv?

Any help would be greatly appreciated. Thank you.

3 responses

I've worked on an algo with a limited amount of static data pasted directly into the code. I've never seen a published limit on the number of characters in an algo, so you could try just pasting it in.
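Something along these lines is what I have in mind. It's just a rough sketch: the tickers and numbers are made-up stand-ins for your real table, and initialize/handle_data are the usual Quantopian entry points:

# Paste the precomputed table into the algo as a plain Python literal.
# The symbols and values below are placeholders for illustration only.
LOOKUP_TABLE = {
    'AAPL': (0.42, 1.7),
    'MSFT': (0.31, 2.1),
}

def initialize(context):
    # Stash the table on context so it is available throughout the algo.
    context.lookup = LOOKUP_TABLE

def handle_data(context, data):
    row = context.lookup.get('AAPL')
    if row is not None:
        pass  # use the precomputed values in your trading logic here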

Also, it's kind of a wacky idea, but Q supports zlib, so I wonder if you could paste the data in compressed form and then decompress portions of it as needed?
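Roughly what I'm picturing: compress the CSV offline, base64-encode it, and paste the resulting string into the algo. This assumes base64 is importable in the IDE as well, which I haven't checked, and the tiny table here is just a stand-in for the real data:

import zlib
import base64

# Offline, you would run the compression step once on the real 20 MB file
# and paste the encoded string into the algo; this small string is a stand-in.
csv_text = "AAPL,0.42\nMSFT,0.31\n"
ENCODED_TABLE = base64.b64encode(zlib.compress(csv_text.encode('utf-8')))

def load_table(encoded):
    # Decode and decompress the pasted blob, then parse it back into a dict.
    raw = zlib.decompress(base64.b64decode(encoded)).decode('utf-8')
    table = {}
    for line in raw.splitlines():
        if line:
            key, value = line.split(',')
            table[key] = float(value)
    return table

def initialize(context):
    context.lookup = load_table(ENCODED_TABLE)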

Grant, thanks for the reply. I thought the same thing, but pasting 20 MB of text might crash the system. =) A little bit afraid to try.

Well, I tried pasting ~10 MB of text into the backtester IDE (81,920 lines), and it doesn't seem to work. Probably a non-starter.

I'm wondering if the Q research platform would let you reduce the number of entries in your lookup table to something manageable that could then be pasted into the algo?