How do I load a trained (serialized) machine learning model into Quantopian for backtest prediction?

I am training a machine learning model on a very large dataset on AWS and then serializing the model to a file that is relatively small (about 1 MB). I want to load that model file into Quantopian so I can make predictions during a backtest. How do I do that?
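For reference, the serialization step I mean looks roughly like this (a minimal sketch using scikit-learn and joblib with placeholder data; the real pipeline trains a different model on a much larger dataset):

```python
# Minimal sketch of the off-platform training and serialization step.
# The data and model below are placeholders, not the actual pipeline.
import joblib
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

X = np.random.randn(1000, 10)                # placeholder feature matrix
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # placeholder labels

model = GradientBoostingClassifier(n_estimators=100)
model.fit(X, y)

# Serialize the trained model to a small file (compression keeps it compact).
joblib.dump(model, "model.pkl", compress=3)
```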

2 responses

I assume that Quantopian will not allow loading such data for security reasons.
This approach might interest you instead: XGBoost model deployment as a .py file
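I don't know exactly how that post does it, but one way to turn a trained model into plain Python source that can be pasted into an algorithm is a code-generation library such as m2cgen. This is only a sketch of the idea, not necessarily the method from the linked post:

```python
# Sketch: convert a trained XGBoost model into a standalone .py file of
# pure-Python prediction logic that needs no xgboost import at runtime.
# m2cgen is one possible tool; the linked post may use a different approach.
import numpy as np
import m2cgen as m2c
from xgboost import XGBClassifier

X = np.random.randn(500, 5)            # placeholder training data
y = (X[:, 0] > 0).astype(int)

model = XGBClassifier(n_estimators=50, max_depth=3)
model.fit(X, y)

# Generate Python source implementing the model's prediction function.
code = m2c.export_to_python(model)
with open("xgb_model.py", "w") as f:
    f.write(code)
```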

Hi Killian,

It depends on what you want the Quantopian framework to ingest from your trained ML model. If you only need the predictions or signals from the off-platform model, you can upload them with the Self-Serve Data feature: upload-your-custom-datasets-and-signals-with-self-serve-data
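For example, you would compute the predictions off-platform and write them to a CSV that Self-Serve Data can ingest, roughly a date column, a symbol column, and one or more signal columns. The column names below are illustrative; check the linked docs for the exact schema:

```python
# Sketch: turn off-platform model predictions into a CSV for Self-Serve Data.
# Column names are illustrative; see the Self-Serve Data docs for the exact format.
import numpy as np
import pandas as pd

symbols = ["AAPL", "MSFT", "AMZN"]
dates = pd.bdate_range("2018-01-02", periods=5)

rows = []
for date in dates:
    for sym in symbols:
        # Placeholder: in practice this would be model.predict(...) for (sym, date).
        rows.append({"date": date.date(), "symbol": sym,
                     "prediction": float(np.random.randn())})

signals = pd.DataFrame(rows)
signals.to_csv("my_model_signals.csv", index=False)
print(signals.head())
```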
Hope this helps.