Error Submitting for Contest - Time Out

I made a simple algorithm that buys when a security is more than two standard deviations below its 20-day moving average, then sells when it rises above the 20-day moving average. However, the backtest took more than two hours, so the contest submission timed out. What are some common mistakes to avoid to keep the code fast?

from quantopian.algorithm import attach_pipeline, pipeline_output  
from quantopian.pipeline import Pipeline  
from quantopian.pipeline.data.builtin import USEquityPricing  
from quantopian.pipeline.factors import AverageDollarVolume  
from quantopian.pipeline.filters.morningstar import Q500US  
import pandas

sma_day = 20  
trade_cushion = .02  
sd = 2

trade = list()

def initialize(context):  
    schedule_function(my_rebalance, date_rules.every_day(), time_rules.market_open())  
    schedule_function(my_record_vars, date_rules.every_day(), time_rules.market_close())  
    attach_pipeline(make_pipeline(), 'my_pipeline')  
def make_pipeline():  
    base_universe = Q500US()  
    pipe = Pipeline(  
        screen = base_universe,  
    )  
    return pipe  
def before_trading_start(context, data):  
    del trade[:]  
    context.output = pipeline_output('my_pipeline')  
    context.security_list = context.output.index  
    for security in context.security_list:  
        close = data.history(security, "price", 1, "1d")  
        sma = data.history(security, "price", sma_day, "1d")  
        mean = sma.mean()  
        stddev = sma.std()  
        lower_bb = mean - (sd*stddev)  
        if close[0] < lower_bb:  
            trade.append(security)  
    record(available_positions = len(trade))  
def my_assign_weights(context, data):  
    pass  
def my_rebalance(context,data):  
    cpp = context.portfolio.positions  
    record(open_positions = len(cpp))  
    for security in cpp:  
        if data.history(security, "price", sma_day, "1d").mean() <= data.current(security, "price"):  
            if len(get_open_orders(security)) == 0:  
                order(security, -cpp[security]["amount"])  
    day_cash = context.portfolio.cash  
    if day_cash > 0:  
        if len(trade) > 0:  
            if len(cpp) >= 25:  
                if len(trade)+len(cpp) >= 50:  
                    cash_position = day_cash/len(trade)  
                if len(trade)+len(cpp) < 50:  
                    cash_position = day_cash/50  
            if len(cpp) < 25:  
                cash_position = day_cash/50  
            for security in trade:  
                close = data.history(security, "price", 1, "1d")  
                quan = cash_position//close[0]  
                if security not in cpp:  
                    order(security, quan, style=LimitOrder(close[0]*(1+trade_cushion)))  
    pass  
def my_record_vars(context, data):  
    pass  
def handle_data(context,data):  
    cpp = context.portfolio.positions  
    for security in cpp:  
        if data.history(security, "price", sma_day, "1d").mean() <= data.current(security, "price"):  
            if len(get_open_orders(security)) == 0:  
                roi = (data.current(security, "price") - cpp[security]["cost_basis"])*100/cpp[security]["cost_basis"]  
                log.info(roi)  
                order(security, -cpp[security]["amount"])  
        if (data.current(security, "price") - cpp[security]["cost_basis"])*100/cpp[security]["cost_basis"] <= -3:  
            if len(get_open_orders(security)) == 0:  
                roi = (data.current(security, "price") - cpp[security]["cost_basis"])*100/cpp[security]["cost_basis"]  
                log.info(roi)  
                order(security, -cpp[security]["amount"])  
    pass
3 responses

I removed the obsolete logs and calculations, but I think it could get faster if I used the pipeline for filters?

The data.history() calls are pretty expensive. You're calling data.history for each stock, one by one, and you're calling it multiple times per day - the history doesn't change.

You should call data.history() just once per day, and for all stocks, in a single call and save it for the day. Then you can re-use that dataframe over and over, slicing into it as needed. Call it again the next day.
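Outside the Quantopian platform you can't call data.history(), but the batching idea can be sketched with plain pandas. The names below (price_history, lower_band, the ticker list, and the synthetic prices) are all illustrative; the point is the shape: one dates-by-securities DataFrame, computed once, with every per-security statistic done as a vectorized column operation instead of a Python loop.

```python
import numpy as np
import pandas as pd

SMA_DAY, SD = 20, 2

# Synthetic stand-in for the single dates x securities DataFrame that one
# data.history(context.security_list, "price", SMA_DAY, "1d") call returns.
rng = np.random.default_rng(0)
dates = pd.date_range("2017-01-02", periods=SMA_DAY, freq="B")
securities = ["AAPL", "MSFT", "XOM"]
price_history = pd.DataFrame(
    100 + rng.standard_normal((SMA_DAY, len(securities))).cumsum(axis=0),
    index=dates, columns=securities,
)

# Vectorized across all columns at once: no per-security loop,
# no repeated history calls.
mean = price_history.mean()
stddev = price_history.std()
lower_band = mean - SD * stddev
latest_close = price_history.iloc[-1]

# Securities whose latest close is below their lower Bollinger band.
trade = latest_close.index[latest_close < lower_band].tolist()
```

Slicing this one cached DataFrame (e.g. `price_history[security]`) replaces every later per-security history call for the rest of the day.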


Look at importing SimpleMovingAverage, and move all of the history work, including lower_bb, out of before_trading_start and into the pipeline using a custom factor class. It's not easy, yet it's worth it.
Also note: the last values of a '1d' close history, if run on minute 10, are the prices from minute 9.
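On the platform this would be wrapped in a Pipeline CustomFactor subclass (with inputs=[USEquityPricing.close] and window_length=20), which can't run outside Quantopian. Here is a plain-NumPy sketch of what the factor's compute body would do; the function name lower_bb_compute and the synthetic data are hypothetical.

```python
import numpy as np

SD = 2

def lower_bb_compute(close):
    """Mirror of a CustomFactor compute body: given a
    window_length x num_assets array of closes, return the
    lower Bollinger band for every asset in one vectorized pass."""
    mean = close.mean(axis=0)
    stddev = close.std(axis=0, ddof=1)  # sample std, matching pandas' default
    return mean - SD * stddev

# A 20-day window of synthetic closes for 3 assets.
rng = np.random.default_rng(1)
close = 50 + rng.standard_normal((20, 3)).cumsum(axis=0)
lower_bb = lower_bb_compute(close)  # one value per asset
```

The pipeline engine would then run this once per day over the whole screened universe, so before_trading_start only has to read the precomputed column instead of looping over securities.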