Momentum Strategy with a Dynamic Universe

Slow & Steady wins the race... 50/200 day moving average strategy

I came up with this idea while getting frustrated with a dynamic-portfolio momentum strategy. The usual approach is to pick the stocks with the most momentum up or down and go long/short accordingly; here I select the stocks whose 50-day MA is closest to their 200-day MA, going long the ones just above and short the ones just below. The portfolio is re-selected every 10 trading days.
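
For illustration, here is a minimal pandas sketch of that selection step, assuming a prices DataFrame of daily closes (dates x tickers); the function name, parameters, and portfolio sizes are my own placeholders, not taken from the original algorithm.

import pandas as pd

def select_universe(prices, n_long=10, n_short=10):
    """Pick the stocks whose 50-day MA sits closest to their 200-day MA.

    prices: DataFrame of daily closes, indexed by date, one column per ticker.
    Returns (longs, shorts): tickers just above / just below the crossover.
    """
    fast = prices.rolling(50).mean().iloc[-1]    # latest 50-day MA per stock
    slow = prices.rolling(200).mean().iloc[-1]   # latest 200-day MA per stock

    # Relative gap between the averages; a small gap means "near a crossover".
    gap = (fast - slow) / slow

    above = gap[gap > 0].sort_values()                  # smallest positive gaps first
    below = gap[gap < 0].sort_values(ascending=False)   # smallest negative gaps first

    longs = above.index[:n_long].tolist()    # just above the 200-day MA: go long
    shorts = below.index[:n_short].tolist()  # just below the 200-day MA: go short
    return longs, shorts

Re-running something like this every 10 trading days (e.g. with a simple bar counter) would give the dynamic universe described above.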

I also use the total number of stocks with slowMA < fastMA vs. slowMA > fastMA as an indication of the general market direction, and weight the long/short side more heavily accordingly. The recorded variables wlong and wshort are the percentages invested per stock on the long and short sides.
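
Here is a rough sketch of how that breadth tilt could look, reusing the fast/slow moving-average series from the previous snippet; the even split of gross exposure by breadth is my guess at the weighting, not the post's exact formula.

def breadth_weights(fast, slow, n_long, n_short, gross=1.0):
    """Tilt exposure toward the side the broad market favors.

    fast, slow: latest 50- and 200-day MAs for every stock considered.
    Returns (wlong, wshort): target weight per long / per short position.
    """
    n_up = (fast > slow).sum()     # stocks trending up (fast above slow)
    n_down = (fast < slow).sum()   # stocks trending down
    total = float(max(n_up + n_down, 1))

    long_share = n_up / total      # fraction of gross exposure given to longs
    short_share = n_down / total   # fraction given to shorts

    wlong = gross * long_share / max(n_long, 1)
    wshort = -gross * short_share / max(n_short, 1)
    return wlong, wshort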

The idea is that stocks with similar 50 & 200 day moving averages are more likely to be fairly valued and the algo will avoid some of the wild swings that plague momentum strategies. The 50/200 day crossover is also a very common signal, so stocks might be more likely to continue in the direction of the 50day MA because a lot of investors enter and exit positions at that threshold.

The backtests for this are pretty memory hungry, so you can't run tests over a very long period without hitting an error before they finish. This test goes through the '08-'09 crash and into the bull market on the other side, and it seems to handle the transition pretty well (at least in this case).

I completely made this up so please share any thoughts, criticisms, and ideas to improve it.

David Edwards

5 responses

I am new to Quantopian, but intuitively I would expect to get the exact same 'Cumulative performance' if I clone this algo and run it over the same period you did. But for 2008-09-01 to 2010-01-20, I got 49.4% (see https://www.quantopian.com/algorithms/5561ba996241b52720000519/5561bc42c49d821051876e46), as opposed to the 127% you got above.
Is my intuition wrong?

George: did you run it on minute data with 10k initial capital? That might affect the results too.

Ah, sorry, I didn't think of that. I used daily data the first time, which is why I got 49.4%. Now that I've used minute data, I get 149.8% (still not equal, but closer to David's 127.4%). Thanks Johnny

Did you consider transaction costs?
What are the turnover ratio and maximum drawdown? Is there any Quantopian lib for these metrics?

I'm new to Quantopian and haven't started my first algo yet. I apologize if these questions are too naive. :)

It seems that there is a mistake. Please see below:

def handle_data(context, data):
    dt = get_datetime()

    # <<< Here the function signal is given only one argument, but its
    # declaration shows that it should have two parameters.
    if not context.trade_manager.signal(dt):
        return

The function signal is given one parameter, but its declaration shows that it should have two.
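
For what it's worth, if signal is declared as an instance method, a call with a single explicit argument can still match a two-parameter declaration, because Python binds self automatically. A minimal sketch of that pattern (this TradeManager is hypothetical, not the original class, and the calendar-day check is only a stand-in for the real 10-trading-day rebalancing logic):

class TradeManager(object):
    def __init__(self, period=10):
        self.period = period    # days between portfolio re-selections
        self.last_dt = None

    def signal(self, dt):
        # Declared with two parameters (self, dt) but called with one:
        # Python supplies `self` automatically when called on an instance.
        if self.last_dt is None or (dt - self.last_dt).days >= self.period:
            self.last_dt = dt
            return True
        return False

# tm = TradeManager()
# tm.signal(get_datetime())   # one explicit argument; `self` is bound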