Not sure if I shared this already. I cleaned it up, and got the performance to look really good. I'd appreciate feedback, particularly on how to ensure that the optimizer is working properly. --Grant
Hey Grant, thanks for sharing this exceptionally clean implementation. I remember there were two different versions in the original OLMAR paper. Is this the basic one or the BuyAndHold one?
Alex,
This is the simple version. There may be an advantage to the so-called BAH(OLMAR) version, since it weights the portfolio using a range of trailing window lengths for computing the mean. It might smooth out the returns, and has the added benefit of effectively eliminating a parameter that can be over-fit.
Note that the optimize function may be too inefficient to complete within the 50 seconds allotted per minute bar, if it has to be run 20 or 30 times to implement the BAH(OLMAR) approach. Something to watch out for, if you start to play around with it.
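For reference, here's a minimal sketch of the BAH mixture idea (illustrative only, not code from the algo): run one OLMAR "expert" per trailing window length and weight each expert's portfolio by its cumulative wealth so far.

import numpy as np

def bah_olmar_blend(expert_portfolios, expert_wealth):
    # expert_portfolios: (n_experts, n_assets) array, each row sums to 1
    # expert_wealth:     (n_experts,) cumulative wealth of each expert so far
    weights = expert_wealth / expert_wealth.sum()   # wealth-weighted mixture
    blended = weights.dot(expert_portfolios)        # combine the experts
    return blended / blended.sum()                  # renormalize to sum to 1

# toy example: three experts using trailing windows of, say, 3, 5, and 10 bars
portfolios = np.array([[0.5, 0.3, 0.2],
                       [0.4, 0.4, 0.2],
                       [0.3, 0.3, 0.4]])
wealth = np.array([1.10, 1.05, 0.98])   # hypothetical cumulative returns
print(bah_olmar_blend(portfolios, wealth))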
Grant
Very nice work, thanks for that. Please note that the algorithm seems to correlate with the market, and during turbulent periods such as Sep 2008 to Jan 2010 and Mar 2004 to Jul 2004 the algo shows big drawdowns and results that are worse than the benchmark.
Here's an update, to fix a bug (sorta). I was not normalizing the current-state portfolio properly:
if denom == 0.0:
    b_t = np.copy(b_0)
else:
    context.b_t = np.divide(b_t,denom)
I changed to:
if denom == 0.0:
    b_t = np.copy(b_0)
else:
    b_t = np.divide(b_t,denom)
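In standalone terms, the intent of that branch is just the following normalization (this little helper is illustrative, not lifted from the algo):

import numpy as np

def normalize_portfolio(position_values, b_0):
    # turn current dollar positions into weights; fall back to the
    # starting allocation b_0 when the portfolio is empty
    b_t = np.asarray(position_values, dtype=float)
    denom = b_t.sum()
    if denom == 0.0:
        return np.copy(b_0)
    return b_t / denom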
I also tweaked context.eps to 1.005 to get decent return, and the optimizer had to be tweaked, as well (ftol is now 1e-6, which I think is the default). There were a few other minor changes that don't affect the results.
Here's a long-term backtest using the code I posted immediately above. Obviously, there are some things to understand, but there appears to be "workiness."
Grant,
Many thanks for posting this updated version -- looks great and much leaner.
I'd like to highlight one trick in case anyone else is interested. Often the minute price can be quite noisy because of sub-minute price swings (Ernie Chan gave a great talk on this at QuantCon: https://vimeo.com/122492697). Here, Grant is instead using an exponentially weighted average price:
prices = history(8*390,'1m','price')
prices = pd.ewma(prices,span=390).as_matrix(context.stocks)
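(Side note for anyone reusing this outside the backtester: pd.ewma and .as_matrix were later deprecated in pandas; with current pandas the equivalent would be something like the following, shown here on a toy price frame.)

import numpy as np
import pandas as pd

# toy minute-price frame standing in for history(8*390, '1m', 'price')
idx = pd.date_range('2015-01-05 09:31', periods=8*390, freq='min')
prices = pd.DataFrame({'AAPL': np.random.randn(len(idx)).cumsum() + 100,
                       'MSFT': np.random.randn(len(idx)).cumsum() + 40},
                      index=idx)

smoothed = prices.ewm(span=390).mean()   # replaces pd.ewma(prices, span=390)
price_matrix = smoothed.to_numpy()       # replaces .as_matrix(context.stocks)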
Thomas
Here's a bit more of a revision of the code, to include the weighting over a range of trailing window lengths (the so-called "BAH(OLMAR)" mentioned by Alex above). I included the means to adjust the leverage, and the ability to include the inverse ETF, SH, in the portfolio. At some point, I can add a bunch of comments, if there is interest. --Grant
Grant, thanks so much for sharing. I'm currently studying the OLMAR paper and your implementations and hope to contribute back with variations of my own once I understand things a little better. I'd be very interested in your adding more comments to your code if you have time.
One thing I'd like to understand better is why you schedule trade() for 60 minutes after market open. I tried modifying that to 5 minutes and got similar results. When I dropped it down to 1 minute or 0 (no minutes parameter), I got a runtime error:
KeyError: Security(44933, symbol='FOXA_V', security_name='NEWS CORP', exchange='NASDAQ GLOBAL SELECT MARKET', start_date=Timestamp('2013-06-19 00:00:00+0000', tz='UTC'), end_date=Timestamp('2013-06-28 00:00:00+0000', tz='UTC'), first_traded=None)
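For reference, the scheduling call I was experimenting with is (roughly, from memory) the standard schedule_function setup:

def initialize(context):
    # run trade() once a day, N minutes after the open; this is the offset
    # I tried changing from 60 down to 5, 1, and 0
    schedule_function(trade,
                      date_rules.every_day(),
                      time_rules.market_open(minutes=60))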
Sorry if this is too much of a beginner question -- I'm still getting up to speed on everything.
Hello A. Roy,
I may get the chance tomorrow to add some comments. If you don't hear back in a day or two, just re-post here to pester me.
Interesting that you are getting an error; I get the same thing, so I sent the fine folks at Quantopian an e-mail with access to the code so that they can have a look. I thought I'd wrung out all of the problems!
Grant
@ A. Roy,
Well, I made a change and managed to avoid the error you found, but I can't say that I fully understand it. In initialize(context), I added:
context.data = []
and then in before_trading_start(context), I added:
# drop stocks we have no data for yet (iterate over a copy,
# since we're removing from the list as we go)
for stock in context.stocks[:]:
    if stock not in context.data:
        context.stocks.remove(stock)
In handle_data(context, data), I now have:
context.data = data
I think that the algo gets tripped up on stocks that are added to the universe, via the code in before_trading_start(), if the stock doesn't trade first-thing in the day, but I'm not sure--I thought I'd guarded for that elsewhere.
Eventually, I'll add comments to the code, but I'd like to understand this weird problem first, since there may be some restructuring of the code.
Grant
Grant - try something like this. Instead of placing orders immediately, queue them up and then clear the queue as stocks become tradeable.
In initialize
context.order_queue = {}
Make a new function clear_orders
def clear_orders(context, data):
    # place any queued orders whose securities are now tradeable, then clear them
    for stock in context.order_queue.keys():
        if context.order_queue[stock]:
            if stock in data:
                context.order_queue[stock]['func'](stock, context.order_queue[stock]['amt'])
                context.order_queue[stock] = None
In your allocate function, instead of order_target_percent
context.order_queue[stock] = {'func': order_target_percent,
                              'amt': context.leverage*desired_port[i]}
clear_orders(context, data)
In handle_data
clear_orders(context, data)
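Putting the pieces together, the allocate-side usage would look roughly like this (desired_port, context.leverage, and context.stocks are from Grant's algo; the loop structure here is just a guess):

def allocate(context, data, desired_port):
    # queue one order per stock; clear_orders places it once the stock is tradeable
    for i, stock in enumerate(context.stocks):
        context.order_queue[stock] = {'func': order_target_percent,
                                      'amt': context.leverage * desired_port[i]}
    clear_orders(context, data)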