Trading the High-Yield, Low-Volatility Stocks of the S&P500

This is my first contribution to the community here, exploring a simple method to squeeze a couple of drops of alpha from the market.

The main principle behind this approach is:

  1. Screen the S&P 500 universe
  2. Select the stocks with the highest yield (dividend + buyback yield) that also have the lowest three-year volatility
  3. Create an equal-weighted portfolio of these stocks, rebalance every month
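The three steps above can be sketched outside Quantopian with plain pandas. Here `prices` and `yields` are hypothetical inputs standing in for the pipeline data, and the cutoffs are illustrative, not the algo's actual parameters:

```python
import numpy as np
import pandas as pd

def monthly_picks(prices: pd.DataFrame, yields: pd.Series, top_n: int = 50) -> pd.Series:
    """Steps 1-3: screen, rank, equal-weight.

    prices: ~3 years of daily closes, one column per ticker (the universe).
    yields: dividend + buyback yield per ticker.
    Returns equal weights for the selected basket.
    """
    # Step 2a: three-year volatility from daily log returns
    vol = np.log(prices).diff().std()
    # Step 2b: highest-yield names that also sit in the low-volatility half
    high_yield = yields.nlargest(top_n).index
    low_vol = vol.nsmallest(len(vol) // 2).index
    picks = high_yield.intersection(low_vol)
    # Step 3: equal weights; the caller rebalances to these every month
    return pd.Series(1.0 / len(picks), index=picks)
```

The basket size floats with how many high-yield names survive the volatility cut, which is consistent with the 20-60 range mentioned below.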

This creates a basket of stocks (typically between 20 and 60) that shows both higher returns (outperforms the market by 57% in a 2005-2016 test) and lower volatility and drawdown (24% less drawdown).

Interestingly, it also outperforms the two popular low-volatility ETFs, the S&P 500 Low Volatility (SPLV) and the S&P 500 High Dividend Low Volatility (SPHD).

The algorithm has a relatively low beta of 0.7, given that it relies on (roughly) the same stocks that comprise the S&P 500 index.

As of November 2016 there are 20 stocks held by the system, listed below:

MCD
NLY
TRV
XL
SYY
UTX
MSI
ABC
CVS
AIG
AFL
MMM
TGT

I was hoping to replicate the results from this post, where I got the inspiration, however there is a difference in returns that seems insurmountable.

I'd like to thank the entire Quantopian community for sharing the knowledge from which the code borrows a lot and Meb Faber who inspired me to think more critically about the finance world.

17 responses

According to the link you provided, it seems you probably aren't as far off as you think. If you notice, they get their best results when they filter the list of high-yield low-volatility stocks to the top 7. You are buying the whole list (up to 66 stocks), so your results more reflect the high-dividend low-volatility index, which has much more modest returns.

Good spot. Using only the top 7 stocks does improve the overall returns by 15% but not to the levels they claim in the post.

I think the link mentioned they rebalance every 3 months. I altered your code to reflect that. Results are actually worse than the monthly rebalance.

Have you tried defining volatility as nu = duration / (1 + i), since this strategy is more like a fixed-income security strategy?
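If that refers to modified duration from fixed income, the formula would read as follows (a hypothetical illustration of the suggestion, not something from the algo):

```python
def modified_duration(macaulay_duration: float, i: float) -> float:
    """nu = duration / (1 + i): a bond's price sensitivity to its yield i."""
    return macaulay_duration / (1 + i)

# e.g. a 5-year Macaulay duration at a 4% yield
nu = modified_duration(5.0, 0.04)
```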

Also, the original algo screens the SP500. How close does the Q500US universe track the SP500 composition historically? I know you state this is an approximation, but any stats how closely it approximates? This assumption seems shaky ground to base your analysis because you cannot possibly isolate causes for the difference to data or code... even if the results were closer it should not be a solid reassurance. Have you read the discussion about possible issues of Q500US? (https://www.quantopian.com/posts/the-q500us-and-q1500us)

Hi All,

I tried to run the backtest for the strategy, but it takes a very long time to finish.

Is there any way to speed up the backtest of the strategy?

Thanks

Kevin

Vladimir. Impressive first contribution!

I'm nearly certain the difference will be the Q500US. Despite its name it is not modelled on the S&P 500; it is the 500 most liquid stocks in the US market, whereas the S&P 500 is the 500 highest market-cap stocks.

You should try:

universe = Q500US() & MarketCap().top(500)  

Or even

universe = Q1500US() & MarketCap().top(500)  
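Outside the platform, the intersection idea reads like this in plain pandas; `dollar_volume` and `market_cap` are hypothetical Series standing in for Q500US's liquidity screen and the MarketCap() factor:

```python
import pandas as pd

def sp500_proxy(dollar_volume: pd.Series, market_cap: pd.Series, n: int = 500) -> list:
    """Approximate S&P 500 membership: top-n by liquidity AND top-n by cap."""
    liquid = set(dollar_volume.nlargest(n).index)  # stands in for Q500US()
    large = set(market_cap.nlargest(n).index)      # stands in for MarketCap().top(n)
    return sorted(liquid & large)
```

Note the intersection can hold fewer than 500 names whenever the liquidity and cap rankings disagree.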

@Dan that's a nice hack to get the S&P 500!

@Dan unfortunately it does not provide better results.

I have noticed a bug in the way volatility is calculated in the original algo. Here is an update, and now it beats the S&P 500 by almost 80% over the backtest period.

vladimir,

Make no mistake, in the original algo volatility is calculated correctly:

vol = np.log(close).diff().std()  

If you want to use pct_change()
then the formula should be:

vol = close.pct_change().std()  
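For daily-sized moves the two definitions are nearly interchangeable, since log(1 + r) ≈ r for small r. A quick sanity check on simulated prices:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
# ~3 trading years of daily closes with ~1% daily moves
close = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 756))))

log_vol = np.log(close).diff().std()  # std of log returns (original algo)
pct_vol = close.pct_change().std()    # std of simple returns

# the two agree to well under 1% for moves of this size
assert abs(log_vol - pct_vol) / pct_vol < 0.01
```

So whichever line is used, any large change in backtest results is unlikely to come from this choice alone.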

Thanks for that.

These algorithms suffer from a common problem: partial fills. Here they are in the neighborhood of 370,000 by count. order_target_percent() issues the buy and sell instructions that would keep everything in balance, however, as early as minute 31 on 2005-03-01, incomplete selling exceeds incomplete buying. Specifically, an order to sell 2059 shares of ETR only sells 2 in that minute, then another 347 in minute 32, 57 in minute 33, and so on. Margin happens. It is carried over multiple days, for a total of over 100k overnight margin throughout the backtest. Whoever models the costs should receive an award.

It would be great to be able to say: hold those buy orders on the conveyor belt until more sells are filled and the cash is therefore available.

One route would be to queue the orders, determine which are increases in allocation vs decrease, take care of the selling first, give them some time to complete, decide what to do with those that do not close entirely by a certain time, and then buy with available cash, safeguarding against wild price swings in the wrong direction. Folks who favor margin could then add margin if they want to in a controlled manner just buying each stock's ratio against context.portfolio.cash + whatever margin.
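A minimal sketch of the sells-first part of that route, using hypothetical dicts of dollar allocations (on Quantopian the fills arrive over many minutes, so the buys would be released only as cash comes in):

```python
def split_rebalance(current: dict, target: dict) -> tuple:
    """Split a rebalance into sells and buys so the sells can run first.

    current, target: dicts of ticker -> dollar allocation.
    Returns (sells, buys), each a sorted list of (ticker, delta_dollars);
    submit the sells, wait for fills, then spend the freed cash on buys.
    """
    deltas = {t: target.get(t, 0.0) - current.get(t, 0.0)
              for t in set(current) | set(target)}
    sells = sorted((t, d) for t, d in deltas.items() if d < 0)
    buys = sorted((t, d) for t, d in deltas.items() if d > 0)
    return sells, buys
```

The remaining pieces (timeouts on unfilled sells, guarding against adverse price swings, optional controlled margin) would sit in the scheduling logic around this split.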

@Vladimir Hmm you are right it was already log. Any explanation why this produces better results and what does it actually measure?

This doesn't sound right to me. ETR was very illiquid back in 2005:

https://www.marketvolume.com/stocks/averagevolume.asp?s=ETR&t=entergy

I wonder why it's included in Q500US, which is supposed to be the most liquid stocks. It's worth asking [email protected] if it's not obvious.

The S&P 500 isn't just the top 500 stocks. It's picked by a committee based on:

(From Wikipedia) The components of the S&P 500 are selected by a committee. This is similar to the Dow Jones Industrial Average, but different from others such as the Russell 1000, which are strictly rule-based. When considering the eligibility of a new addition, the committee assesses the company's merit using eight primary criteria: market capitalization, liquidity, domicile, public float, sector classification, financial viability, length of time publicly traded and stock exchange.[5]

The committee selects the companies in the S&P 500 so they are representative of the industries in the United States economy. In order to be added to the index, a company must satisfy these liquidity-based size requirements:[5]

market capitalization is greater than or equal to US$6.1 billion
annual dollar value traded to float-adjusted market capitalization is greater than 1.0
minimum monthly trading volume of 250,000 shares in each of the six months leading up to the evaluation date.
The securities must be publicly listed on either the NYSE (including NYSE Arca or NYSE MKT) or NASDAQ (NASDAQ Global Select Market, NASDAQ Select Market or the NASDAQ Capital Market). Securities that are ineligible for inclusion in the index are limited partnerships, master limited partnerships, OTC bulletin board issues, closed-end funds, ETFs, ETNs, royalty trusts, tracking stocks, preferred stock, unit trusts, equity warrants, convertible bonds, investment trusts, ADRs, ADSs and MLP IT units.[5]

The index includes non-U.S. companies, both formerly U.S.-incorporated companies that have re-incorporated outside the United States, as well as firms that have never been incorporated in the United States.

Who is testing it live?

For this strategy, is it possible to introduce some short-side code such that when the market experiences a downturn (e.g. a financial crisis), the returns won't be negative?

Three years after the original post, here is an update. The algo continued to do well, narrowly beating the S&P 500 out-of-sample over the last three years. The 15-year backtest looks pretty good: double the returns of the market, with lower volatility and drawdowns.

Algo updated for Python 3.5.