Notebook

Modified "How to Get an Allocation" to better suit Sentdex

This notebook is a modified version of the original, found here: https://www.quantopian.com/posts/how-to-get-an-allocation-writing-an-algorithm-for-the-quantopian-investment-management-team

Main modifications:

  • Removal of the SMA applied to the signal, since the signal is already derived from SMA data.
  • Illustration of a different company (BAC), at a later date, both to avoid AAPL's generally high volatility and to skip the initial data, which is very noisy since the signal is calculated from moving averages.
  • Dates: this runs start_date='2015-01-01', end_date='2016-01-01'. This is a full year with a roughly neutral market overall, but one that includes enough ups and downs to give us some good environments to test in.
  • A new backtest, attached and analyzed via Pyfolio, that works straight off the signal values (removing the SMA factor, and also not working with quantile buckets, since we can't currently organize those correctly).

Further issues:

  • Currently, the quantiles in Quantopian's Alphalens tear sheet (alphalens.tears.create_factor_tear_sheet) do not lend themselves to proper sorting: an entire bucket can fill up and never change rank, since many companies share the same signal value. Alphalens now supports bins, which let you explicitly define bucket edges (rather than evenly dividing your data). This is absolutely necessary to test this properly, since we do not have an even spread of sentiment signals. That said, with these dates we can at least get away with 3 buckets as initially shown, but this isn't quite how you'd trade this, and these buckets aren't really fairly divided.
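The quantiles-vs-bins distinction above can be illustrated in plain pandas (a standalone sketch with toy data, not Alphalens itself): pd.qcut forces roughly equal-sized buckets, while pd.cut takes explicit edges, which suits a discrete, skewed signal like this one.

```python
import pandas as pd

# A discrete sentiment signal like Sentdex's (hypothetical values, skewed
# toward a few levels, as the real data is).
signal = pd.Series([6, 6, 6, 6, 5, 5, 2, 2, 1, 0])

# Quantile buckets: pandas tries to make equal-sized groups, so many
# identical signal values can saturate a bucket (or collapse buckets
# entirely when edges collide).
quantile_buckets = pd.qcut(signal, q=3, labels=False, duplicates='drop')

# Explicit bin edges: we decide where buckets start and end, e.g.
# bearish (<= 2), neutral (3), bullish (>= 4).
edge_buckets = pd.cut(signal, bins=[-1, 2, 3, 6], labels=False)

print(quantile_buckets.tolist())
print(edge_buckets.tolist())  # [2, 2, 2, 2, 2, 2, 0, 0, 0, 0]
```

With this skewed series, qcut cannot even produce three distinct buckets, which is exactly the sorting problem described above.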

The process involves the following steps:

  • Researching partner data.
  • Designing a pipeline.
  • Analyzing an alpha factor with Alphalens.
  • Implementing our factor in the IDE (see backtest in next comment).
  • Evaluating the backtest using Pyfolio.

Part 1 - Investigate the Data with Blaze

To start out, let's investigate a partner dataset using Blaze. Blaze allows you to define expressions for selecting and transforming data without loading all of the data into memory. This makes it a nice tool for interacting with large amounts of data in research.

In [12]:
import matplotlib.pyplot as plt
import pandas as pd

# http://blaze.readthedocs.io/en/latest/index.html
import blaze as bz

from zipline.utils.tradingcalendar import get_trading_days

from quantopian.interactive.data.sentdex import sentiment

Interactive datasets are Blaze expressions. Blaze expressions have a similar API to pandas, with some differences.
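Blaze itself is only available in the Quantopian research environment, but the same selection reads almost identically in plain pandas. A minimal sketch with hypothetical toy data; the key difference is that in pandas the boolean-mask selection runs eagerly, while the Blaze version builds a lazy expression that computes nothing until you call .peek() or bz.compute().

```python
import pandas as pd

# A toy stand-in for the sentiment dataset (hypothetical values).
df = pd.DataFrame({
    'sid': [700, 700, 24, 24],
    'asof_date': pd.to_datetime(['2013-11-30', '2013-12-01',
                                 '2013-11-30', '2013-12-01']),
    'sentiment_signal': [5.0, 2.0, 6.0, 1.0],
})

# Eager pandas selection; the Blaze analogue,
#   sentiment[(sentiment.sid == 700) & (sentiment.asof_date >= '2013-12-01')]
# expresses the same filter lazily.
result = df[(df.sid == 700) & (df.asof_date >= '2013-12-01')]
print(result.sentiment_signal.tolist())  # [2.0]
```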

In [13]:
type(sentiment)
Out[13]:
<class 'blaze.expr.expressions.Field'>

Let's start by looking at a sample of the data in the Sentdex Sentiment Analysis dataset. As noted above, we'll use BAC rather than AAPL.

In [14]:
bac_sid = symbols('BAC').sid

# Look at a sample of BAC sentiment data starting from 2013-12-01.
sentiment[(sentiment.sid == bac_sid) & (sentiment.asof_date >= '2013-12-01')].peek()
Out[14]:
symbol sentiment_signal sid asof_date timestamp
0 BAC 5.0 700.0 2013-12-01 2013-12-02
1 BAC 5.0 700.0 2013-12-02 2013-12-03
2 BAC 2.0 700.0 2013-12-03 2013-12-04
3 BAC 6.0 700.0 2013-12-04 2013-12-05
4 BAC 2.0 700.0 2013-12-05 2013-12-06
5 BAC 2.0 700.0 2013-12-06 2013-12-07
6 BAC 6.0 700.0 2013-12-07 2013-12-08
7 BAC 6.0 700.0 2013-12-08 2013-12-09
8 BAC 6.0 700.0 2013-12-09 2013-12-10
9 BAC 2.0 700.0 2013-12-10 2013-12-11
10 BAC 6.0 700.0 2013-12-11 2013-12-12

Let's see how many securities are covered by this dataset.

In [15]:
num_sids = bz.compute(sentiment.sid.distinct().count())
print 'Number of sids in the data: %d' % num_sids
Number of sids in the data: 586

Now let's look at BAC's sentiment signal each day. To do this, we can create one Blaze expression that selects trading days and another that selects the BAC sid (700).

In [16]:
# Mask for trading days.
date_mask = sentiment.asof_date.isin(
    get_trading_days(pd.Timestamp('2014-06-01'), pd.Timestamp('2014-12-01'))
)

# Mask for BAC.
stock_mask = (sentiment.sid == bac_sid)

# Blaze expression for BAC sentiment on trading days between 06/2014 and 12/2014.
sentiment_2014_expr = sentiment[date_mask & stock_mask].sort('asof_date')

Compute the expression. This returns the result as a pandas DataFrame.

In [17]:
sentiment_2014_df = bz.compute(sentiment_2014_expr)

Plot the sentiment signal for BAC.

In [18]:
sentiment_2014_df.plot(x='asof_date', y='sentiment_signal')
Out[18]:
<matplotlib.axes._subplots.AxesSubplot at 0x7ff00e097d90>

The sentiment signal tends to jump around quite a bit. Let's try smoothing it by plotting the 5-day mean using the pandas.rolling_mean function. Note that we set the index of the DataFrame to asof_date so that the x-axis is nicely formatted.

In [19]:
pd.rolling_mean(sentiment_2014_df.set_index('asof_date').sentiment_signal, window=5).plot()
Out[19]:
<matplotlib.axes._subplots.AxesSubplot at 0x7ff006210190>
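Note that pd.rolling_mean was removed in later pandas versions; the equivalent today is the .rolling(...).mean() method. A minimal sketch on a synthetic jumpy series:

```python
import pandas as pd

# A jumpy integer signal, like the raw sentiment series.
signal = pd.Series([6, 2, 6, 2, 6, 2, 6, 2, 6, 2])

# Old API used in the notebook (pandas < 0.18):
#   pd.rolling_mean(signal, window=5)
# Modern equivalent:
smoothed = signal.rolling(window=5).mean()

# The first window-1 values are NaN; after that the 5-day mean
# oscillates far less than the raw signal.
print(smoothed.tolist())
```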

Great! Now let's use this data in a pipeline.

Part 2 - Define Our Factor

Now that we have a dataset that we want to use, let's use it in a pipeline. In addition to the sentiment dataset, we will also use the EventVestor Earnings Calendar dataset to avoid trading around earnings announcements, and the EventVestor Mergers & Acquisitions dataset to avoid trading acquisition targets. We will work with the free versions of these datasets.

In [20]:
from quantopian.pipeline import Pipeline
from quantopian.research import run_pipeline

from quantopian.pipeline.factors import SimpleMovingAverage
from quantopian.pipeline.filters.morningstar import Q1500US

# Sentdex Sentiment free from 15 Oct 2012 to 1 month ago.
from quantopian.pipeline.data.sentdex import sentiment

# EventVestor Earnings Calendar free from 01 Feb 2007 to 1 year ago.
from quantopian.pipeline.factors.eventvestor import (
    BusinessDaysUntilNextEarnings,
    BusinessDaysSincePreviousEarnings,
)

# EventVestor Mergers & Acquisitions free from 01 Feb 2007 to 1 year ago.
from quantopian.pipeline.filters.eventvestor import IsAnnouncedAcqTarget

from quantopian.pipeline.factors import BusinessDaysSincePreviousEvent
In [21]:
def make_pipeline():
    
    # 5-day sentiment moving average factor.
    sentiment_factor = SimpleMovingAverage(inputs=[sentiment.sentiment_signal], window_length=5)
    
    # Filter for stocks that are not within 2 days of an earnings announcement.
    not_near_earnings_announcement = ~((BusinessDaysUntilNextEarnings() <= 2)
                                | (BusinessDaysSincePreviousEarnings() <= 2))
    
    # Filter out stocks that are announced acquisition targets.
    not_announced_acq_target = ~IsAnnouncedAcqTarget()
    
    # Filter for stocks that had their sentiment signal updated in the last day.
    new_info = (BusinessDaysSincePreviousEvent(inputs=[sentiment.asof_date.latest]) <= 1)
    
    # Our universe is made up of stocks that have a non-null sentiment signal that was updated in
    # the last day, are not within 2 days of an earnings announcement, are not announced acquisition
    # targets, and are in the Q1500US.
    universe = (Q1500US() 
                & sentiment_factor.notnull() 
                & not_near_earnings_announcement
                & not_announced_acq_target
                & new_info)
    
    # Our pipeline is defined to have the rank of the sentiment_factor as the only column. It is
    # screened by our universe filter.
    pipe = Pipeline(
        columns={
            'sentiment': sentiment_factor.rank(mask=universe, method='average'),
        },
        screen=universe
    )
    
    return pipe
In [22]:
result = run_pipeline(make_pipeline(), start_date='2015-01-01', end_date='2016-01-01')

Part 3 - Analyze Our Factor Using Alphalens

Now we can analyze our sentiment factor with Alphalens. To do this, we need to get pricing data using get_pricing.

In [23]:
# All assets that were returned in the pipeline result.
assets = result.index.levels[1].unique()

# We need to get a little more pricing data than the length of our factor so we 
# can compare forward returns. We'll tack on another month in this example.
pricing = get_pricing(assets, start_date='2015-01-01', end_date='2016-02-01', fields='open_price')

Then we run a factor tear sheet on our factor, analyzing 3 quantiles with 1, 5, and 10-day forward-return periods.
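Alphalens handles all of this bookkeeping for us, but the core idea, bucketing the factor into quantiles and comparing mean forward returns across buckets, can be sketched in a few lines of pandas (synthetic single-day data, not the real tear sheet):

```python
import pandas as pd

# Hypothetical factor ranks and 5-day forward returns for six stocks on one day.
day = pd.DataFrame({
    'factor': [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    'fwd_5d': [-0.02, -0.01, 0.00, 0.01, 0.02, 0.03],
})

# Bucket the factor into 3 quantiles, as in the tear sheet call.
day['quantile'] = pd.qcut(day['factor'], q=3, labels=False)

# Mean forward return per quantile; the top-minus-bottom difference is the
# quantity Alphalens reports as the period-wise spread.
mean_by_q = day.groupby('quantile')['fwd_5d'].mean()
spread = mean_by_q.iloc[-1] - mean_by_q.iloc[0]
print(round(spread, 4))  # 0.04
```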

In [24]:
import alphalens

alphalens.tears.create_factor_tear_sheet(factor=result['sentiment'],
                                         prices=pricing,
                                         quantiles=3,
                                         periods=(1,5,10))
Returns Analysis
                                                    1       5      10
Ann. alpha                                      0.036   0.036   0.033
beta                                           -0.015  -0.044  -0.050
Mean Period Wise Return Top Quantile (bps)      1.262   1.797   1.638
Mean Period Wise Return Bottom Quantile (bps)  -1.042  -1.143  -1.129
Mean Period Wise Spread (bps)                   2.174   2.794   2.592

Information Analysis
                    1       5      10
IC Mean         0.010   0.020   0.027
IC Std.         0.062   0.064   0.066
t-stat(IC)      2.506   5.063   6.596
p-value(IC)     0.013   0.000   0.000
IC Skew        -0.156  -0.087  -0.007
IC Kurtosis     0.112  -0.371  -0.664
Ann. IR         2.501   5.053   6.583

Turnover Analysis
                                1
Quantile 1 Mean Turnover    0.021
Quantile 2 Mean Turnover    0.259
Quantile 3 Mean Turnover    0.056

Mean Factor Rank Autocorrelation    0.977
<matplotlib.figure.Figure at 0x7ff00d3ce8d0>

From this, it looks like there's a relationship between the top quantile of our factor and positive returns, as well as between the bottom quantile and negative returns.

Let's try to capitalize on this by implementing a strategy that opens long positions in the top quantile of stocks and short positions in the bottom quantile. We'll invest half of our portfolio long and half short, equally weighting our positions in each direction.

Before moving to the IDE, let's make some small changes to the pipeline we defined earlier so that it invests based on the raw signal values rather than quantiles.

In [25]:
def make_pipeline():
    
    # Use the latest raw sentiment signal directly (no moving average this time).
    sentiment_factor = sentiment.sentiment_signal.latest
    
    # Filter for stocks that are not within 2 days of an earnings announcement.
    #not_near_earnings_announcement = ~((BusinessDaysUntilNextEarnings() <= 2)
    #                            | (BusinessDaysSincePreviousEarnings() <= 2))
    
    # Filter out stocks that are announced acquisition targets.
    #not_announced_acq_target = ~IsAnnouncedAcqTarget()
    
    # Filter for stocks that had their sentiment signal updated in the last day.
    new_info = (BusinessDaysSincePreviousEvent(inputs=[sentiment.asof_date.latest]) <= 1)
    
    # Our universe is made up of stocks that have a non-null sentiment signal that was
    # updated in the last day and are in the Q1500US.
    universe = (Q1500US() 
                & sentiment_factor.notnull() 
                & new_info)
    
    # Go long stocks with a bullish signal (>= 4) and short stocks with a bearish
    # signal (<= 2), based directly on the signal values rather than quantiles.
    pipe = Pipeline(
        columns={
            'sentiment': sentiment_factor,
            'longs': (sentiment_factor >= 4),
            'shorts': (sentiment_factor <= 2),
        },
        screen=universe
    )
    
    return pipe
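In the IDE, the longs/shorts columns above get turned into target weights: half the portfolio long, half short, equal weight within each side. A sketch of that weighting step in standalone pandas (hypothetical data, not the actual algorithm code):

```python
import pandas as pd

# Hypothetical pipeline output: latest sentiment signal plus the
# longs/shorts flags computed in make_pipeline.
pipe = pd.DataFrame({
    'sentiment': [6.0, 5.0, 4.0, 3.0, 2.0, 1.0],
    'longs':     [True, True, True, False, False, False],
    'shorts':    [False, False, False, False, True, True],
}, index=['A', 'B', 'C', 'D', 'E', 'F'])

weights = pd.Series(0.0, index=pipe.index)
# 50% of the portfolio long, split equally across the long bucket...
weights[pipe.longs] = 0.5 / pipe.longs.sum()
# ...and 50% short, split equally across the short bucket.
weights[pipe.shorts] = -0.5 / pipe.shorts.sum()

print(weights.tolist())
```

Stocks flagged neither long nor short (signal of 3 here) get zero weight, and the long and short sides net out to a market-neutral book.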

Part 4 - Implement and Backtest the Strategy in the IDE.

See following post for the backtest.

Part 5 - Analyze Our Backtest Using Pyfolio (NEW BACKTEST)

Let's load our backtest result and run it through a tear sheet using Pyfolio.

In [26]:
bt = get_backtest('5882a4b48aab61619ac8b0ba')
100% Time: 0:00:07|###########################################################|
In [27]:
bt.create_full_tear_sheet()
Entire data start date: 2015-01-02
Entire data end date: 2015-12-31


Backtest Months: 12
Performance statistics Backtest
annual_return 0.04
annual_volatility 0.02
sharpe_ratio 1.81
calmar_ratio 2.71
stability_of_timeseries 0.74
max_drawdown -0.01
omega_ratio 1.37
sortino_ratio 2.90
skew 0.47
kurtosis 3.53
tail_ratio 1.10
common_sense_ratio 1.14
information_ratio 0.01
alpha 0.04
beta 0.00
Worst Drawdown Periods   net drawdown in %   peak date    valley date   recovery date   duration
0                        1.42                2015-08-05   2015-09-16    2015-10-19      54
1                        1.40                2015-01-15   2015-05-26    2015-07-16      131
2                        0.83                2015-10-30   2015-11-18    2015-12-17      35
3                        0.49                2015-12-17   2015-12-24    NaT             NaN
4                        0.20                2015-07-27   2015-07-28    2015-07-31      5

Stress Events mean min max
Fall2015 -0.00% -0.28% 0.21%
New Normal 0.02% -0.45% 0.65%
Top 10 long positions of all time max
CTXS-14014 1.01%
JBL-8831 1.01%
CMI-1985 1.00%
FLR-24833 1.00%
TXT-7674 1.00%
AEE-24783 1.00%
DISC_A-36930 1.00%
IR-4010 1.00%
NEM-5261 1.00%
MGM-4831 1.00%
Top 10 short positions of all time max
ARG-510 -0.16%
PRGO-6161 -0.16%
AZO-693 -0.15%
SD-35006 -0.15%
HUM-3718 -0.15%
WTW-23269 -0.15%
CLF-1595 -0.15%
BIIB-3806 -0.15%
SHW-6868 -0.15%
DNR-15789 -0.15%
Top 10 positions of all time max
CTXS-14014 1.01%
JBL-8831 1.01%
CMI-1985 1.00%
FLR-24833 1.00%
TXT-7674 1.00%
AEE-24783 1.00%
DISC_A-36930 1.00%
IR-4010 1.00%
NEM-5261 1.00%
MGM-4831 1.00%
All positions ever held max
CTXS-14014 1.01%
JBL-8831 1.01%
CMI-1985 1.00%
FLR-24833 1.00%
TXT-7674 1.00%
AEE-24783 1.00%
DISC_A-36930 1.00%
IR-4010 1.00%
NEM-5261 1.00%
MGM-4831 1.00%
GLW-3241 1.00%
VZ-21839 1.00%
PBI-5773 1.00%
UNH-7792 1.00%
ADT-43399 1.00%
EXC-22114 1.00%
AIV-11598 1.00%
AXP-679 1.00%
MDT-4758 1.00%
PGR-5950 1.00%
ADBE-114 1.00%
BKS-9693 1.00%
DG-38936 1.00%
EMC-2518 1.00%
KLAC-4246 1.00%
BWA-9514 1.00%
SYMC-7272 1.00%
COF-12160 1.00%
R-6326 1.00%
TDC-34661 1.00%
... ...
EL-13841 0.13%
MJN-38084 0.13%
WM-19181 0.13%
ANN-430 0.13%
PLL-6030 0.13%
STI-7152 0.13%
CTL-1960 0.13%
STJ-7156 0.13%
CAMP-1244 0.13%
CTAS-1941 0.13%
TMK-7488 0.13%
CMA-1620 0.13%
DDS-2126 0.13%
WY-8326 0.13%
NOV-24809 0.13%
HOG-3499 0.13%
NWL-5520 0.13%
POM-6098 0.13%
IP-3971 0.13%
BMS-975 0.13%
UNM-7797 0.13%
TEG-8264 0.13%
ECYT-40814 0.13%
XEL-21964 0.13%
MKC-4705 0.13%
IGT-3840 0.13%
PX-6272 0.13%
CFN-38691 0.13%
COV-34010 0.13%
KRFT-43405 0.13%

520 rows × 1 columns
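As a sanity check on the statistics Pyfolio reports, annualized return, volatility, and Sharpe ratio can be computed from a daily returns series directly. A generic sketch on synthetic returns (Pyfolio's exact conventions, e.g. compounding and risk-free rate, may differ slightly):

```python
import numpy as np

TRADING_DAYS = 252  # common annualization convention

# Synthetic daily returns standing in for the backtest's return stream.
rng = np.random.default_rng(0)
daily = rng.normal(loc=0.0002, scale=0.002, size=TRADING_DAYS)

# Annualized return via compounding, annualized volatility and Sharpe
# via the usual sqrt-of-time scaling.
ann_return = (1 + daily).prod() ** (TRADING_DAYS / len(daily)) - 1
ann_vol = daily.std(ddof=1) * np.sqrt(TRADING_DAYS)
sharpe = daily.mean() / daily.std(ddof=1) * np.sqrt(TRADING_DAYS)

print(ann_return, ann_vol, sharpe)
```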