Realised volatility function

Hi,

I am trying to code a strategy that trades at the market close based on the realised volatility of the security over the past X days (say 20).

Security_Exposure(t) = 20% / Realised_Volatility(t-1)

The intention is to have a strategy that tracks the security but targets a realised volatility of 20%.
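For concreteness, the sizing rule can be sketched in plain Python (only the 20% target comes from the post; the realised-vol readings below are made up for illustration):

```python
# Volatility targeting: exposure = target vol / realised vol.
# Realised vol below target levers up; above target it scales down.
target_vol = 0.20  # 20% annualised target, per the post

for realised_vol in (0.10, 0.20, 0.40):  # hypothetical readings
    exposure = target_vol / realised_vol
    print("realised vol {:.0%} -> exposure {:.0%}".format(realised_vol, exposure))
# prints:
# realised vol 10% -> exposure 200%
# realised vol 20% -> exposure 100%
# realised vol 40% -> exposure 50%
```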

I saw the code below for a daily-interval algorithm, but I do not know how to adapt it when I need to trade at the market close.

Thanks,
Brian

import math
import numpy as np

# Get historical price data
@batch_transform(refresh_period=1, window_length=20)
def get_past_prices(data):
    prices = data['price']
    return prices

# Compute historical volatility
def compute_volatility(price_history):
    # Compute daily returns
    daily_returns = price_history.pct_change().dropna().values.T
    # Compute daily volatility
    historical_vol_daily = np.std(daily_returns)
    # Convert daily volatility to annual volatility, assuming 252 trading days
    historical_vol_annually = historical_vol_daily * math.sqrt(252)
    # Return estimate of annual volatility (in percent)
    return 100 * historical_vol_annually

# In handle_data:
#     price_history = get_past_prices(data)
#     vol = compute_volatility(price_history)
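The sqrt(252) annualisation in compute_volatility can be sanity-checked on synthetic data (a numpy-only sketch, independent of Quantopian's API):

```python
import numpy as np

rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0, 0.01, size=100_000)  # synthetic ~1% daily vol

# Daily-to-annual conversion, assuming 252 trading days
annual_vol = np.std(daily_returns) * np.sqrt(252)
# annual_vol should be close to 0.01 * sqrt(252) ~= 0.159, i.e. ~15.9%
```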
9 responses

Hello Brian,

I recommend using the history utility (https://www.quantopian.com/help#ide-history), instead of the batch transform (which will be deprecated).

I took a quick stab at an example (see attached). No guarantee that I did not botch it, but it should get you started. One fundamental question is how to compute the volatility per industry standard. On http://en.wikipedia.org/wiki/Volatility_%28finance%29, it states that the logarithmic returns should be used (see http://en.wikipedia.org/wiki/Rate_of_return#Logarithmic_or_continuously_compounded_return).
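A log-return variant of the estimator can be sketched with plain pandas (toy prices, not Quantopian's history API); for small daily moves the two return definitions give nearly identical volatility:

```python
import numpy as np
import pandas as pd

prices = pd.Series([100.0, 102.0, 101.0, 105.0, 103.0])  # toy prices

simple_returns = prices.pct_change().dropna()
log_returns = np.log(prices / prices.shift(1)).dropna()

vol_simple = simple_returns.std(ddof=0) * np.sqrt(252)
vol_log = log_returns.std(ddof=0) * np.sqrt(252)
# For small daily moves the two annualised estimates are nearly identical.
```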

Note that the code could be improved by keying the volatility to sid (or by maintaining the pandas data structure rather than converting to a numpy array). Also note that history sorts the columns by ascending sid number, so if you add more sids, don't expect compute_volatility to preserve the order of context.stocks (the present ordering of results is just coincidental).

Grant

import numpy as np  
import math

def initialize(context):  
    context.stocks = [sid(8554),sid(33652)]  
    context.day = 0

def handle_data(context, data):

    price_history = history(21, '1d', 'price')[0:-1]  
    if get_datetime().day != context.day:  
        vol = compute_volatility(price_history)  
        record(vol_0 = vol[0], vol_1 = vol[1])  
    context.day = get_datetime().day  
# Compute historical volatility  
def compute_volatility(price_history):  
    # Compute daily returns  
    daily_returns = price_history.pct_change().dropna().values  
    # Compute daily volatility  
    historical_vol_daily = np.std(daily_returns,axis=0)  
    # Convert daily volatility to annual volatility, assuming 252 trading days  
    historical_vol_annually = historical_vol_daily*math.sqrt(252)  
    # Return estimate of annual volatility  
    return 100*historical_vol_annually  

Thanks Grant. I took your code and tried to insert the rebalancing code, but I am getting an error and I can't figure out where it has gone wrong.

I am sorry, it's probably very trivial, but I am totally new to this.

import numpy as np  
import math  
from pytz import timezone  
import datetime


def initialize(context):  
    context.stock = sid(8554)  
    context.day = 0  
def handle_data(context, data):

    #set timezone to EST  
    exchange_time = get_datetime().astimezone(timezone('US/Eastern'))  
    price_history = history(21, '1d', 'price')  
    if get_datetime().day != context.day:  
        vol = compute_volatility(price_history)  
        record(vol_0 = vol)  
    context.day = get_datetime().day  

    #Rebalance at 3:50PM  
    if exchange_time.hour == 15 and exchange_time.minute == 50:  
        order_target_percent( context.stock, 20 / vol )  
    # Compute historical volatility  
def compute_volatility(price_history):  
    # Compute daily returns  
    daily_returns = price_history.pct_change().dropna().values  
    # Compute daily volatility  
    historical_vol_daily = np.std(daily_returns,axis=0)  
    # Convert daily volatility to annual volatility, assuming 252 trading days  
    historical_vol_annually = historical_vol_daily*math.sqrt(252)  
    # Return estimate of annual volatility  
    return 100*historical_vol_annually

Hi Grant,

Doesn't your code only take the volatility of the previous day and use that to extrapolate over the entire year? I may be misinterpreting the [0:-1]

Hello Ken,

You can see from the attached backtest that the last array element (or row in a dataframe) is dropped by applying the [0:-1] suffix.

Grant

2014-03-05 PRINT [ 147.3 144.4 146.3 141.6 138.5 137.2 136.3]
2014-03-05 PRINT [ 147.3 144.4 146.3 141.6 138.5 137.2]
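The same trailing-window slice can be checked with plain pandas (a toy DataFrame using the prices from the printout above, not the live backtest):

```python
import pandas as pd

prices = pd.DataFrame({"SPY": [147.3, 144.4, 146.3, 141.6, 138.5, 137.2, 136.3]})
trailing = prices[0:-1]  # drops the last row (e.g. today's partial bar)
print(len(prices), len(trailing))  # prints: 7 6
```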

Brian,

Here's some revised code that runs, along with a backtest (warning: 30 minutes to run).

Grant

import numpy as np
import math
from pytz import timezone

def initialize(context):
    context.stock = sid(8554)
    # Zero out costs; these calls belong inside initialize, not at module level
    set_commission(commission.PerShare(cost=0))
    set_slippage(slippage.FixedSlippage(spread=0.00))
def handle_data(context, data):

    #set timezone to EST  
    exchange_time = get_datetime().astimezone(timezone('US/Eastern'))  
    price_history = history(21, '1d', 'price')

    #Rebalance at 3:50PM  
    if exchange_time.hour == 15 and exchange_time.minute == 50:  
        vol = float(compute_volatility(price_history))
        target_percent = 20.0 / vol
        if target_percent <= 1:  
            record(tgt_pct = 100*target_percent)  
            order_target_percent( context.stock, target_percent )  
    # Compute historical volatility  
def compute_volatility(price_history):  
    # Compute daily returns  
    daily_returns = price_history.pct_change().dropna().values  
    # Compute daily volatility  
    historical_vol_daily = np.std(daily_returns,axis=0)  
    # Convert daily volatility to annual volatility, assuming 252 trading days  
    historical_vol_annually = historical_vol_daily*math.sqrt(252)  
    # Return estimate of annual volatility  
    return 100*historical_vol_annually

Thanks Grant. It's working properly now. The strategy is now supposed to track SPY, but with varying exposure to achieve a specified historical volatility (in my case 12%). It's not a quant strategy per se, but I know some quant funds apply this technique to their overall portfolio to control (target) its volatility. Notice I put a cap on the leverage: sometimes you get extremely low realised volatility, and you don't want to be over-leveraged.

import numpy as np
import math
from pytz import timezone

def initialize(context):
    context.stock = sid(8554)
    # Zero out costs; these calls belong inside initialize, not at module level
    set_commission(commission.PerShare(cost=0))
    set_slippage(slippage.FixedSlippage(spread=0.00))
def handle_data(context, data):

    #set timezone to EST  
    exchange_time = get_datetime().astimezone(timezone('US/Eastern'))  
    price_history = history(21, '1d', 'price')

    #Rebalance at 3:50PM  
    if exchange_time.hour == 15 and exchange_time.minute == 50:  
        vol = float(compute_volatility(price_history))
        target_volatility = 12.0
        target_percent = min(1.50, target_volatility / vol)
        if target_percent >= 0.0:  
            record( tgt_pct = 100*target_percent)  
            order_target_percent( context.stock, target_percent )  
    # Compute historical volatility  
def compute_volatility(price_history):  
    # Compute daily returns  
    daily_returns = price_history.pct_change().dropna().values  
    # Compute daily volatility  
    historical_vol_daily = np.std(daily_returns,axis=0)  
    # Convert daily volatility to annual volatility, assuming 252 trading days  
    historical_vol_annually = historical_vol_daily*math.sqrt(252)  
    # Return estimate of annual volatility  
    return 100*historical_vol_annually
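The capped sizing rule in the code above can be isolated as a small pure function (the values in the demo calls are hypothetical, not from the backtest):

```python
def target_exposure(realised_vol_pct, target_vol_pct=12.0, max_leverage=1.5):
    """Volatility-targeted exposure, capped so low realised vol
    cannot push leverage above max_leverage."""
    return min(max_leverage, target_vol_pct / realised_vol_pct)

print(target_exposure(24.0))  # prints: 0.5  (high vol -> half invested)
print(target_exposure(6.0))   # prints: 1.5  (cap binds; uncapped would be 2.0)
```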

Brian - you still on here? - blast from the past! I was about to start coding our old vol control stuff also :-)

Hi. I am not getting a result.

I am looking at rebalancing once a month. The objective is to know when to enter (buy) or exit (sell) an asset, or switch assets/re-strategize, though my data is based on Twitter sentiment data.