JAVolS: Just Another Volatility Strategy...

Yes, I love volatility strategies, as most people know, but mostly they are very volatile (duh) and the drawdown is just too much. This strategy is not optimised yet and not hedged, but over the last 18 months it had a drawdown of only 7% and a return of 60%-plus. The catch: the strategy is hard to test because the volatility ETFs are not that old...

But I do think we are onto something, and I invite you to help improve it.

Cheers, Peter


Different mean window: lower beta, higher alpha, higher return.

Nice! Looking forward to playing around with it.

@Peter, is there a reason why you don't implement a trailing stop?

Volatility strategies can be very volatile intraday, and this one relies on fetcher data, which is not intraday, so I'm not sure a trailing stop would be very reliable. It could be implemented on the SPY leg of the trade, I guess. This lag will change once we can use futures to determine the factors (@Q people: when is it coming? tap tap tap).

In general I don't like trailing stops, but that is a personal conviction. I don't like them for the following reasons. Trailing stops make your algo complex, because after the trailing stop is triggered you have to ask yourself: when do I get in again? The trailing stop starts to act as a signal outside the scope of the signals you designed with an economic/behavioural/other reasoning. The other reason is that I'm considered Australian for residency purposes and am therefore forced to use a non-margin account with a T+3 lag, so I don't want to be in a situation where I have sold and have to buy again soon after. But I'm happy to be proven wrong if anyone can design a trailing stop for this algo that behaves properly and doesn't muddle the signals.

If you want to improve returns (and DD) on this algo, just change the 0.49 to 1.0, which will lever the algo to IB limits (if you can trade margin accounts). I tried to use SPXL to increase returns, but strangely that doesn't work; I haven't tried UPRO or levered VIX ETNs. Plenty to play with!

Disclosure: I'm live trading this now and so far it's behaving as expected.

Fair enough, I guess. I'm trying to implement a trailing stop in your algo; I'll post it back here when I'm finished so we can discuss the results :)

I'm paper trading one of my algos; last week it was up 22.1% but now it's down 12.3% (probably something to do with Brexit?). This is what makes me interested in trailing stops; it could also be that this is just one of those dips you need to deal with. Your algo seems to handle it just fine though!

Yup, my vol strategy is in an unprecedented drawdown right now, due to high VIX in the absence of any symptoms in the underlying equity markets. Exciting times.

EDIT: I should add that with mine, I tried trailing stops, but found they didn't really add anything, and they are super tedious to keep track of over splits and in the presence of partial fills. It was by far the most bug-prone algo coding I've ever done, so I abandoned it.

Thanks for sharing. This is the first time I'm looking at a vol trading strategy, so thanks for the comments and links.

Checking from Jan 2011, when VXST started, until Jan 2015 gives a less optimistic curve. Are the variables in the algo optimised for performance over the period in the original backtest, or how did you set those values?

The only hard-coded number is the VIX threshold: the VIX is an expression of volatility, never goes to zero, never becomes extremely high, and reverts to the mean. I haven't optimised the timescale yet, but the rest of the parameters are based on rolling medians or slope crosses.
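For illustration, a minimal sketch of the rolling-median idea, assuming a pandas Series of daily VIX/VXST values; the window lengths are placeholders, not the algo's actual settings:

import pandas as pd

def median_signal(ratios, short_window=5, long_window=90):
    # Smooth the term-structure ratio with a short rolling median and
    # compare it with a longer-term rolling median.
    short_med = ratios.rolling(short_window).median()
    long_med = ratios.rolling(long_window).median()
    # True where short-term fear sits below its longer-term typical level.
    return short_med < long_med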

A while back there were several posts about the forward-looking bias introduced by using fetcher data; I hope that is solved here.

Thank you, Peter Bakker. I have a few questions on the signals:

My understanding is that there are two major signals: the trading of SPY/EDV and the trading of XIV/VXX.
For the SPY/EDV trade, the signal occurs when median(VIX/VXST) is lower than the average of the 90-day median;
for XIV/VXX, the signal to go long XIV occurs when v1/v2 < 0.9 or VIX > 27;

Questions: what is the purpose of context.wait_trigger? The signal to go long SPY when the longer-term VIX is lower than its historical medians may cause a huge drawdown (e.g., August 2011); the signal to go long XIV seems to have the same spirit: VIX > 27, that is, short VIX when the market is risky. Sure, it will be highly rewarded when the market quiets down.


# If the wait trigger is armed and the ratio is still below the threshold,
# do nothing today; otherwise disarm the trigger.
if context.wait_trigger and last_ratio < context.threshold:
    return
else:
    context.wait_trigger = False

# XIV and VXX thresholds
if slope12 <= 0:
    # Intended backwardation test: go long volatility (VXX), exit short vol.
    # (As discussed further down, slope12 = vx2/vx1 never goes to 0 or below.)
    rebalance(context.sidsLongVol, data, 0.49)
    rebalance(context.sidsShortVol, data, 0)
    context.wait_trigger = False
elif last_ratio < context.threshold or vix > 27:
    # Ratio below threshold or VIX spike: go short volatility (XIV)
    # and arm the one-day wait trigger.
    rebalance(context.sidsLongVol, data, 0)
    rebalance(context.sidsShortVol, data, 0.49)
    context.wait_trigger = True
else:
    rebalance(context.sidsLongVol, data, 0.0)
    rebalance(context.sidsShortVol, data, 0.0)

Both signals are based on the VIX term structure, but on different parts of it: vts = (vix/vxst) is the 30-day/9-day VIX slope, while vx1/vx2 is the first versus the second month. The algo gets a signal from the VIX, then trades either (SPY or EDV) AND (VXX, XIV, or nothing).

The wait trigger is something I do more often: I find a signal and wait one day before trading on it, so you can have an "are you sure" routine or an "Oh Sh1t" handle. In this case the reason is different: as you have found, this algo is trying to find the points where the chances are high that the market turns. A high VIX will come down eventually, but the structure needs to show that it is going back to contango and not into deeper backwardation (taking the VIX as the first reference point); as long as the VIX keeps increasing the algo does nothing, and the moment it turns, it triggers.
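As a rough illustration of that one-day "are you sure" pattern (a generic sketch, not Peter's v6 code; context.pending_signal would be initialised to False in initialize):

def confirm_next_day(context, signal_today):
    # Day 1: the signal fires, so arm the trigger but don't trade yet.
    if signal_today and not context.pending_signal:
        context.pending_signal = True
        return False
    # Day 2: the signal is still there, so act on it and reset.
    if signal_today and context.pending_signal:
        context.pending_signal = False
        return True
    # Signal vanished before confirmation: stand down.
    context.pending_signal = False
    return False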

This algo suffers from a lot of problems: timing is one (let it run at different times of the day and you will see why), the use of fetchers is another (it could have forward-looking bias, as the fetcher and the delayed market data could interfere with proper signals), and it has two hard-coded numbers [yuck].

Unless you are very familiar with the VIX: don't let it get close to real money. I'm just publishing this as the v1 version of an algo I trade (I trade v6), and I hope people take this one apart and find a better signal to trade SPY/EDV based on the VIX. Simply put: I believe there is a signal in the VIX that allows one to trade other instruments, not necessarily the VIX derivatives themselves, as they are too volatile for sane people. I might be wrong though.

Don't know if you are aware of this but there are issues with some of these ETNs tracking the underlying even intraday:

http://epchan.blogspot.com/2016/06/some-things-you-dont-want-to-know-about.html

It was maybe 6 months to a year ago that DWTI went down ~20% and UWTI went up ~10%. That was totally lost value for anyone long UWTI. DWTI tracked correctly; UWTI did not. It never adjusted.

We need historical indexes. Backtesting the VIX is a challenge because the VIX quote is by definition already delayed, so if you get the VIX at a point in time you are actually seeing it about 15 minutes ahead of when you should. This would also show up in live paper trading, because the quotes there are 15 minutes delayed. It seems you have to just take the last value from the previous day and rely on UVXY to estimate the actual VIX movement intraday.
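A minimal sketch of that "use yesterday's value" idea as a fetch_csv post-processing step, assuming a date-indexed DataFrame with a 'Close' column (the column name is an assumption):

def use_previous_close(df):
    df = df.sort_index()
    # Push each value onto the next row so that on day T the algo can only
    # read the close published for T-1; the first row becomes NaN and is
    # discarded by downstream NaN checks.
    df['Close'] = df['Close'].shift(1)
    return df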

Is there a reason why I cannot run a full backtest? I am new to Quantopian. Thanks.

Hey Eric, I am having the same problem. New to Quantopian/Python as well. Any luck ?

Thanks,
Tim

what error do you get?

Error msg: "There was a problem starting your backtest. Please try again." I can build the code to see the results but I cannot run the full backtest.

I cannot run it for a full backtest either. Guess it's a temporary Q server bug.

Hi guys, sorry for the inconvenience. Some other users have run into this issue as well. The use of the zipline.utils.tradingcalendar module causes an error in full backtests, but IDE backtests work fine. We will have this fixed soon. Thanks for your patience in the meantime.


Cleaned up the way Pipeline was being used by importing VIX and VXV from the CBOE datasets. Also reduced the number of Fetcher calls since, on closer inspection, the discrepancies between Fetcher and Pipeline occurred in 2014 but stopped toward 2015-2016.


Would be awesome if we could get VX1, VX2 and VXST through the pipe as well...

I love the use of EDV as the anti-SPY leg. I didn't realize there was an ETF with a -0.97 beta that also pays dividends. That seems highly preferable to actually shorting the market with an inverse ETF. Now if only there were a -2.0 beta fund that also paid dividends...

This strategy does well in recent history but poorly if you start at 2013 instead of 2015. In contrast, my incomplete VIX bot, which only shorts the VIX for now, gets 170% over that same timeframe. From 2015 onward to yesterday, this bot does better and also has an impressive lack of drawdown, although examining the buy/sell actions, I find it mostly holds SPY or EDV and seems to day-trade XIV a lot, often with a position held for just a day that then loses money.

Also, I will point out that sending 100% of the money you were going to use to short the VIX into XIV the moment the VIX hits 27 or higher is a bad idea. In a real crash, the VIX will likely stay beyond 27 for many days or weeks on end, with XIV getting depleted to just a fraction of its original value in that timeframe, assuming it doesn't hit the 90%-in-a-day threshold at which the issuer will wind it down. Inverse ETFs are dangerous because of this.
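A back-of-the-envelope illustration of that depletion effect (the numbers are purely hypothetical):

# A daily -1x product compounds its losses when the underlying keeps rising.
# Ten straight +10% days in the short-term VIX futures it tracks would leave
# an inverse ETN like XIV with roughly a third of its starting value.
daily_move = 0.10                 # hypothetical daily rise in the futures
value = 1.0
for _ in range(10):
    value *= (1.0 - daily_move)   # -1x exposure, reset every day
print(round(value, 3))            # ~0.349, i.e. about a 65% loss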

Realize that crap ETFs like UVXY do have circumstances where they make obscene amounts of money, but with regulators ready to step in the moment the market starts to weaken, it seems we may not see that anytime soon.

I would like to see someone figure out a strategy that examines the different VIX ETFs out there and profits off the lag and tracking errors some of them have. We have SVXY/XIV, UVXY/TVIX, VXX/VIXY, but also VMIN and VMAX (approx. 1.2x leverage, but don't market-buy early in the morning) and then the hand-cranked VXUP and VXDN, which, after seeing how badly UVXY was tracking and how quickly it lost value after Brexit, actually seem even more appealing as a way of going long the VIX.

Thanks, James, food for thought. Why don't you share your incomplete VIX bot for us to play with?

BTW: hitting the 27 is dangerous indeed. I want to make it better by making that the trigger to wait until one sees the real volatility abating. Something like: if you hit the 27, wait until the volume of SPY starts decreasing and the price stabilises, then buy XIV. Something like that. I just need the futures in minute mode...

As a professional programmer, I admit my VIX bot looks like a turd, since I've been using the IDE and making a lot of experimental changes. My equations make no sense and are arbitrary. I took a "neural network" approach to this, meaning I created some signals based on UVXY's and SPY's prices, guess-and-check tweaked them until they looked the way I wanted, then guess-and-checked logic based on those signals toward the areas where I wanted it to short or avoid shorting the VIX.

And really, if you wanted to make a VIX algorithm that makes a killing, you would just short sell UVXY all day long. Like seriously, if the S&P 500 is "too big to fail" then that means UVXY is doomed to never be a profitable long.

This algorithm has been updated to support the upcoming changes found here: https://www.quantopian.com/posts/upcoming-changes-to-quandl-datasets-in-pipeline-vix-vxv-etc-dot

Please view the 'Migrated Source Code' section and update your algorithm with the code found there.

Peter, I deployed this to my paper trading account on IB, and it has yet to make any transactions. Have you seen similar behavior? Looking at the backtests, it was transacting daily.

I found the same, and that is because fetcher does not have the same behaviour in live trading as in backtest (= bug). I rewrote part of my algo to use just the Pipeline VIX and VXV data. The vx1 and vx2 signals you have to drop or calculate based on the real volatility. Hope that helps.

I have posted a workaround to the fetcher backtest/live bug, so that it works the same in both. I use fetcher vix/vxv in live trading.

@Simon - Is the work-around updated in your code above?

It's not in this thread, it's in one called "history for fetch_csv" or something like that. I've referred to it about once a month for a year. :)

Peter, which one did you update? First post or the reply? Also does that tie in with the migrated source from Seong? Little confused – going to dig into it a little further as soon as I can make the time.

Would be great to see a diff on the platform, a la github. As it stands it's difficult to see a) what actual lines of code changed from the original source to the migrated version and b) if an author updates a post, which lines were changed.

Sorry Ian, I didn't change it in this thread yet; it's on my to-do list. Best if someone can get Simon's workaround into the algo and repost it.

Gotcha – I'll have a look into it. Thanks!

Hi Ian,

The migrated source that I've posted is not related to the work that Peter has done. The migration I posted reflects an API change we made recently in how volatility & macroeconomic data sources are accessed through Pipeline.

Thanks Seong, I thought Peter had updated his algorithm, and I wasn't sure if it included the API updates or not. Still [feature request], it would be nice to see a diff between the original and migrated source.

Hi all, I am live trading with this algorithm at the moment with a capital base < $10,000. I have been running it for around a week and a half now and so far no trades have been made. Is this fairly normal? Thanks in advance for any help!

Daniel, as Simon said above, there is a workaround needed for using fetch_csv in live trading, so it is normal for your algo to not make any trades; the values for the VIX futures probably come back as NaN.
I really, highly advise against live trading this algo unless you fully understand the logic behind it. It is a pretty decent strategy but also a dangerous and volatile one. Additionally, this algo, as is, needs some work to be ready for live trading. Anyhow, as others have said, proceed with caution.
The workaround Simon mentioned is, I believe, here.

@Peter, thanks for sharing this. The backtest suggests your algo generates a 30+% annual return with a 10% max drawdown between 2012 and today. The 3:1 ratio is a very nice VaR level.

For fetch_csv and live trading, the official doc says: 'When Fetcher is used in Live Trading or Paper Trading, the fetch_csv() command is invoked once per trading day, when the algorithm warms up, before market open.'

Can anyone help me understand exactly what these csv files are doing? I understand that they are a history of something, but of what?

vx1Url = 'http://www.quandl.com/api/v1/datasets/CHRIS/CBOE_VX1.csv'  
vx2Url = 'http://www.quandl.com/api/v1/datasets/CHRIS/CBOE_VX2.csv'

Since the URL appears to have a person's name in them (CHRIS), and I'm not clear on what they are or how they get updated, should anyone be hesitant to rely on them?

Historical price, volume, and open interest for CBOE VIX Futures #1 (VX1) and Futures #2 (VX2).

@joseph: these files are "direct" from the CBOE via the Quandl platform. You can also get them through the CBOE site directly. The preference would be to include these in the native data feed; @SeongLee can maybe tell you if or when this will happen. I have rewritten the algo so that it doesn't need those files. It's possible to do, but it needs some creativity in estimating the long and short vol based on the price and volume of SPY. Alternatively you can mimic it with VXX and VXV.

Anyone updated this to trade live?

Checked the dataset at Quandl; the most recent date is 8/8/2016. Could anyone suggest where we can get the latest data, e.g. on 8/10 the algo gets 8/9 end-of-day data? Thanks!

Live Trading with Robinhood
Solved: fetcher fixed, leverage made compatible with Robinhood.
Issues: Low performance as initial capital increases.

Code could be cleaned up and made more Pythonic.
Room for improvement in performance

Peter, thanks a million for sharing this trading strategy as a stepping stone!

Have you tried the VIX/VX45 for the implied volatility term structure? VX45 being the weighted 2nd and 3rd futures. If you have, would you be able to shed a little light on (in words, not expecting code) how one could create the VX45 from the 2nd and 3rd future? I'm not sure I fully understand the weighting scheme.

Hi Samuel,

Thanks a lot for fixing the live trading. I have a QQ. In shift_dataVX method, why did you do "add_last_bar" twice?

Thanks,

@Will Lu
The VX1 and VX2 contract datasets are updated at ~11am, whereas the algorithm runs at ~9:30am.
Therefore on the 12th the data available would be from the 10th until 11am, when the 11th's data becomes available.

I chose to keep the algo set for market open rather than change to 11am as overall performance is better.

@Samuel Oyebefun
Ah, I see. That makes sense! Thanks a lot!!

@Andrew Martin
The F1/F2 weighting should give a 30D constant maturity calculation of the VIX futures.
F1 weighting = #days to expiry divided by total #days in F1 position (=20 for Aug)
Note this is trading days, and needs to account for public holidays.
You could easily extend this for F2/F3 to get 60D CM (not sure why you're interested in 45D in particular?)

In any case, not sure how useful it will be, as there wouldn't be many cases where there is a healthy contango for 30D CM whilst 60D CM drops below that - in which case you're risking over-parametrising...
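A small sketch of that weighting as I read Tom's description (day counts are trading days net of holidays; the prices and counts below are made up):

def constant_maturity_vx(f1, f2, days_to_f1_expiry, days_in_roll_period):
    # Weight the front contract by the fraction of the roll period left,
    # and the second contract by the remainder.
    w1 = float(days_to_f1_expiry) / days_in_roll_period
    return w1 * f1 + (1.0 - w1) * f2

# e.g. 8 trading days left of a 20-day roll period (the August cycle above)
print(constant_maturity_vx(14.5, 16.0, 8, 20))   # ~15.4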

Tom,

Peter put a link in his source code and I did some reading on their strategies (www.godotfinance.com). Their most up-to-date strategy has the most success using the VIX:45D signal out of the many (shorter-term vol / longer-term vol) signals they tested. I 100% agree, though, that this is likely just overfitting historical data.

I'm confused why you are comparing slope12 = (vx2/vx1) <= 0. This value can never be negative unless vx1 or vx2 were negative. That's not possible, is it?

Good catch; it is a relic from my code that calculated the ratio and transformed the result with *100 - 100. In the current version the logical value would be 1, where we test for contango/backwardation. I have taken the latest code from @Samuel and replaced it with a 1... it doesn't make a difference.

Luke, I don't see slope12 going negative. Yes, slope12*100 -100 goes negative (when slope12 < 1)... but that isn't used in the logic. It's only used for recording. Things could definitely be cleaned up. You could write out this code in 40% of the lines; so go for a clean up, it's a good way to learn the code. But like Peter said, this is his v1 and he trades v6. You can tell there is functionality he was obviously curious about but maybe abandoned or is using in later versions. Maybe share the 'fixed' logic before chalking it up to luck?

Made a few logic changes regarding the booleans which has resulted in improved performance and metrics.
Cleaned up the code slightly, will continue working on improvements as I feel the strategy has great potential.
Next objective would be to implement a downside protection model to reduce the max drawdown.
Source: A Simple Downside Protection Model
@Peter I would also like to implement the logic in addition with a value-momentum strategy that longs 25/S&P500 stocks monthly. This in theory will produce a high alpha (volatility strategy) and increase diversification (hedge: value-momentum strategy) .

@Luke Can you post the backtest of "When you take away that bug (feature?) it does not function as well"

I have noticed what you described; I think the issue lies in the availability of funds from sales. The strategy sells all other stock and then attempts to purchase the favored stock. The funds from the sale become available sometime during the day when the sale is filled, and it's impossible to estimate when. Therefore, to use those funds the logic would have to change entirely: it would have to close positions at market open and use some test to determine when those funds are available before entering new positions during the day.
What I described might resolve the anomaly you mentioned but may inadvertently hurt performance.
The anomaly could also possibly be fixed with the use of a leveraged account; keep in mind that Robinhood isn't leveraged.
Feel free to test and post results and feedback.

@Samuel, or @Peter, what are your thoughts on what this algo would do during a downturn (something on the order of an '87, or a 2001, or a 2008)?

Adam,

It did well in the Aug 2015 mini-crash but did not respond well to Brexit on June 23rd, 2016. It also did OK during the 2011 panic. I think it could go either way currently, because there are no protections against huge drawdowns and nothing stopping it from being in XIV during a spike in volatility.

Methods I've tried to lower drawdown:
- Kelly criterion at the account level and on individual stocks: was able to decrease drawdown to ~15-20% but gave up two-thirds of the returns
- Stop loss: this helps a little, but the main drawdowns occur over multiple trades so it doesn't do much. I would recommend using wide stops as a circuit breaker anyway, just because you're trading volatility.
- Trailing stop: horrible performance

The key, I think, will be to dynamically adjust the money you are risking based on performance and swap the excess into another strategy with almost no drawdown/volatility. That's what I'm working on at the moment. I'm a really bad programmer though...

Thanks for the info Luke. Super interesting.

No doubt the research into these ratios is extremely compelling. I currently use the VIX/VXV ratio, but with a 300-day SMA instead of the hard numbers from the previously released research. Has anyone used an SMA rule instead of the proposed 0.95/1.02? Thanks for the response; I'm also a part-time home trader, so excuse my lack of clarity.

I have tried that, but I found the theoretical justification lacking. The argument for 0.95, for instance, is that it is the minimum VIX contango you are willing to accept in return for the jump risk. Since the (futures!) contango has an actual mechanical effect on the return of the ETPs, what you care about is an absolute amount, not a relative one.
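To make that concrete, a rough sketch of an absolute roll-yield measure one could threshold instead of the ratio (the 21-trading-day roll period is an approximation):

def daily_roll_yield(f1, f2, trading_days_in_period=21):
    # Positive in contango (F2 > F1): roughly what a short-vol ETP harvests
    # per day from rolling F1 into F2, all else equal.
    return (f2 / f1 - 1.0) / trading_days_in_period

print(round(daily_roll_yield(14.5, 16.0), 4))   # ~0.0049, i.e. ~0.49% per day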

Mine got crushed during Brexit too, mostly because the volatility complex spiked but there was no measurable impact on the actual equities indices.

@Peter thanks for sharing such a solid algorithm. I improved the date shifting; it is much simpler and more robust to shift the index by time, e.g. df.tshift(1, freq='b'). The original shift moves the data back one row and loses the last tick. This one adds one more date row and shifts the data forward. VX1/VX2 are still one day behind; I'll see if it works tomorrow in paper trading.

tshift will fail for holidays?

(In live trading)

@Simon, good point, fixing... I found the trading_day freq in https://github.com/quantopian/zipline/blob/master/zipline/utils/tradingcalendar.py

from zipline.utils.tradingcalendar import trading_day  
df = df.tshift(1, freq=trading_day)  

@Simon, this is the first time I'm going for live trading. This algo has at least a 5-day warm-up for IVTS, and another 22 days to get a signal; are you getting around that in any way?

I use the trading calendar yes.

@Peter, thanks for providing yet another high-quality algo.
@Samuel, I ran both your latest algo and Peter's original one starting from 2016; the transactions are totally different. Could you please explain why?

The historical data download happens only once, during initialize, not before trading starts every day as I would have expected. Is this a mistake or am I missing something?

In backtest it's once, but in live trading it's on every trading day (see the docs). In live trading you won't see anything in the first few days because the signals rely on pushing into lists of at least 5 and 22 values. I had to find a way not to rely on the previous tick to build current signals, by loading the entire history and calculating when needed. I had to take out the pipeline and use the load_csv hack to maintain historical data.

Quoc Le, did you ever get the holiday problem fixed, or do I just need to change lines 171/177 to this:

df = df.tshift(1, freq=trading_day)  

@Luke I am using that. yes.

Good discussion, guys, and great returns. However, I've always found in my own development of volatility strategies that black swan events kill them, quicker than you will ever be able to react. If you're going to trade them for real, then skim your excess returns and don't compound. If you can get 30% non-compounded returns, then the 60% drawdown that happens once every 10-15 years is more than made up for by the returns in between.

If anyone wants to have a go at backtesting this before XIV and VXX existed, here's a good source for reconstructing their values. It's most likely accurate (it can't be proven, of course, but there is no reason to think otherwise, as the calculation is mechanical). If you're too lazy to run the code, download the values here: https://blog.thinknewfound.com/2014/02/reconstructing-volatility-based-etps-and-some-observations/

Keep up the good work

Hi all,

I am a volatility trader (live) and just found this website and thread. I am looking to make my trading automatic and avoid consulting the markets every day, as well as to contribute my experience to this discussion.

I consider this algorithm very promising and maybe I can add some value to the discussion. I am not a programmer (although I know some basics) but am willing to learn. In my manual trading I use three basic signals that define how I position myself (long XIV / long VXX), although I tend to be less comfortable when long VXX...

  1. Ratio VIX/VXV: long XIV when the ratio > 1.06; long VXX when the ratio < 0.99
  2. Contango Oscillator (VIX-45+1000*(VX2/VX1-1)): long XIV when CO > 50; long VXX when CO < 20 (sometimes I do not enter long VXX if VIX is high, > 20)
  3. VIX momentum: when VIX spikes more than 6% in a day, close any long XIV positions (see the code sketch after this list)
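For reference, a sketch of how those three rules might look in code. The thresholds are the ones stated above, but the ratio orientation, the "VIX minus 45" reading of the oscillator, and the way the rules are combined are my own interpretation; the helper is purely illustrative.

def manual_signal(vix, vxv, vx1, vx2, vix_change_pct):
    ratio = vix / vxv                          # rule 1 ratio, as stated above
    co = vix - 45 + 1000 * (vx2 / vx1 - 1)     # rule 2, Contango Oscillator
    if vix_change_pct > 6.0:                   # rule 3: VIX spiking, step aside
        return 'close_long_xiv'
    if ratio > 1.06 or co > 50:
        return 'long_xiv'
    if ratio < 0.99 or (co < 20 and vix <= 20):
        return 'long_vxx'
    return 'no_trade'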

Maybe point 3 of my manual strategy could be included in order to reduce the drawdown of this algo.
Currently I am long XIV. I think volatility traders should be open to higher drawdowns than normal, due to the nature of this style of trading. Nevertheless, in the long run contango is our friend, and since VIX futures are in that state more than 85% of the time, it is expected to recover the big losses. The million-dollar question is how to maintain profitability and reduce drawdowns. My suggestion is to take a balanced portfolio approach, using both SPY and TLT in some indefinite zone of the trading signals, instead of a 100% volatility approach. I am researching this at the moment, but still have no answers. This would compromise profitability a bit for an expected drawdown reduction; a trade-off depending on trading style.

Any comments appreciated.

I run a live model, but based on slightly different contango/vxv approaches and realised vol EMA.

With respect to other portfolio elements, I found (as a crude result) that being 3x short S&P futures when VIX < 12.8 works well, as it:
1) reduces the risk of drawdown, since going from 11 to 20 is much worse than going from 13 to 20; and
2) incorporates mean reversion; although there are some long bull runs, historically such low levels tend to revert in the short term, and 3x allows you to capture the gains and the interim roll yield.

If you are able to short 3x S&P bull ETFs, that will be better still due to the volatility drag - in the UK I can do this cheaply with futures (but 100% margin), but it doesn't seem as easy to do in the US...

I've looked at TLT as a general holding (rather than with specific trigger points) and did not find it to be profitable.

I found this article an interesting read, as it does touch on TLT amongst other hedging alternatives.
https://www.caia.org/sites/default/files/2013-aiar-q1-comparison.pdf


Indeed. Thanks for sharing the research.
What I think we could add to the current algo is some protection against a VIX increase when there's a long XIV position open (or long SPY). That could be either a big spike in the VIX (I use 5%-6%) or an SMA signal, as a smoother option, to sell that open position. I am a manual trader, so I tend to have a bit of emotion in my trading and do not stick to exact rules. That's why I am trying to find a way to automate.

Since I am new to this, can you tell me if this website (and the script) can be tested in a demo account but with live/delayed prices?

Trying to understand the algo in more detail, I found in line 126:

XIV and VXX thresholds

if slope9 <= 0:
    rebalance(context.sidsLongVol, data, 0.49)   # VXX (long volatility)
    rebalance(context.sidsShortVol, data, 0)     # XIV (short volatility)
    context.wait_trigger = False
    context.buy_V = 1

My question is: slope9 never goes to 0 or below; it is defined as the VIX/VXST ratio!
Shouldn't it be 1 instead of 0?

Hi guys, great discussions. Looking quickly at PeterSalda's model: indeed, his rule 3 could be a good addition.

Some people asked me how I have adjusted the model to take less risk: I never have more than 33% of the portfolio value in XIV or VXX, and when it holds no XIV or VXX, I just put 33% in SPLV. So with only 33% exposure I beat the market... Now I just need bigger balls to put a bigger number into the 33% parameter. For me this algo is about "harvesting" the extremes without taking too much risk... so far so good. Looking forward to the improvements you guys cook up!

It's a shame that they don't allow CSV values in the contest.

The attached analysis is on the version of this algo that I've just started live trading. I've only added risk management and a complementary strategy that kicks in while the volatility strategy is doing nothing (which is most of the time). I didn't change any of the actual signal logic. I tried Peter Salada's three techniques and none of them were profitable by themselves. Even combining them with the original strategy in a voting system only reduced profits or increased drawdown.

The thing about this that bugs me is that I think these volatility strategies are done; the market has compensated for them already. Something I've been noticing in the markets recently is that they seem to be working backwards: rising on bad news and vice versa, with VIX numbers that don't make any sense. It fits with the returns from all these volatility strategies, where most of the gains come from pre-2013. Recent big events such as Brexit or Aug 2015 are almost always missed by these algos, because something or someone is manipulating the numbers so they can't see it coming, leading to huge drawdowns.

Regarding Lazar's post, I must say that I continue to believe in volatility-based strategies (especially in the long run) due to the contango state holding the majority of the time. Volatility (VIX) is also mean-reverting in its price behaviour. I feel confident in shorting volatility, although with the risk of unusual drawdowns. That's why we are here: to discuss the best way to allocate cash in order to get the maximum from the market, how to reduce the drawdowns and volatility of the assets, and how to be more profitable than a certain index! All contributions are valid, even though there can be a little frustration when things are not exactly as planned initially.
If it was easy...

In order to improve the algo, here are my comments:
line 38: context.hedge = symbol('IWM') # RUSSELL INDEX - this probably came from a previous iteration but is not used, as far as I understand;
line 84: vxv = context.vixpipe.loc[symbol('VXX')]['vxv'] - although it extracts the VXV quotes, they are not used;
line 89: slope12 = (vx2/vx1) is also computed but never used;
line 127: if slope9 <= 0 - as commented in a previous post, but substituting a number > 0.5 does not affect the results;

Comments appreciated

Can someone please explain this strategy to me and/or send me a paper about it?

Could someone please explain what happened the week of August 24? There appear to be huge profits that week.

@LeCoque: how is your version trading?

Change of threshold to 0.80 and set start date to 2014-01-01

I can't see how slope12 = (vx2/vx1) can be negative or 0. Should the algo go long volatility if vx2 > vx1, i.e. the ratio > 1? In its current state, the algo never goes long vol.

 #XIV and VXX thresholds  
    if slope12 <= 0:  
        rebalance(context.sidsLongVol, data, 0.49)  
        rebalance(context.sidsShortVol, data, 0)  
        context.wait_trigger = False  

I have mine set to vix/vx1 > 1.10

If I change the settings to go long VXX if vx2/vx1 < 0.9, i.e. slope12 < 0.9, it decreases the Sharpe from 1.51 to 1.29...

More interestingly, if I change the settings to just close the XIV position without going long VXX, the Sharpe increases to 1.73! (see attached backtest)

I was not satisfied with the fixed values for VIX in the above strategies, so I wanted to do something different. VIX can reach higher values in stronger downtrend formations. I used a different approach, which works fine for the last four years, but in the end it is still not good for downward trends, so don't use the algo for live trading; it could, however, be some support for manual trading.

So, my approach is not very quantitative; I wanted to understand the behavior of the market through volatility and to give some ideas.

I tried to identify entry and exit signals for the XIV ETF. I don't see any point in using the SPY ETF instead (XIV has a beta of 2), unless you want to lower volatility. I was looking for spikes in different oscillators and found that ivts (vix/vxst) is the most appropriate (the ratio between short-term and long-term fear). The problem with slope12 (vix2/vix1) is the singularities near futures expiration; the contango oscillator has the same problem. But we should remember that vix1 and vix2 are traded futures and are included in the XIV ETF, so they have their importance. So somehow we should integrate calculated and traded volatility to get better signals (an idea for further development).

The algo still has quite big drawdowns because it is difficult to find proper entry or exit signals; sometimes we enter too soon, sometimes we don't exit when we should. When a market correction comes in more than one step, we enter too soon, because ivts already reaches its minimum at the first step. Signals for exit strategies are not very clear, but I believe that with further development we can improve that. We cannot use the same logic for going short VIX and going long VIX. Corrections are quick, so we shouldn't be late entering and exiting a long VIX position (we cannot wait for a spike, so instead we use fixed values, which I don't like).

Instead of the VXX ETF (long VIX) I used the VIXY ETF, which is a short-term long-VIX ETF. The problem with VXX is the already-mentioned singularities at futures expirations, which can have quite a negative influence on the ETF price, but those same singularities help the XIV ETF do better, especially in a phase of slow growth in SPY.
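A rough sketch of the spike idea described above, using a z-score of ivts over a short window (the window and threshold are illustrative, not the posted algo's values):

import pandas as pd

def ivts_spike_entry(ivts, window=22, z_entry=2.0):
    # ivts: pandas Series of daily VIX/VXST values.
    mean = ivts.rolling(window).mean()
    std = ivts.rolling(window).std()
    z = (ivts - mean) / std
    spiked_yesterday = z.shift(1) > z_entry    # short-term fear spiked...
    rolling_over = ivts < ivts.shift(1)        # ...and is now easing off
    return spiked_yesterday & rolling_over     # candidate XIV entry days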

Hi Borut,

Thanks for the algorithm! I'm new to volatility trading and just tried live paper trading your code and am getting errors with NaN numbers on ln 130.

Hi Borut,

Your returns seem very good at first glance, but the beta is very high (> 1.0). This means it depends on the market trend very heavily. See my backtest from 2008-01-01 to the end of 2012.

And one can also see that from 2008-01-01 to 2011-03-20 there are no trades at all.

I would like to say that it seems nobody has done a backtest over a time range covering 2008, when there was the big crash because of Lehman. If you do a backtest covering that period, you will find the picture of the returns looks quite different.

Thomas

@Chang: Maybe because these instruments were not available? Or you think we are all dumb as cheese?

If you look at your own ivts graph and read the code, you can actually see that this strategy would have made a boatload during that event. That event was special, but it does not dismiss this type of strategy! Try to add value to threads instead of making empty statements, appreciated!

PB

@Peter,

The code I use/copied is from Borut Janša. I haven't changed any code. But as I see the beta is > 1, I doubt it will do well during a bear period. So I set the backtesting period to cover 2008, and what I get is what you see above.

I'm not saying your algo is not good. I just want to mention that if one puts this live, one has to test it in both good years and bad years. Am I right?

@Thomas

Dude, the VXST is not available before 1/3/2011. Take a look at your data.

@Thomas
As mentioned in my post, the algo is not good for automated trading. I am aware of the bearish-market issues, but with some development (for example, oscillators on the SPY index) I believe it is possible to improve it to avoid the drawdown in 2011 or in similar situations. You should understand the algo first, and then you will understand the meaning of beta > 1: XIV itself, which is mainly what is used, has a beta of 2, for example, so more than the algo.

@hur hur
I copied the code from the original algo, so you will most probably find the solution to your problem in previous posts. I haven't tested it for live trading.

@Peter Bakker
Thank you very much for sharing such a nice algo. I have one question: in the migrated code, line 99,
context.ivts_medianfiltered.append(ivts)

I do not understand why it appends ivts; should it be:
context.ivts_medianfiltered.append(ivts_medianfiltered)

I assume context.ivts_medianfiltered is the list of median-filtered ivts values.

I changed VXST (9-day vol) to VXV (93-day vol) in Samuel Oyebefun's algo from Aug 21, 2016, and it again produced outstanding returns for the last 5 years, but it has been in drawdown for the last few months.
My concern is that the algo relies heavily on bonds as a safe-haven asset. The historical correlations might have changed recently. Any idea how to modify the algo to choose a safe-haven asset based on the highest negative correlation in recent history? For example, the algo would check gold, oil, and Treasuries and choose the one with the lowest correlation and best returns.
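As a starting point for that idea, a sketch that picks the candidate with the most negative recent correlation to SPY (the candidate list and the 126-day lookback are assumptions, and the "best returns" tie-break from the post is left out):

import pandas as pd

def pick_safe_haven(prices, candidates=('TLT', 'GLD', 'IEF'), spy='SPY', lookback=126):
    # prices: DataFrame of daily closes with tickers as columns.
    rets = prices.pct_change().tail(lookback)
    corr = rets[list(candidates)].corrwith(rets[spy])
    return corr.idxmin(), corr.min()   # hedge with the lowest correlation to SPY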

During the last 5 years, most of the returns generated by the algo came from EDV. The Treasuries-versus-stocks correlation might change due to the tightening cycle in the US economy.
Just FYI, attached is the same algo as above, but instead of using EDV it goes to cash.

That would be a good build: a dynamic hedge based on the lowest correlation.

@Dawei: good catch. The median scores better (as expected).

Attached: the ivts.

Attached: the ivts_medianfiltered, a bit better.

http://www.vixstrategies.com/strategy/

Check out this VIX strategy

Thank you for sharing! But I have a few questions on the signals:
What does the index ivts stand for? And why did you choose the 22-day mean of ivts + 0.01 and the 22-day mean of ivts - 0.01 as the upper and lower bounds?

@ Peter

Firstly, many thanks for sharing this, very interesting and educational! I just ran this over the last 7 months, and the latest implementation is ~15% down while the original is 50% up. Any insights on why?

I have abandoned any live trading with Fetcher, as somehow the real world did not match up with how the fetcher behaved, primarily because of timing issues: you are not only dependent on Fetcher and when it pulls the data, but also on when the provider updates the data. So I discarded all these fetcher algos and will wait till Q has futures, as those will solve most of that.

Anyway: I don't know why those two versions are so far apart... Looking at the code, I still like the version from Jun 9, 2016 the most, as I know there is no meddling, and running it now would give you an out-of-period result.

@Peter Bakker

I encountered similar issues to yours when running live. The Quandl WIKI feed was always 2 days behind. I ended up building my own data processing outside of Quantopian and automatically uploading it to Dropbox every night. I wrote the process to build a synthetic VIX30, as described by Chrilly Donninger (HeroRats), with the CBOE data. I also found that the CBOE file was always 1 day behind, so I extracted the most current VIX contract settlement and filled it in as part of the computation. The last trick was to pack and unpack the entire price history due to Quantopian's limitation on the fetcher. All the work paid off; the algorithm is doing really well. I'm happy to share additional info.

@Quoc Le: great effort, and I would love to know more details, as I know I would have plenty of algo ideas if this were solved. I was just waiting for the futures stack and now trade very shielded algos (barbell-type algos, as described by Nassim Taleb in Antifragile).

@Quoc Le: could you share your formula for Chrilly's VX30? I tried to replicate it and the VX45, but they never came out right.

@Peter Bakker the performance of your last posted algorithm goes down wildly with a higher initial capital (try with $1M). After a quick reading of the code I was unable to determine why this would be... is this expected?

goes down wildly with a higher initial capital (try with $1M)

Price drift from a higher volume of partial fills, no doubt. Incrementing a counter each minute a stock has a partial fill (and for each stock), with $10K the count is around 586,000. At $1M that number is 2,469,639, so the prices had quite a bit of extra time to venture into unappealing territory. About 60% of those were on the buy side, by the way. Try looking through any open orders in handle_data to cancel at least any buy order where the price has moved too far for comfort from the price when ordered, and store those prices when placing the orders in a context dictionary. When it comes to sells, you might just have to let them run. In fact, if a sell is not complete by end of day (eod), you might want to renew it the next morning. That eod thing happens once on a sell with EDV at $1M, and 4 times for buys.
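A sketch of that idea with Quantopian's built-in order functions (the 2% tolerance is arbitrary, and context.ref_prices would be an empty dict created in initialize):

def place_buy(context, data, stock, target_pct):
    order_target_percent(stock, target_pct)
    # Remember the price we were willing to pay.
    context.ref_prices[stock] = data.current(stock, 'price')

def cancel_stale_buys(context, data, tolerance=0.02):
    for stock, orders in get_open_orders().items():
        for o in orders:
            if o.amount <= 0:               # leave sell orders alone
                continue
            ref = context.ref_prices.get(stock)
            if ref is None:
                continue
            if data.current(stock, 'price') > ref * (1 + tolerance):
                cancel_order(o)             # price has drifted too far to chase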

@Ryan, Blue is correct. As these securities are low-volume, you suffer from slippage and from not being able to fill. Lots of algorithms at Q have low capacity, and that is fine; it means they are not interesting for the hedge fund... We can still make money with them ;)

Looking good. I am going to test it out a bit and see if I can make some useful changes. Let you know if I find something!

I have the following strategy live, Lisa.

Is any work still going on with this?

Peter are you still trading this live? How has it performed?

It was a bit volatile last week but came out slightly positive. I'm still trading a variant of it (I never publish the trading version, always one of the previous versions). In the past 6 months it did about 25% according to the taxman, but that has some Aussie dollar influence in it.

The UVXY purchase from 8/11 didn't show up in the backtest and I've been tearing my hair out trying to figure out why. Any ideas, Peter?

The backtest behaves a bit differently from live trading because of the pre-filling in initialize. If you want the exact behaviour, you have to run the backtest twice: once to generate the numbers, then put the numbers in and run it again. I have not coded a dynamic initialisation of the variables, as I don't need that ;)

BTW: I don't trade a version with UVXY; I have replaced it with IEF, as UVXY needs really good timing and was responsible for all the drawdowns...

That worked perfectly, now the backtest matches! Thank you! I'll certainly check out IEF soon.

@Quoc Le, how do you calculate VX1 and VX2 given the delay in Quandl? Do you pull data from CBOE and then upload it to Dropbox?

I scraped CBOE site for the latest price, then merge back with the historical prices, then do the VX30, VX 45 weighted calculations.