The Gold King And His Knights

There are a multitude of tweaks to the "Bottom Feeder Bandit". Now this is dubbed, "The Gold King And His Knights".
It seems to do extraordinarily well except when the stock market keeps hitting all-time record highs. I'm guessing here. I didn't "check my facts". This one shows great potential though.


Found some major bugs in my logic. More thorough backtesting to follow. But the fixes are already in here.

Charles,

I have not looked at the logic of your algo or assessed its returns, but I am a big fan of the naming conventions. Every algo deserves a good title....

Ok, so here is a backtest similar time span as the original. I gotta research what is fundamentally different starting Jan 2013. I suspect that gold does horrible after that point because of the stock market doing so well. It's also interesting that something about the mechanics of my original flawed logic was causing noticeably better results during the time before Jan 2013. Things that make you go hmmmm.

I have made a lot of tweaks here. Big improvement, I think. Still considering what to do about April 2013 forward.

Ok, a huge improvement here. Room for more improvement obviously.

Charles ,

If you replace GLD by TLT you will get The Bond King And His Knights.
Inspired by Anthony FJ Garner post.
Who is the King of Kings?

Nice Vladimir! I haven't checked yet. Is that all you did was swap the GLD for TLT? Wow, either way.

Ok, here is my latest version. Another huge improvement. I swapped the GLD for IAU IShares Gold. This one works even for small portfolios because of the smaller price and the mechanics of my algorithm.
Other improvements are:
Virtual trailing stop loss
Randomness in each rebalancing's stock picks (other than the gold one)
Adjusted the take profit target to 12.5%. Others which might work are 3% or 25%. So far I think the 12.5% may be about right.
Rebalancing happens only when triggered and never while at a loss
While at a loss, holdings are sold at close to cost basis, and rebalancing does not happen... no new purchases
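The take-profit and no-rebalance-at-a-loss rules above can be sketched in plain Python. This is only an illustration: the names (`decide_action`, `TAKE_PROFIT`) and structure are mine, not the algorithm's actual code.

```python
# Illustrative sketch of the rules described above; names and
# structure are hypothetical, not Charles's actual variables.

TAKE_PROFIT = 0.125   # the 12.5% profit target mentioned above

def decide_action(cost_basis, current_price):
    """Suggest an action for one holding under the rules above."""
    gain = (current_price - cost_basis) / cost_basis
    if gain >= TAKE_PROFIT:
        return "take_profit"    # sell at the 12.5% target
    if gain < 0:
        return "wait"           # at a loss: hold near cost basis, no new buys
    return "rebalance_ok"       # flat-to-positive: rebalancing may trigger

print(decide_action(100.0, 113.0))  # 13% gain -> "take_profit"
print(decide_action(100.0, 95.0))   # at a loss -> "wait"
```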

I have some ideas for more tweaking. But technically this is my first successful algorithm to beat the S&P500 from 2010 forward. So I shared it.

Ok folks, this is a huge improvement. Check this out!
I made some more tweaks.
I changed the factor of the rebalance trigger.
I changed the conditions under which Wait is triggered.
This is my best one yet! Hopefully, the results are not somehow skewed.
I don't think that I am guilty yet of overfitting.

Charles
Would you indulge me and set out briefly the thinking behind your algo?

I am running the algo back to 2002, but I am not sure whether slippage, commissions, and the borrowing costs of shorting are accounted for. Also the handing back of dividends on short positions. I suspect not, but perhaps certain defaults are automatically included in the Q code.

Hopeless:

Something went wrong. Sorry for the inconvenience. Try using the built-in debugger to analyze your code. If you would like help, send us an email.

One thing's for sure. It doesn't scale, yet, for larger portfolios. I'll work on that. I have an idea how to make it scale.

I just now added "do not sell until at least the next day" in order to not attempt pattern day trading transactions which otherwise causes excessive number of rejected sell attempts. The end result performance is very close to the same as before, but should have way fewer rejected orders.

I'm not an expert investor. So my ideas are perhaps naive, if not uninformed.

Gold tends to do opposite of market during downturns. So I have a relatively small weight of gold at all times to counter downturns. I noticed that gold also seems to do well at least some of the time that the market is going up.

I made the one best performing stock of my pipeline the "king" part of the gold king. The idea is the best performing stock may have momentum and may continue. I will, at some point, make this scale to portfolio size.

Under current configuration, there are at least 23 worst performing stocks that we attempt to invest in. The idea is that if we are invested in a large enough number of worst performing stocks, a profitable number of them will bounce back up at least temporarily, long enough to take profit. I will, at some point, make this scale to portfolio size.

When I sell a stock in my positions that is no longer in my favorite list, I sell only 20% of it per day. I noticed that often downturns are reversed within 2 to 3 days. So the 20% per day does not completely liquidate my positions for 5 days and allows for at least a partial recovery of the market for that stock.
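The 20%-per-day scale-out described above can be sketched as standalone Python. The function name and share-based framing are my own illustration, not the actual algo code:

```python
# Hypothetical sketch of the "sell 20% of the position per day" rule:
# sell a fifth of the ORIGINAL position each day, so full liquidation
# takes ~5 days, leaving room for the 2-3 day rebounds mentioned above.

def scale_out(shares_held, fraction=0.20):
    """Return the per-day sell schedule until the position is empty."""
    daily_sale = max(1, int(shares_held * fraction))
    schedule, remaining = [], shares_held
    while remaining > 0:
        sell = min(daily_sale, remaining)
        schedule.append(sell)
        remaining -= sell
    return schedule

print(scale_out(500))  # [100, 100, 100, 100, 100]
```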

Funny story! 8)
I was in the doctor's office with my wife discussing her chronic pain / trigeminal neuralgia. My subconscious must have been still working on the algorithm, because, all of a sudden, I had an epiphany: "else stop wait" will make the algorithm behave like I was expecting it to. Sure enough, gigantic improvements. Then, just now, I decided to make an attempt at making the algorithm scale to whatever portfolio size you throw at it. Well, $100,000,000 keeps up with the benchmark, but that's not good enough obviously. However, the change in code to make it scale to portfolio size also bumped the speed and results even higher for the smaller portfolios. I suspect it does very well even for large portfolios, but I'm not sure yet at what size the results start dragging down to benchmark level. I'll keep working on it.

Charles,

I'm starting to dig into this code as I'm really curious, as I'm sure a lot of others are. Just a quick comment/question:

And anyone please correct me if I'm wrong... but 'make_pipeline' only gets called once (in initialize). It would appear that you have variables within 'make_pipeline' that you intended to change before trading each day. This is not the case, and as a result, 'HighVar', 'LowVar', 'TopVar', and 'BottomVar' are fixed at static values, which I don't think you intended.

I don't yet know how this influences your intended strategy.
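James's point can be illustrated outside the platform: any value computed inside a factory function that is only called once stays frozen, no matter how many "days" later reuse the result. This toy sketch (hypothetical names) mirrors what happens to parameters built inside `make_pipeline` at initialize time:

```python
import random

# Standalone illustration (not Quantopian code): make_pipeline() runs
# once, so a "random" parameter computed in its body is frozen for the
# whole backtest rather than refreshed each trading day.

def make_pipeline():
    low_var = random.randint(1, 10)   # computed exactly once
    return {"LowVar": low_var}

pipe = make_pipeline()                # as if called from initialize()

# Simulate several trading days: the same object is reused, so the
# "randomized" parameter never changes.
values = [pipe["LowVar"] for _day in range(5)]
assert len(set(values)) == 1
```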

It's initialized each trading day

Filthy Robinhood peasant here: How does this algo perform going long only? The results so far look really good! Nice work Charles!

James - are you sure? This is where I lack an understanding of the pipeline functionality in general, but I believe I am right. If I hard-code values for the 4 previously mentioned variables, I get the same stock picks and overall results as Charles' original algo

Delman, this is a long only strategy as is.

I'm not sure. It would seem reasonable that it's only initialized once in a backtest.

Charles - I swapped IAU with SPY to just throw a wrench into the theory. Things didn't change much from the 'too good to be true' results of your last post. I'll continue to investigate when I find time, but I'm just your average hobbyist that can get by with Python.

Here's the long only version. I think the bug in his randomization is irrelevant other than it might explain why it is sensitive to the amount of money in his account.

Like seriously, this feels like the first episode of Silicon Valley when the guys at Hooli are looking at Richard's compression for the first time and saying "I don't think he realizes what he has".

What's crazy is that this still has room to squeeze in some other algorithm in the areas where it uses low leverage to get even more money.

Can someone from Q hide this thread from the public? If this isn't "give a contract to the man" material I don't know what is. This long only version seriously went a year and the max drawdown was 2%. It didn't go above that until halfway through my backtest. Charles deserves credit for what he's created.

Hi guys,
Thanks for the nice and interesting comments! Way back close to when I first started messing with Quantopian algorithm stuff, I had one backtest that showed outlandishly awesome results similar to this. The difference was that, upon digging, I discovered that it was treating me as if I had an extremely leveraged margin account. That is not the case here. I've been through each line of code here. It feels real. Enough that I put another couple grand of my own money on the live trade. Before, it only had about 380 dollars. I've only been live trading it, in effect, for 2 days so far. The first day, I noticed a lot of rejected pattern day trader sells, so I modified the code to avoid attempting those. Yada yada yada ... Anyways, I hope this is real. It has a nice "I built it" feel to it! I have lost so much freakin sleep building this thing, I feel like a zombie half the time. It does feel surreal.

Whether or not the randomization attempt on my part is actually working, I'm not sure yet. My logic for trying it, is that I don't want my algorithm to get tunnel vision and look at the same group of stocks day after day after day. Seems like too much chance of disaster that way. This way, it hopefully recovers, after bad days or after some of my stocks go bankrupt or whatever.

So, a bit of a huge request here knowing how much work you've already put into it, but if you get time and don't feel like tweaking the 'meat' of the algorithm, is there any chance you might be able to comment line-by-line or logic by logic so that newer coders such as myself might be able to get a better grasp on what's going on with the algo and how it ties together?

With that in mind, if you feel like keeping the logic to yourself, that's totally reasonable as well. You've built an outstanding algo here.

This is so interesting. Was not expecting such a high Sortino and Sharpe ratio

The algorithm does need cleaning up and probably more commenting as well. If I don't have more itches to fix with this, then I will get around to clean up and better documentation. I'm losing my job in mid October. Hopefully, this will provide some real income.

Running a backtest starting in 2006, there is an insane jump starting around late 2008. Any idea why this might be? Also, I'm sorry to hear about you losing your job, and wish you luck in your next job search.

[Will attach a backtest once its done generating]

I don't mind, per se, sharing the algorithm. This feels like it could work for the little guy. Also, if the ideas get out there enough, hopefully it will be available for learning, or getting started. I mainly am trying to build a better one that works for my particular circumstances. I like how this algorithm tends to have at least some cash all the time. So if I need it, I can withdraw it without causing havoc with my portfolio. My goal is to have a large enough portfolio that occasional withdrawal of cash for my needs will not affect it.

Also, I am an audiobook producer / narrator. It would be lovely to be able to spend my "work" time doing that instead of what is otherwise necessary to pay the bills. I enjoy audiobook creation, like it is art. I am way behind on my current audiobook contract. The author is being extremely patient with me. I feel intense motivation to create another source of income. This feels a bit surreal, but maybe it will work. My money is already in the live trade. I find myself having to resist the occasional urge to manually sell at a profit and wait and see if the algo does it for me.

My background and specialty is automation. That may have helped me with this, maybe.

The automation more than likely has a lot to do with the success of the algo if you've already got the experience of 'thinking' like a machine, so to speak.

An itch to scratch in Q's code would be to not include cash deposits as part of the "Returns" and "Dollar P/L". If you have auto deposits going into your brokerage account, they are getting included as if they are profits.

Charles,

Have you had a chance to compare the fill prices you received in live trading in the last two day against the fill prices reported by the backtester for the same trades? I would be really interested to know how that shook out.

Thanks

Frank

That would be useful information. I don't think it would be a clean comparison yet though. There was at least one glitch during that time where the live trading algo was delayed. I think sometime in the next 7 days that comparison should be able to be made cleanly. Maybe I'll be a few hundred dollars richer by that time as well. 8) The first day made 8% with 30 dollars profit off of about 377 starting cash. ...
Here's an interesting exercise in "what might have been"?
377 invested starting on the below date till 07/21/2016
07/01/2016 7.7%
06/01/2016 14.6%
05/01/2016 9.6%
04/01/2016 34.0%
03/01/2016 44.6%
02/01/2016 110.9%
01/01/2016 170.9%

Attached is the backtest I mentioned earlier. Everything looks ho-hum until 2008. What happened that put this algo into Wolf of Wall Street mode?

Also, after looking at some of the daily logs, and I think you mentioned this earlier, there is a TON of day trading going on. Lots of buy and then sell for a $0.001 gain.

Holy #?!# Batman ----- 93,705% return -----
I hope this is for real! I tried to make this algo compound returns quickly, but those results are... I don't know what to say!

I think that the algo 'might' be selling stock which it doesn't currently hold. Also, lots and lots of 1 penny day trading. I might try and go through line by line tomorrow evening and see if I can pick out where that's coming from, because after a quick glance-over, it looks like it should only be rebalancing on the first day of the week. With that being said, as it stands you reach the pattern day trading limit almost immediately within the first few trades.

Delman, I think it has to do with some lines that were uncommented in the algo. Some orders are being placed in places they should not be. I am going to be looking at this code more closely tomorrow too to see exactly what's going on. This is definitely interesting.

Seungmin, definitely let me know what you find. I'll be going through it as well. More eyes should help speed the tweaking process along significantly.

After you mentioned the day trade stuff, I looked in my live trade logs for yesterday. Sure enough, there were what appear to be "get out" sell orders that were being rejected as pattern day trades.

The only place in the code that I see that could be happening is possibly the "get out" code itself.
Here is the fix:
if context.NextDay:
    order_target_percent(stock, 0, style=LimitOrder(Curr_P * GetOutFactor))

I added it to the code and am testing it now.

By the way, I like the effect this change will have. It gives at least a day for the stock to be in profit or at least not in loss. So "get out" now waits at least till the next day just like "take profit" does.

Let us know how the backtest goes and if this changes things significantly.

Unfortunately it is difficult to trust the backtest result when the universe selected by the algorithm is low volume stocks.

Please note that the base universe is created using:

AverageDollarVolume(...).percentile_between(LowVar, HighVar)  

but LowVar is always 1 and HighVar is always 10. This means the universe is using the lowest volume stocks available and those are the stocks that are less reliable to be backtested with Quantopian. It is very difficult to model the order impact on such low volume stocks. Simon Thornington taught me something once:

If you are seeing results that only work when trading very low volume
stocks at high frequencies, you need to be very careful that you are
not just trying to do mean-reversion on meaningless bid-ask bounce.
Unfortunately, there's no way to really test this with Quantopian, but
one of my main findings in the process of creating a quote-based
tick-frequency backtester (in Scala) was that most of my basket
mean-reversion profits were just untradeable bid-ask bounce. I
mentioned this once in
https://www.quantopian.com/posts/inaccurate-max-drawdowns-on-no-trade-days-slash-periods
with a handy diagram: http://imgur.com/1WPMwFD

To double check if the algorithm is affected by this kind of problem I ran a backtest with LowVar = 40 HighVar = 50, but you might try higher values.
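For intuition, here is a plain-Python stand-in for the percentile screen being discussed. This is not Quantopian's implementation, just a toy model showing why LowVar=1 / HighVar=10 means the thinnest-traded ~10% of names:

```python
# Toy model of percentile_between(low, high) on average dollar volume
# (illustrative only; Quantopian's actual implementation differs).

def percentile_between(dollar_volumes, low, high):
    """Keep names whose dollar volume ranks in the [low, high) band."""
    ranked = sorted(dollar_volumes, key=dollar_volumes.get)
    n = len(ranked)
    return set(ranked[int(n * low / 100):int(n * high / 100)])

volumes = {"S%02d" % i: (i + 1) * 1000 for i in range(100)}  # toy data
thin = percentile_between(volumes, 1, 10)    # the original 1-10 band
mid = percentile_between(volumes, 40, 50)    # the comparison band above
print(sorted(thin)[:3])  # lowest-volume names: ['S01', 'S02', 'S03']
```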

Latest change cuts the total return from 93,705% to 33,104.9%. Still very very awesome and I believe it is probably avoiding day trades altogether.

Interesting thing happens June 17, 2007. The MaxPV spikes to 2, much higher than expected at that point. Looks like a historical or market pricing glitch. It caused longer than typical wait for the next rebalancing because the "get out" factor was much higher than typical. March 2009 is where that "bat out of hell" rocket climb begins that you mentioned earlier.
I like the 50 minute wait between possible actions. You end up with first action of day at 50 minutes after opening and 40 minutes before closing... when markets are calmer. That's my theory anyways.

Luca, thanks for the response and explanation! The results you posted look pretty bleak.

Luca, it's using limit orders so I'm not sure we can just discount low volume stocks. That said, taking a better look at the code, it seems it would have defaulted to moving windows using values of 1 and 10 and then looked at the bottom 10% by dollar volume. LowVar and HighVar affect the universe AND the moving window lengths. You changed the moving window lengths it used in addition to the universe it looked at.

James you are right, thanks for pointing it out. Here is a backtest where only the universe is changed and not the moving windows (the same backtest resulted in 288% returns with low volume universe).

Please note I am not saying the algorithm doesn't work in live trading (because I don't know), I am just saying you cannot trust the backtest results with low volume stocks, they are difficult to model.

Here's with the universe being the stocks in the 20 - 90% percentile by dollar volume. Moving window is still 1/10 like it would have been.

I just spot-checked some of the stocks on my original backtest and it looks like it's definitely a penny stock trader, but the volumes are high enough that it could work on a small account. These stocks have more than a couple transactions per day. It does seem like this would be totally unsuitable for the fund though.

This one is kind of funny. It's using 10 - 30 percentile and is long only.

Clearly it got lucky. Max drawdown 25%

@ Luca, great catch in reviving Simon's insight. I remember reading that last year but it apparently didn't stick.

Do you think paper trading can lead to sobering results (on the low volume stocks), or would one still see meaningless gains? Obviously these low volume stocks are not of much interest to Q

I'm going to rework the algo a little bit tonight using the info that Simon gave us. Personally, I tend to trade (manually) relatively low volume and low price stocks, so if I can replicate that using this algo, there shouldn't be an issue. I have yet to run into the issue via manual trading where an order could not be filled.

Another issue I see is that the algo basically gives $100 per stock and as your account size increases it just adds to the number of stocks it holds until it's basically a broad market fund.

An optimal value would probably be based on daily volume, with it favoring quality over quantity

Ok guys, I made some significant changes in the contested pipeline code. I also made some significant tweaks to the rebalance code.

I dub this one, "The Gold King And His Knights Reboot Edition" Hint: If you don't like it. Reboot!

This one scales way way better. And the algo runs faster too! I am currently running a $100,000,000 portfolio against it. It is currently trailing the benchmark. But I believe it will excel way beyond the benchmark given enough time.

I ran a five year backtest and did some real quick analysis of the result. Still a lot of penny stocks mixed into the bunch. You might want to try setting a minimum stock price of something like $10 and seeing how that skews the results. I don't think any sane person is going to want to buy something trading for less than a dollar.

Ok, I changed the short moving average and long moving average to be fixed. My favorite seems to be 3 and 45.
And I jiggered with the "Get Out" code to trigger at a lower leverage.

I have a couple of full 10 year backtests in progress.

Algorithm initialized variables:
BottomVar 10
TopVar 1
LowVar 15
HighVar 45

I just ran another 5 year 1k backtest. This time, there's a pretty noticeable jump (also present in your backtest, Charles) relating to a purchase of $HERO on 2015/08/21 at $0.06 per share which is then sold on 2015/11/11 at $9.44 per share. I tried poking around Barchart and Yahoo Finance, but I was not able to verify if this was legitimate stock data or not.

For those who are interested/more dedicated than I am, the stock can be found using HEROQ or Hercules Offshore, Inc.

Algorithm initialized variables:
BottomVar 100
TopVar 10
LowVar 66
HighVar 96

I'm posting this for the most part without comment. I have some ideas to test.

Algorithm initialized variables:
BottomVar 100
TopVar 10
LowVar 15
HighVar 45

I added a $1.00 price filter and for testing purposes fixed the LowVar 15 and HighVar 45.

Algorithm initialized variables:
BottomVar 100
TopVar 10
LowVar 15
HighVar 45

This is a nice algo; I think the only issue is the high beta. Whenever the market goes down (e.g. 2008/2009, 2015) so does this algo. I guess that is the problem with being stocks only - there is a lot of market risk.

If we could maybe switch to SH or do some sort of hedging when a bear market is sensed, I think it would help the draw down a lot.

Mohammad, if it were possible to achieve a CAGR of 56%, who would care about a 24% drawdown? The question is: is it?
Same question as this other thread

I'd love to believe it possible......but....

The problem is not so much the amount of the drawdown, but the length of time between peaks. When the system is losing money for 2 years during a recession - no investor can be expected to keep having faith in the system.

There is no guarantee the system will turn around and become profitable, so they will cut their losses and stop the system way before 2 years.

The same is true of the S&P 500, which is itself a systematic way to invest. There is no law of physics which states that the US stock market's upward bias will continue forever. In fact it is most unlikely. A two year period between peaks is nothing in systematic trading.

The deeper point is "does Charles' system really work or is it a figment of back testing?" If he has discovered some genuine market truth then......whoopee! If not, then shucks, there goes my daughter's new pony.

It makes sense that low volume stocks would be an open opportunity that had been missed. The first thing you do when making algo is throw them out since you can't use them for entering a large position.

Regarding Charles Witt's most recent post and its attached algorithm: What is the justification for these settings?

    set_commission(commission.PerTrade(cost=0.00))  
    set_slippage(slippage.FixedSlippage(spread=0.00))  

If you remove these settings, the algorithm underperforms the benchmark.
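For reference, restoring cost modeling instead of the zero-cost overrides looks something like the snippet below. The specific parameter values are my recollection of Quantopian's documented defaults at the time, so treat them as an assumption and check the platform docs:

```python
# Quantopian-platform config fragment (not runnable standalone):
# replace the zero-cost overrides with the standard cost models.
# Parameter values are assumed defaults; verify against the docs.
def initialize(context):
    set_commission(commission.PerShare(cost=0.0075, min_trade_cost=1.00))
    set_slippage(slippage.VolumeShareSlippage(volume_limit=0.025,
                                              price_impact=0.1))
```

The volume-share model matters most here: it caps fills at a fraction of each bar's traded volume, which is exactly the constraint that bites on the low-volume universe this algo trades.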

A two year period between peaks is nothing in systematic trading.

It depends who you ask. In my books, an algo that has a two year period between peaks is worthless.

I dont classify all systematic trading as the same. To me, an ETF is very different to an algo because there are additional risks involved in deploying an algo.

For the commission, my reason is that I use Robin Hood brokerage which has zero commissions. Maybe there is a reason to not use zero in the backtest, I am not currently aware of it.

For the slippage, I am only using limit orders. Not being an expert, I understand that to mean that slippage is avoided because I am not using market orders.

I may be wrong, but the partially filled orders from large orders do not necessarily hurt my long term results. The unspent money from a partially filled (or completely unfilled for that matter) order is available for the next day's attempt to enter a position at a discount.

slippage is avoided because I am not using market orders.

Have you done any research into how Robin Hood turns a profit? Limit orders can be exploited, if you can front-run them.

Have you tried the strategy with real trades? A few dozen trades might be enough to include a rough model of the speed of execution you're getting in the algorithm.

I would like to try this out in real time trading soon. Been messing with it by changing around numbers and tweaking things here and there but the only thing that matters is if it works in the real world. I will also be using Robinhood for this. Trial run will start tomo. Wish me luck.

I am live trading an older version of my algo that does not filter out stocks priced under $1.00, and I think it is using the 1 - 10 low volume universe (I think). I actually have used my experience with that to adjust my algorithm sometimes. For example, the rejected pattern day trades for my attempted sells. I adjusted the code to avoid selling the same day that they were bought. It is still possible to attempt to enter a position later in the same day that I had exited it earlier.

It does, so far, appear to be doing what I told it to do - namely getting in at or below limit price and getting out at or above limit price. At this point, I'm not sure how Robin Hood's methods of reaping profit are affecting my results. I suspect that, at worst, my live trading returns will be worse than the backtest, but still profitable to me personally.

In effect, I've only been live trading with a fairly recent version of this algo for a few days so far.

It's odd that slippage would be a factor if you're using limit orders. Does it result in fewer orders executing?

James: Yes, limit orders can be front-run in such a way that you're less likely to get filled if the price movement favors you.

https://www.reddit.com/r/investing/comments/4uczkd/front_running_and_limit_orders/

UPDATE: I like the current version enough that I'm gonna use it for live trading. For my personal live trading, I removed the price filter and did 1 and 45 volume universe.
Algorithm initialized variables:
BottomVar 100
TopVar 10
LowVar 1
HighVar 45

I have some ideas that I'm testing to improve the performance for high volume stocks. Maybe if that works, I'll dub that version, "The Gold King Gets High". 8)

Quantopian doesn't model front running. These are also relatively small orders on the side of the market that is often ignored (which is why they are low volume)

If slippage affects the results of small limit orders on low volume stocks it would seem the slippage isn't being modeled correctly.

I think the only way to be sure is to live trade with real money

In my books, an algo that has a two year period between peaks is worthless.

I agree that it is not a pleasant prospect but you will find very few hedge or other funds which manage to avoid this. In the long term, almost certainly none.

@Delman - looks like that $HERO stock went into bankruptcy and then emerged in Nov 2015. I'm not sure how the shares were restructured or if the algo correctly valued its shares, but it seems to have held on through the entire Chapter 11 procedure.

It bought the shares on 8 Jul 2015, the company then went through Chapter 11 and the stock was delisted. It then emerged with a new loan and started trading again in Nov 2015. I'm not sure if 1 share pre-Chapter 11 was equivalent to 1 share post-Chapter 11 - I was not able to find the data online.

@Charles

re: "UPDATE: I like the current version enough that I'm gonna use it for live trading. For my personal live trading, I removed the price filter and did 1 and 45 volume universe.
Algorithm initialized variables:
BottomVar 100
TopVar 10
LowVar 1
HighVar 45"

Does this version work with RobinHood? I notice the code calls set_long_only(), but it also seems to take quite a few positions from day to day. Have you seen RobinHood flag it for day-trading?

In practice, I've seen the RobinHood pattern day trade protection work very well. And I've used my RobinHood live trading experience to adjust the code. For example, day trades are 99% avoided. It is still possible for it to attempt to buy the same stock it sold earlier in the trading day. If that happens frequently enough, I'll go ahead and add code to avoid it. But it does wait till at least the next day before it tries to sell a stock it just bought.

Keep in mind that a large percentage of the buy limit orders are cancelled at end of trading day. That's by design. The orders that do get filled are sometimes the low price of the day from what I've seen.

Charles, any update on the reboot live trading performance?

Ok, I made some small changes that appear to have big improvements in efficiency and returns.

Orders are now never cancelled, except at end of day. So no more cancelled "Take Profit" limit sells except at end of day.
During live trading, I saw one stock at 144% profit get its limit sell order cancelled, apparently by the rebalance code.

StopWait now effectively starts fresh, to properly give the stocks a chance to come alive again.
The only thing that keeps it in Wait mode is a continually downward price trend, which is as it should be.

I found a cash balance logic error. Fixed it.

Charles,

That spike at mid Nov-2015 (and actually half your returns all together) are from 'HERO'. No idea what this is and if the data is good/bad, but attached is a back test that filters it out in the pipeline. Still a solid looking algo for low capital.

Andrew, I think that with $HERO filtered out, that is the most accurate backtest to date.

I made a comment on $HERO previously. Although it was unusual, it is actually valid.

See my previous comment for more info.

I'm fairly new to quant-driven trading.

@Charles, or anybody else who knows, do you have any more info on the theory behind this algo? Maybe a paper somewhere or just a brief overview of the theory behind it?

Adam,

I'm just a guy who wrote an algorithm for trading that seems to do extraordinarily well with low volume (less popular) stocks. If you read the whole thread, you will notice that I was not targeting low volume stocks on purpose. In fact, I initially thought that it was "randomizing" to some degree the selection. That has mostly been proven to not be true. The original code was selecting from the 1 to 10 % volume universe - the lowest of the low, in other words, of volume traded stocks.

My uneducated guess at why this works so well:
1. the very low price stocks allow, in the first place, for a larger basket of stocks, especially for smaller portfolios
2. because the volume traded is so low, the price is much more volatile, which plays well with the mechanisms in my algorithm

Point 1 may be completely irrelevant, because I've tested with $100,000,000 with the high volume basket of stocks, and it still did not get anything close to similar returns as low volume stocks do.

The algorithm has evolved, eventually arriving where we are now.

As far as, "Are the results real?", that remains to be seen. In my live trading for about one week now, I have seen very similar patterns, compared to the back tests. No huge returns yet, but definitely similar patterns.

Good catch Mohammad, I read your post but apparently it didn't stick with me.

Adam - I'm also interested in the theory as I don't understand it as well. I'm also not sure if Luca's previous point about the bid-ask bounce is impacting this algo. I think that's where live or paper (would paper trading address that?) trading would help solidify the logic. I understand the logic but don't see the theory. The gold component doesn't appear to be too significant... I'm waiting for someone with some more knowledge to provide some insight on what's going on. But maybe this is a valid strategy for a low cap universe. That would certainly be nice, I hope Charles makes some money on his work

@Adam Chavez, I'm sitting here with Charles. Here's a quick and dirty insight:

  1. This algo trades in stocks that are small in value. Losses are not huge, so no major issues.

  2. These 'bottom feeder' stocks have, mostly, nowhere to go but up. Charles is using the 100 worst performers. The chances that all 100 of them will tank in the same time period is very small, so the algo will make money for the most part. He's also using the 10 best performers over a rolling 45-day period which provides him with a solid base for security.

Mostly, though, think of this from a statistics point of view. Most data sets - stocks included - can be accurately displayed on a bell curve. Now, the vast majority of the data is in the central 90% of the curve, and those are the stocks this algo seeks to invest in. The stocks that are outliers at the edges of the curve are shaved off by the algo's limits.

Also, since a lot of the stocks are low-priced, the sheer volume of stocks that will be bought by the algo kind of hedges its bets: it simply buys enough different stocks that the chance of all of them tanking is minimized.
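Serge's description of the selection above (the 100 worst performers plus the 10 best over a rolling 45-day window) can be sketched roughly like this; the returns dictionary and the counts in the example are illustrative only:

```python
def select_candidates(returns_45d, n_worst=100, n_best=10):
    """Rank by rolling 45-day return; take the n_worst 'bottom feeders'
    plus the n_best performers, per the description above."""
    ranked = sorted(returns_45d.items(), key=lambda kv: kv[1])
    worst = [sym for sym, _ in ranked[:n_worst]]
    best = [sym for sym, _ in ranked[-n_best:]]
    return worst, best

# Toy example: 6 fake tickers, pick the 2 worst and the 1 best.
rets = {"AAA": -0.40, "BBB": -0.25, "CCC": 0.05,
        "DDD": 0.30, "EEE": -0.10, "FFF": 0.60}
worst, best = select_candidates(rets, n_worst=2, n_best=1)
print(worst, best)  # ['AAA', 'BBB'] ['FFF']
```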

It's a heroic effort in any event. Congratulations.

It is FUN. Fun to trade, and huge fun to make those sorts of returns if it works. Who cares if it is for the small-account hobbyist? If it does work, you can trade it up from tiny capital, and that is really the object of all small traders - to make a fortune from nothing. And nothing wrong with that if it can be done.

Hope it works for you.

Very curious results. I hope someone gives us an update to their live trading in a couple of months.

Edit: Is this basically a mean reversion algorithm? Buy very low and sell high!

Apologies to the live traders of my algorithm. I got this deployed minutes before market opening this morning. But just now posting to this thread.

The "start fresh" change I mentioned above plus what I changed overnight with the gold stuff seems to have improved results demonstrably.

I've been studying the patterns of IAU gold. It seems that 2% is the typical maximum change up or down in a day, so the 6% discount get-in order was never getting filled. I changed it, for the gold only, to a 2% discount get-in and a 1% take profit. Take profit is 12.5% for everything else.
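As a sketch of the limit-price arithmetic just described (the 6% non-gold entry discount is my assumption carried over from the earlier posts; it is not restated here):

```python
def limit_prices(last_price, is_gold):
    """Entry discount and take-profit target per the tweak above:
    gold gets a 2% discount entry and a 1% take profit; everything
    else gets a 12.5% take profit (entry discount assumed to be 6%)."""
    if is_gold:
        entry = last_price * (1 - 0.02)
        target = entry * (1 + 0.01)
    else:
        entry = last_price * (1 - 0.06)
        target = entry * (1 + 0.125)
    return round(entry, 4), round(target, 4)

print(limit_prices(11.63, True))   # (11.3974, 11.5114)
print(limit_prices(2.00, False))   # (1.88, 2.115)
```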

By the way I prefer gold over bonds because:
not only does gold tend to go up when the stock market goes down
but gold, a surprising amount of time, also goes up when the stock market goes up

Notice, with the last 2 changes mentioned, that the max drawdown is now below 10% and most of the time below 5%.
Also, the drawdowns are smaller both horizontally (length of time) and vertically (size).

1 million percent return? Good lord, man. There needs to be a reality check!

I would say the final numbers are not a good way to look at it, because they are too big and can be misleading. I find the best way to look at long backtests is their CAGR - which is about 165% for the above algo.

I wish Q would just display the CAGR in their stats.
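CAGR is easy to compute by hand from a cumulative return. A quick sketch (the 10-year span is an assumption for illustration, which is why the number comes out a bit different from the ~165% quoted above):

```python
def cagr(total_return_pct, years):
    """Compound annual growth rate implied by a cumulative percent return."""
    growth = 1 + total_return_pct / 100.0
    return (growth ** (1.0 / years) - 1) * 100.0

# A 1,000,000% total return compounded over 10 years:
print(round(cagr(1_000_000, 10), 1))  # 151.2
```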

Ok, sorry it took so long to implement this behavior. It is logical.

Take Profit sell will cancel any existing order for that stock. So now Take Profit sell takes priority.

Charles
I have put a lot of work in over the last few days and will post an algo eventually. My research so far in different market environments is that neither gold nor the best stocks are adding any value, so I have stripped them out. I am also cleaning up my version, since there seem to be some unused features which, for me at least, cloud my understanding of the basic concept.

I should hasten to add I have no experience whatsoever of penny stocks or of trading in this timescale. The big, big question is whether the use of limit orders is going to avoid the perils of spreads which can approach 100% - bid 1c offer 2c.

I have no idea at all whether paper trading would answer this question - I have to assume not, because it is better to err on the side of caution. If I could open a Robinhood account and start trading this I would. It's the only way to get the answer. And with $1,000 to $10,000, who cares!

Sadly, until such time as RH opens in the UK I guess I am stuck.

In any event thank you for all your efforts and well done indeed on the concept.

Incidentally, I find Quantopian mind-bogglingly impressive and have nothing but admiration for the incredible framework they have provided. My only real frustration is the inability to export spreadsheets of the information I really want to drill down into.

I know it can be achieved in code and logic, but I would dearly like to see such statistics as % winners / losers, R-squared of the equity curve, length of trades, length of winners, length of losers, CAGR and so forth.

I guess I'll just have to get off my butt and write the code.

I intend to try it in a few days with Robinhood. It should be a good test of Robinhood's efficiency, although I kind of doubt this was what they had in mind when they started the company. In my backtests the "best" stocks did seem to help a little. I did not see it trade any gold. Maybe it should be renamed "The Soaring Sparrow and the Bouncing Cat".

Ok, in the process of trying to make it somehow work in the contest, I ended up radically changing certain portions of it. A big change is the "get out" behavior. Also, you'll find that the 10 year backtests are much slower now. I also changed the gold code to allow it to play a bigger role.

This is the version I am live trading on RobinHood with. It still needs to be thoroughly tested. I noticed some sell orders getting rejected unexpectedly. Not sure how significant that is yet.

Thanks Anthony and others for your testing. I figured if I keep posting my changes and the backtests, it would gradually get better.

By the way, in the interests of full disclosure, I have experienced, apparently, one delisting (BINDQ), in my live trading. Not sure what the end result will be, but I had about $90 in that one. There is a buy or sell button for that one, not sure what would happen if I tried to sell or buy it. Quantopian tries to sell it as BIND and it gets rejected. In the RobinHood phone app, it says, "This stock is not currently listed on an exchange.".

"By the way, in the interests of full disclosure, I have experienced, apparently, one delisting (BINDQ), in my live trading."

@Charles, I think that's the fatal flaw of the algo. I'm starting to see that Quantopian really doesn't handle liquidity risk well.

Does anybody know if Quantopian's backtesting engine even looks at delisting? Or does it only trade stocks that are still currently being traded?

@Adam - I agree.
It seems Q keeps the value of the delisted stock as last quoted - see HERO in my previous comments.
Tracking a delisted stock is really complicated - often it goes to OTC with a "Q" on the end. So HERO becomes HEROQ.
Delisted stocks usually lose a lot of liquidity. In the real world, it takes balls of steel to keep holding on to delisted stocks as this algo does.

In backtesting I believe that delisted stocks cannot be sold, so they accumulate in your portfolio and tie up cash.

Here is some code to remove them, although it does not subtract their value like it should (if you actually lost the money, as you probably will):

# Exit soon-to-be-delisted stocks
if (stock.end_date - get_datetime()).days <= 3:
    order_target(stock, 0.0)
    log.info("Closing delisted %s" % stock.symbol)

I wonder if there are a lot of delisted stocks in a long backtest. It would significantly affect your return, since delisted stocks in Quantopian are automatically converted to cash.

Greg,
That code looks useful. How effective has it been so far in your backtests or live trading? It looks like it attempts to get out of a position if it is within 3 days of delisting.

I guess I'm wacky. 8) I'm waiting to see how many of these actually affect me and what the final outcomes are. I'm certain I'll get more concerned as more money gets tangled / lost in these.

Some "news" on BINDQ:
" On July 2, 2016 BIND Therapeutics, Inc. (BIND), a biotechnology company developing targeted and programmable therapeutics called ACCURINS®, announced it has filed a motion for court approval of a stalking horse asset purchase agreement bid from Pfizer Inc. (PFE) for the purchase of the majority of BIND’s assets.

The agreement is the initial stalking horse bid under Section 363 of the U.S. Bankruptcy Code, to be followed by an orderly auction process as established by the U.S. Bankruptcy Court. Under terms of the agreement, Pfizer has agreed to acquire substantially all of BIND’s assets for approximately $20 million in cash subject to certain price adjustments. Pfizer has also agreed to assume certain contractual liabilities of BIND.
"

20 million shares outstanding.

So about $1 / share less liabilities. That's probably where the otc price comes from. What did your algo pay?

"I wonder if there are a lot of delisted stocks in a long backtest. It would significantly affect your return, since delisted stocks in Quantopian are automatically converted to cash."

Do you have an algorithm handy which demonstrates this? Seems like a potentially significant bug.

It's not a bug but a decision by the developers: https://www.quantopian.com/posts/delisted-securities-now-removed-from-portfolio
You have to be careful when backtesting penny stocks. I've attached a backtest that buys Aeropostale stocks before they went to OTC. You can see the cashback.

I would recommend something like Greg Rollins example. Liquidate your stock before it is delisted. For live trading, I guess you just gotta watch your stocks like a hawk and manually liquidate when the delisting announcing comes up.

Thanks.

"20 million shares outstanding. So about $1 / share less liabilities. That's probably where the otc price comes from. What did your algo pay?"

98 shares average cost $0.9273
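Putting Peter's back-of-envelope numbers together with the fill above (all figures are quoted from this thread):

```python
# Gross recovery per share implied by the ~$20M bid and ~20M shares outstanding:
bid_cash = 20_000_000
shares_outstanding = 20_000_000
gross_per_share = bid_cash / shares_outstanding
print(gross_per_share)  # 1.0 -- before liabilities, so actual recovery is lower

# Exposure at the reported average cost:
shares_held, avg_cost = 98, 0.9273
exposure = round(shares_held * avg_cost, 2)
print(exposure)  # 90.88
```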

The 5 year backtest keeps timing out apparently. Here is 2006.

"It's not a bug but a decision by the developers: https://www.quantopian.com/posts/delisted-securities-now-removed-from-portfolio
You have to be careful when backtesting penny stocks. I've attached a backtest that buys Aeropostale stocks before they went to OTC. You can see the cashback."

Ouch, yes. So in these back tests you are guaranteed never to lose all your money on a position.

You can either look for statistics like what percentage of delisted stocks ended up at zero, or make your own, I suppose, by combing through delisted stocks systematically. I have delisted stocks from CSI data going back to 1985, and it includes bulletin board and Pink Sheet prices (I think).

If you hold the stocks on deposit with Robinhood, can you deal in them on the Pink Sheets or BB? If so, all may not be lost.

Nasdaq delistings

GNOW

Take GNOW for instance. It was delisted on 19th May 2016 and continues to trade OTC: on 1st August the listed price was c. 19 and over $7,000 worth traded. And yet Q has the last traded date as 3rd Jan 2014.

data = get_pricing('GNOW')
data.tail()
                           open_price  high    low     close_price  volume  price
2013-12-27 00:00:00+00:00  1.5200      1.5200  1.4322  1.4800       3410    1.4800
2013-12-30 00:00:00+00:00  1.4700      1.5499  1.4700  1.5400       11048   1.5400
2013-12-31 00:00:00+00:00  1.5801      1.6400  1.5801  1.6400       1084    1.6400
2014-01-02 00:00:00+00:00  1.6300      1.6600  1.6200  1.6499       2627    1.6499
2014-01-03 00:00:00+00:00  1.5500      1.6000  1.5400  1.6000       766     1.6000

So you certainly can NOT say this idea is crap and doesn't work. You just have to take pot luck and test it out.

Incidentally, some of these insane spikes in stock prices may well be caused by consolidations (reverse splits) which have not been correctly adjusted for.

I added the .985 (1.5% immediate drop get out) code back in just in case there is a strong downward trend when you're trying to get in.

There is no obvious change to this backtest in the graph. However, I think it is a potential lifesaver in a downward trending market.
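For clarity, the 0.985 get-out check amounts to something like the following; this is a minimal sketch, not the actual algo code:

```python
def immediate_drop_exit(entry_price, current_price, threshold=0.985):
    """Bail out if price drops below 98.5% of entry (a 1.5% immediate drop),
    per the '.985 get out' described above."""
    return current_price < entry_price * threshold

print(immediate_drop_exit(10.00, 9.84))  # True: 9.84 < 9.85
print(immediate_drop_exit(10.00, 9.90))  # False
```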

Here is a run where I have kept track of the delisting losses separately. I assume they are total losses, so this gives a worst-case result. I tried subtracting them from the portfolio "cash", but that did not do anything; I think the portfolio is read-only.
It is a severe problem and eats up almost all of the gains!! Perhaps a fundamental criterion could be added to the screen to exclude stocks likely to fail?
Note I reduced the number of "worst" stocks from 100 to 20 due to the lower starting cash.

Something else I noticed in the backtest above: close to the end (7/26/16), it is buying SAEX for $0.15/share.
Yahoo and Google show about $18.00/share.
Then on 7/27/16 it buys more at $18/share. Then the last sale price drops to $0.15 again. Clearly something is messed up in Q's ordering code.

Greg,

SAEX underwent a 1-for-135 reverse split on 7/27/2016.

Yahoo prices are split-adjusted as of today. Prices on Quantopian are adjusted as of the simulation date, so in a backtest you will see the price change on the day of the split event, just as you would have in real life. If the algo was holding SAEX when the split occurred, the number of held shares would have been adjusted accordingly.
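The mechanics of the adjustment can be illustrated in a few lines (the pre-split position below is hypothetical, sized to match the prices quoted for SAEX):

```python
def apply_reverse_split(shares, price, ratio):
    """Adjust a position for a 1-for-`ratio` reverse split: the share count
    is divided by the ratio and the price multiplied by it, so the
    position's value is unchanged (fractional shares are typically
    cashed out in reality)."""
    return shares / ratio, price * ratio

shares, price = 1350, 0.15  # hypothetical pre-split position
new_shares, new_price = apply_reverse_split(shares, price, 135)
print(new_shares, new_price)  # 10.0 20.25
print(round(shares * price, 2), round(new_shares * new_price, 2))  # 202.5 202.5
```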


I think it may be worth looking stock by stock to see if they traded after Q ceases to give prices. As per my example above. Either that or simply suck it and see.

Ok, not sure if this helps much with the delisted stocks problem.
I did however add some code to the rebalancing code to avoid buying and get out of soon to be delisted stocks.
I also moved the rebalance decision to a function scheduled at market open. That fixed some obvious problems caused by having it decide whether to rebalance every minute while it was sitting in the handle_data function.

Charles
Here is what I think needs to be done. Take a few different sample periods. Use Greg's code to list delisted stocks and their end price. One by one, investigate such stocks to see what price, if any, one would have obtained in reality. In some cases this will be easy, such as with GNOW. In other cases it will be more difficult, such as when the stock does not continue to trade OTC.

I'm not sure whether Q has OTC prices: in the case of GNOW apparently not.

I believe I do: I have CSI's database of de-listed stocks. I will put some work into it over the next few days.

Until that sort of investigation is done I don't think any serious conclusions can be drawn. The alternative is simply to observe what happens in reality going forward.

In other words these back tests are useless as they stand. We have no idea which de-listed stocks will produce a complete loss and which will give us some or all of our money back.

Here is one small example of why it really does not make a lot of sense to take back testing this strategy on Q much further:
WAVX
Last price on Q 9th Feb 2016. It was de-listed but continued to trade OTC. Closed on Q at 7 cents, closed the next day at 4 cents. Its ticker is now WAVXQ.

1st to 18th Feb 2016 the system I am running traded 8 de-listed stocks. Imagine checking up on each de-listed stock over a 10 year back test period.

Hmmmm.....

If Q could be persuaded to extend the history of de-listed stocks through to the OTC market that would be a whole different ballgame.

Ok, this version should be more predictable in live trading with Robinhood. Decisions are not made in the first 30 minutes or the last 30 minutes of the trading day. Also, I suspect that the warm-up behavior might otherwise cause unexpected behavior, so this way no decisions are made during the first several minutes of the trading day. Yesterday morning, my algorithm told RH to sell most of my stocks, and I was not expecting that to happen.

Charles
In practical terms, how are you dealing with delisted stocks? WAVX changed to WAVXQ when it was delisted. Q carries no information for WAVXQ. This situation will be repeated again and again over the coming months, and you will have to manage the many exits manually on delisted stocks, even assuming they are actually "re-listed" OTC.

I have not looked at your latest code, but I guess you are still acting on the assumption that delisted stocks can be exited at the last traded price.

if not ((stock.end_date - get_datetime()).days <= 3):  

I fear this is look-ahead bias which will simply not work going forward. As you will have seen from the Nasdaq link I posted, there is sometimes notice of delisting, and in those cases I guess you can manually exit any positions before that date.

In back testing however in a system like this which is so vastly dependent on soon to be de-listed stocks, I very much fear some assumptions need to be made as to what percentage of the last traded price will be achieved overall on delisted stocks.

I'm not pouring cold water on the idea and would dearly love to give it a go myself. Merely pondering how to make a more accurate assessment.

I agree with Anthony.

Adding in a filter that gets rid of any company at risk of failure would be a good move. Maybe only looking at companies with a current ratio above one and with positive earnings and price momentum over the last 12 months.

I tried filtering out anything with a bad financial grade and no stocks were traded at all.

How much, if any, would filtering out below $1.00 stocks help with the delisting risk? Or excluding the bottom, say, 10% of the volume traded universe?

In unrelated news, I scheduled all open orders to be canceled right before market close, to avoid broker-canceled orders at end of day filling up the logs.

I also tweaked the Take Profit code to improve live trade behavior.

Hi Charles, I just ran your most recent algorithm and saw a purchase of 170 shares of IAU at $11.63, a total of $1,977.10. Although this did make a profit 2 days later by selling for $2,035.75, as one who has been tweaking your algorithm and following this thread for a while, I don't think I have seen such a big purchase relative to capital size ($1,977 was 50% of the initial capital). Would it be more logical to put a more aggressive stop on such buys?

Also, I am currently building a small program to extract all sales from a backtest and check whether any of the stocks were delisted. I will post it when I finish, and hopefully it can help with the algo's problem of buying delisted stocks. (I have actually been running your algo live on Robinhood, and it purchased a delisted stock, $EPRSQ. Thank god it was a small purchase (112 shares totaling $15.68), but if I imagine it purchasing in the amount of 50% of my capital, I would be pretty devastated. Even so, I have high hopes for this algorithm! Great work so far!)

http://www.otcmarkets.com/stock/EPRSQ/chart

Still trading on the OTC market

price = USEquityPricing.close.latest
AtLeastOneDollar = (price >= 2.00)
BottomVar = 100
LowVar = 1
HighVar = 45

Only 1 delisted stock bought and sold

EPRSQ is not able to be traded on Robinhood currently; the message RH gives is: "This stock is not currently listed on an exchange". Although it is currently on the OTC market, I believe Charles set his algorithm's platform to Robinhood. Also, what are your thoughts on my backtest above buying a single stock with 50% of the capital?

Oh god re EPRSQ! Re gold I have cut it out entirely hence the problem won't arise.

A large part of the returns of this algo comes from stocks below $1, so filtering might reduce the number of delisted stocks, but that is more of a hack around the actual problem (and would generate smaller returns). Is there an auto-updated CSV of all delisted stocks or similar? You could write a function that checks the list daily against your portfolio and adjusts accordingly. If there is, I might take a look when I get a chance.

Non Marketable Limit Orders

A bug in zipline which further distorts and inflates these sort of back tests I fear.

Hey guys, I am live trading this, so a lot of the little changes I make are to improve my Robinhood live trading experience.

I added some decision window logic just in case the algo is started mid trading day (from a live trading point of view).
I added logic to ensure that Take Profit code only cancels orders that are not Take Profit orders.
I also fixed the initial pipeline weight to be the more expected .95 instead of .50.
Sorry about taking so long to fix that. It was a relic from a very early version of the algo.
All orders are canceled at decision window start and end - works even if stop / start algo multiple times per trading day.

Algo cannot start mid trading day

I don't think the following code is at all reasonable or rational:

set_slippage(slippage.FixedSlippage(spread=0.00))  

At the very least it ought to be replaced with some variety of:

set_slippage(slippage.VolumeShareSlippage(volume_limit=.01, price_impact=0.0))  

It is simply unrealistic to expect that you can trade more than x% of available volume. But you can still assume your limit order will protect you on price.
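What the volume_limit means, in plain terms: a fill in any one bar is capped at that fraction of the bar's traded volume. A minimal stand-in for the idea (not zipline's actual implementation):

```python
def max_fillable(order_shares, bar_volume, volume_limit=0.01):
    """Cap a single bar's fill at volume_limit (e.g. 1%) of the bar's
    traded volume, mimicking the intent of VolumeShareSlippage's
    volume_limit parameter."""
    cap = int(bar_volume * volume_limit)
    return min(order_shares, cap)

# Wanting 5,000 shares when only 10,000 traded this bar, at a 1% cap:
print(max_fillable(5000, 10_000))  # 100 -- the remainder stays open
print(max_fillable(50, 10_000))    # 50 -- small orders fill in full
```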

I can't stand repeating things; instead of the schedule list:

    for minutez in xrange(30, 330, 30):  
        schedule_function(DecisionsStart, date_rules.every_day(), time_rules.market_open(minutes=minutez))

I got this bizarre result the other day. It bought 2 shares of RIVR on 7/19/2012 at about $1,000 per share. The price should have been $20/share.
Apparently it was $20 when it calculated the number of shares to buy, and again immediately after. The result is a huge deficit in the account balance. Things like this make it difficult for me to trust Q enough to put any money into it.

You can greatly reduce the number of delisted stocks by requiring that companies have positive earnings. All the delisted stocks I looked at went to the OTC market. Even if they have value, I imagine it is a pain to get your money.

Is there a way to alter this algorithm so that it will not violate regulatory rules on day trading? I have less than $25,000 in my account.

Nick,
I live trade this on Robin Hood and it does reasonably well at avoiding day trades. Robin Hood does have day trade protection, which I use, but I have code which avoids most of the day trades.

Thanks everybody for comments.

Thanks Anthony, I implemented some of your suggestions / changes.

Thanks Peter, I implemented your loop for the scheduled function.

Posting this, because it has enough changes in it to be significant. Also, curiously, it appears to wait until the market is at least sideways before getting in.

By the way, it appears that my old method of trying to exit near cost (even if the sell orders are slow to fill) ends up more profitable. The drawdowns are much bigger though.

The method I am using now attempts to keep the drawdowns small. But, so far, the total returns seem to be less.

I changed the rebalance signal to make it more predictable.

Posting a 2-month test of the version posted earlier that returned 1 million percent in the 10-year backtest.
Ultimate reality check in progress.
I went ahead and turned this version of the algorithm on for my live trading today.

Status: Live Trading
Type: Robinhood real money
Started Trading: market open Wednesday, 8/10/2016
Capital base: $1,650.61

As of market close Friday, 8/12/2016
-0.41% Returns
($6.69) Dollar P/L -24.89 Sharpe
$469.84 Long Exposure $1,174.08 Available Cash

I'll keep you posted with my live trading results. I am planning on leaving this alone for awhile to see how it does.

Do you have a weekend update on the week's performance?

Status: Live Trading
Type: Robinhood real money
Started Trading: market open Wednesday, 8/10/2016
Capital base: $1,650.61

As of market close Friday, 8/12/2016
-0.41% Returns
($6.69) Dollar P/L -24.89 Sharpe $469.84 Long Exposure $1,174.08 Available Cash

As of market close Wednesday, 8/17/2016
0.45% Returns
$7.39 Dollar P/L 3.04 Sharpe $709.56 Long Exposure $948.44 Available Cash

Thanks for the update. Even if it was uneventful, it didn't blow up, so good work.

how are you dealing with delisted stocks/splits?

My understanding is that in backtesting Quantopian just liquidates the delisted securities, but they stay open in live trading.

Hey Tyler, I just have a scrub once a month that sells anything whose 30-day average price is below $1.10, then another loop to sell stocks that have entered the OTC markets.

Something like this in a function scheduled to run once a month (end of month):

def Cleanse(context, data):
    # Loop over current holdings; the original snippet left `stock` implicit.
    for stock in context.portfolio.positions:
        curr_price = float(data.current(stock, 'price'))
        price_history = data.history(stock, fields='price',
                                     bar_count=30, frequency='1d')
        mean_history = float(price_history.mean())
        if data.can_trade(stock):
            if stock.symbol not in context.day_buy_list:
                if mean_history < context.average_price_threshold:
                    # order_target(..., 0) closes the position;
                    # order(stock, 0) would order zero shares, a no-op.
                    order_target(stock, 0,
                                 style=LimitOrder(curr_price * 1.00))
                    context.day_buy_list.add(stock.symbol)
                    log.info('\nSELLING all shares of "%s" due to delisting threat.'
                             % stock.symbol)
                # The exchange check uses the asset's own exchange field;
                # morningstar data can't be queried inline outside a pipeline.
                elif stock.exchange.startswith('OTC'):
                    order_target(stock, 0,
                                 style=LimitOrder(curr_price * 1.00))
                    context.day_buy_list.add(stock.symbol)
                    log.info('\nSELLING all shares of "%s" because it was delisted.'
                             % stock.symbol)

Hope this helps or brings ideas.

-lyell

I really like the cleanup code on this. I think it helps paint a fair picture on those that do get delisted or are about to!

I think this is fantastic. What would be the best way to eliminate the buy-AAPL trigger from this code?

What was the significance of buying AAPL in the other algo?

Hey Tyler,
I had the idea to influence the actions of the algo without stopping it and changing the code.
The signal stock could be anything you choose before you launch the algo.
So I tested it out by using a signal position (Apple), meaning I always had to have at least one share of Apple in my positions. Now, say I want to stay in cash after a large profit, or to wait for funds to settle, or the market turns down and I want out: all I have to do is sell Apple from my RH app, and bam, there's a cascade of sells to sit in cash. But my algo is still churning and ready for some buys if I pick up one share of my signal position again.

Just an experiment to influence my Algo after launch. Any amount of actions can be put into loops like this. Multiple signal positions to move about the market differently through different strategies.

Thanks.

Gottcha.

So do you first have to buy 1 share of AAPL, or does the algo do that automatically the first time?

Then once you sell AAPL, do you have to buy it again manually?

Yup,
it's used like a button, so I could launch my algo with no activity, then next month buy one share of the signal position to fire things up for real.
The buying and selling of the signal is all manual. But for backtesting I needed to add a "buy" loop for my signal position to get it running.

So yes you would need to grab a position either before or after you launch the Algo. Be sure you have the rest of the loops always ignore that specific position too.
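Lyell's signal-position "button" boils down to a gate like the following; the ticker and the position map are illustrative:

```python
SIGNAL = "AAPL"  # the signal stock; could be anything chosen at launch

def buying_enabled(positions):
    """The gate described above: the algo only places new buys while at
    least one share of the signal position is held. Selling the signal
    from the phone app switches buying off without touching the code."""
    return positions.get(SIGNAL, 0) >= 1

print(buying_enabled({"AAPL": 1, "XYZ": 200}))  # True
print(buying_enabled({"XYZ": 200}))             # False -- stay in cash
```

In the live algo, `positions` would come from `context.portfolio.positions`, and the buy loops would simply skip when the gate returns False (and always ignore the signal position itself, as noted above).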

Inherently there will still be discrepancies between backtesting and live trading due to Quantopian liquidating on the day the stock gets delisted, right?

Lyell, I'm running the code you posted in live paper trading and there are really weird -2.5% dips - have you encountered this before?

Hey Tyler,
I do not understand the question. In what aspect are the dips taking place? In your overall portfolio percentage?

I had one dip like that today. It's most likely a low-average-volume stock doing a dip in price, where the price is constant for a bit and then there is a sudden price change to let a market maker sell or buy some shares. This is the behavior of the waters we tread with this algo.
If you increase the "average daily volume" variable to 20K+ and the "universe lookback" to 100 days, you might have better luck avoiding these.
Raise the minimum price to $2.50+ as well.

Yeah, I wish I could attach a screenshot, but the total portfolio randomly spiked down to -2.5% twice, then jumped right back up.

But none of the positions had this spike

Also, what are your views on the backtesting still not being 100% accurate?

Ya, I understand the inaccuracies of this unattractive dataset for backtesting. The theory, though, is that in this realm of volume, roughly 1 in 10 companies may have a large spike for you to catch. And for the underperforming companies, you are only in them for a hundred bucks or so, and maybe they hold long enough to do the 3% exit instead of 25%.
Also, in the 2008-2009 comeback and the exponential growth of the backtest, we might be picking up companies that have usually been strong but, due to panic selling and waiting, meet our threshold of inactivity and low volume, only for the markets to rally and interest to come back to them.
That's why this is a bottom feeder.
Classic hot-hand fallacy when we profit from a $10K-to-$30-mil ten-year backtest.
Test live with low capital until you see the pattern of the backtest coming through each week. Not much more than faith with this one.
Good luck.
Good luck.

Lyell, are you trading your code live? If so, would you be willing to provide how it has performed?

I am trading it live. There isn't much price action so far, but it's not negative after 2 days. Not enough data yet to tell whether it's working correctly.
There has been a steady amount of buys, and on the second day it was able to sell for a small profit in the underperform loop.
I'll update next week with a better analysis.

It's alive!
It's cool to see the interest in this algo.
I think Lyell's "bottom feeder" theory about this algo a couple of posts back makes sense.

Lyell,
It seems plausible to influence the algo's behavior like you do by manually making / canceling a limit order for some stock that is logically unlikely ever to execute. That would mean actually having a position in it would not be necessary; instead, you would check for the existence of manually created limit orders that serve only to influence the behavior of the algo. Sorry, no time to code the stated idea right now.

Ok, as a slight update. I just now compared a backtest using same dates and starting cash as my live trading, and a light analysis shows similar results and behavior, so far so good.

As an exercise in fun and "I wish I did that", if I had immediately started live trading this version of the algo back on 07/29/2016, I might have been at 5.8% profit by now. Of course, the whole delisting problem might have erased some / much of that.

Charles, are you still trading with the algo posted 1 week ago? Any moves to OTC / delistings / splits?

Will this algo sell stocks that I already own if I had it go live?

Yes. You should not be mixing your manual trades with an algo in the same account. Create a separate account for each algo/system.

Any updates?

I have found there are large differences between backtest and live trading due to some stocks not being available on Robinhood.

This algorithm seems very promising, and I think the idea behind it is good. However, it seems to have two major issues:

i) transaction costs not being taken into account in the backtest. Whilst Robinhood has no commission, I think placing limit orders will still affect prices.

ii) more importantly, there is the issue with delisted stocks and how they are handled very poorly by the API. I am certain you will not get the returns given on the backtest as is.

To address the first issue, I think slippage should take some non-zero value. As for the second, there is a real risk in live trading that a stock will be delisted and become worthless. Would it be possible to come up with a metric that predicts when a stock is likely to become delisted, and then avoid it? This would certainly improve performance in live trading.
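For reference, a non-zero slippage assumption can be modeled as a fixed spread, similar in spirit to Quantopian's `slippage.FixedSlippage`: buys fill half the spread above the last price, sells half below. A minimal sketch (the $0.02 spread is an assumed illustrative value, not a recommendation):

```python
# Sketch of a fixed-spread slippage model: buys fill half the spread
# above the last price, sells fill half below. The 0.02 spread here is
# an assumption for illustration only.
def fill_price(last_price, shares, spread=0.02):
    """Return the assumed fill price for a buy (shares > 0) or sell."""
    half = spread / 2.0
    return last_price + half if shares > 0 else last_price - half

# On an illiquid $1.00 stock, a buy fills about a cent above the tape
# and a sell about a cent below, a ~2% round-trip cost at this price.
print(fill_price(1.00, 100))   # buy side
print(fill_price(1.00, -100))  # sell side
```

For penny stocks, even a fixed two-cent spread is a large fraction of the price, which is one reason backtests with zero slippage look inflated here.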

Live Trading Robinhood real money
Started Trading: market open Wednesday, 8/10/2016
Capital base: $1,650.61

As of market close Tuesday, 8/30/2016
-2.22% Returns / +1.00% Backtest / -0.10% Benchmark
($36.56) Dollar P/L -5.71 Sharpe
$522.02 Long Exposure $1,092.03 Available Cash

"Any moved to OTC/delisted/splits?"
no delistings
However, here are some previous delistings from live trading a different version of the same algo.
There are two:
BINDQ Total Return +$5.16 [+5.68%] Average Cost $0.9273 Market Price $0.98 OTC I think / yes RH has a SELL button, have not tried it
KEGX Total Return -$6.26 [-55.94%] Average Cost $0.16 Market Price $0.0705 OTC I think / yes RH has a SELL button, have not tried it

Feel free to make any tweaks you wish to the algo. I decided to trade the version that got the biggest returns in the backtest, pimples and all. Not expecting it to match the backtest, but hoping it turns out to be profitable. Not giving it any more money, until it gets some track record and shows persistent similar patterns like the backtests. At some point, I will make some tweaks to the code, to get rid of the pimples, but for now, just letting it do whatever it does.

Very interesting, please keep posting updates. Is there a reason why you haven't tried to sell your delisted stocks yet?

"Is there a reason why you haven't tried to sell your delisted stocks yet?" Just curious if they will eventually list again.

I've been running your algo and I have a problem where it does not purchase on certain days until I manually start selling some stocks. Is there a maximum number of stocks you can hold on Robinhood? Also, I have $4k invested and am running into the problem of the algorithm not using all of it. It only uses around $2k (half). And finally, no stocks are bought using more than $40, and the average stock bought is ~$20, which is 0.5% of my capital.

These are my observed behaviors. With this I have made 0.2% gains so far.

I have had issues with Robinhood canceling orders that the algorithm places. I don't see why; I have not hit their day trading limit.
It may be because my order is the first one of the day. If I trade the same stock from my phone, it goes through.

The other problem is that it is difficult to determine the price when there are only a few trades per day.

Has anyone done a side-by-side comparison of Robinhood and IB?

"Is there a max limit of amount of stocks you can have on robinhood?" I have not heard of or noticed any.

Keep in mind, there are quite a few versions of my algo on this thread. I am referring to the "million % in ten years" version, warts and all.

"algo only using about half of capital" Multiple factors cause this:
The weight assigned to gold (or the "Other" bucket) almost never holds any positions, so it is almost always cash, meaning less than 100% leverage.
The limit buys only get in at a "6% discount", otherwise no purchase is made. This will always cause less than 100% leverage.
The cash variable in my_rebalance is adjusted for every placed order, not executed orders. This will always cause less than 100% leverage.
Sometimes RH will not trade a particular stock, so the weight assigned to it is "wasted".
If you have "day trade protection" set to on with RH like I do, then some trades will be rejected, thus wasted weight.
Other factors exist as well, such as slippage, partially executed orders, etc.
Some of these things are "pimples" and some may play a role in contributing to the profitability of the algo.
Time will tell.
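The limit-discount effect alone goes a long way toward explaining the ~50% deployment. A toy illustration (all numbers assumed, not taken from the algo): if each stock gets an equal weight but only the limits that reach the 6% discount actually fill, the deployed fraction of capital is just the sum of the filled weights.

```python
# Toy illustration (assumed numbers) of why buying only at a limit
# discount leaves cash idle: weight is reserved for every placed order,
# but only orders whose limit price is reached actually deploy cash.
def deployed_fraction(weights, filled):
    """weights: target weight per stock; filled: whether its limit hit."""
    return sum(w for w, f in zip(weights, filled) if f)

# Ten stocks at 10% weight each, but only 5 limits hit the 6% discount:
weights = [0.10] * 10
filled = [True] * 5 + [False] * 5
print(deployed_fraction(weights, filled))  # roughly half of capital deployed
```

Stack the other factors on top (rejected day trades, untradeable symbols, cash debited on placement rather than execution) and staying near half-invested is unsurprising.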

anything new here?

The Quantopian community is really amazing and I like the collaborative culture here much more than the secretive practices of Wall Street. Just to help the community avoid losing money in extreme cases, I'm putting my two cents here.

  1. The source of profitability for the Gold King. It's risky to take these trades on illiquid stocks because when the market crashes, few people will be willing to buy, and selling will be very hard. The abnormal profits are mostly rewards for taking on liquidity risk.

  2. Caveat. Switch to manual mode and use market orders to clear positions in market crashes. Even with this in mind, I'm still worried that in those cases institutional investors' orders will get filled first, and who knows what price we can get from Robinhood.

I tried going through the latest updated code for this algo last night after shift, and got stumped trying to eliminate the gold hedge. What would it look like if this algo just went all in on the pipeline generated stocks? And is there a way to apply some of the morningstar fundamentals to its stock picks in order to increase the quality of some of its trade selections?

Increasing the minimum buy price to $2.00, or requiring that the companies have positive earnings, reduces the number of defaults, but also reduces the profitability according to the backtest. I let it run live for about a week. It had no trouble buying stocks (I only bought ones above $1.50), but not many were being sold. The spreads are large, and some of the stocks it bought don't trade at all on a given day. The stocks I ended up with are now about 3% under water. I put limit orders on them equal to what I paid and will see if I can get my money back.

Live Trading Robinhood real money
Started Trading: market open Wednesday, 8/10/2016
Capital base: $1,650.61

As of market close Friday, 9/9/2016
-4.07% Returns / +0.60% Backtest / -2.20% Benchmark
($67.17) Dollar P/L -7.92 Sharpe $814.02 Long Exposure $769.42 Available Cash

Not really commenting on this stuff. Just reporting for transparency and because I know some of you want to know what the results will be for live trading.
I know some of you others have been reporting your live trading results too, which also helps the rest of us.

Hey Charles,

First, I seriously admire your consistency regarding your fantastic algo.

Is there any indication of results with Interactive Brokers for any of the versions of your algo? Maybe you can presume whether it'll work efficiently on a commission account like IB?

Thanks
-Naor

I am a fan of IB. I just don't have enough money yet to have an account. But I consider IB to be possibly the best.
So, I think you would get different results with IB.
No day trading restrictions because you have enough money in your account.
I think IB would likely get superior order execution.
There are some commissions however, but quite small from what I understand.

If any of you all have IB and are live trading any version of this algo, please let us know your experiences.

I'd be really glad to run your algo on my IB account, but pattern day trading isn't allowed for me.
I can be leveraged X2.

But I don't really know which of the versions of your algo I should take... It would be great to hear what you think.

For the most part, my algo avoids day trades. There are a very small number that still happen. RH has day trade protection, which to me seems a bit overly aggressive in its protection. But I use it just to make sure I don't get flagged as pattern day trader. Does IB have day trade protection?

If whatever algo you are trading with now works for you, then I would suggest continue with it. My risk tolerance is quite high and I have been known to have nerves of steel. So you all might be better off letting me take the risk.

We just had a bad 30 days start for my algo. Today it did a "Stop Wait" monthly function, which among other things, resets the MaxPV which allows for a "fresh start". So, the algo will now behave as if it is starting over at zero percent profit / loss. It does this I think on the 16th of every month.
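My reading of that reset, sketched as a portfolio high-water-mark tracker (the names and the day-16 trigger are taken from the post's description; this is an assumption, not the algo's actual code):

```python
# Hedged sketch of a "MaxPV" high-water-mark with a monthly "Stop Wait"
# reset, as described in the post. Class and method names are assumed.
class HighWaterMark:
    def __init__(self):
        self.max_pv = None

    def update(self, portfolio_value):
        # Track the highest portfolio value seen since the last reset.
        if self.max_pv is None or portfolio_value > self.max_pv:
            self.max_pv = portfolio_value
        return self.max_pv

    def drawdown(self, portfolio_value):
        # Fractional drop from the high-water mark.
        return 1.0 - portfolio_value / self.max_pv

    def reset(self, portfolio_value):
        # "Stop Wait" monthly reset: forget past highs, fresh start.
        self.max_pv = portfolio_value

hwm = HighWaterMark()
hwm.update(1000.0)
hwm.update(1200.0)           # new high
dd = hwm.drawdown(900.0)     # 25% below the high
hwm.reset(900.0)             # after reset, drawdown is measured from here
```

The effect is that any drawdown-based logic keyed off MaxPV (stops, pauses) treats the month as starting at zero percent profit/loss, just as described.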

Thanks Charles, Guess I'll just change the cost in the original algo...

IB definitely has protection against pattern day trading... I can't execute more than 3 trades in a day.
I have a risk tolerance that's as high as yours, I guess :)

I am very new to Quantopian, so I haven't gotten into the API that much until now...
But your algo seems so promising...

Good to hear that IB also has day trade protection.

The version of the algo that I am trading is the "million percent in 10 years" version.

Feel free to modify the code to fit your situation better.

It needs some warts removed, but I am waiting for more live trade experience to prove overall profitability or NOT.

I guess you don't care about commissions since you're a Robinhood trader.

But when I select a $3 commission per trade, backtesting shows it isn't profitable anymore. Any solution?
And well, I thought you'd gone for the latest version, which settles a few problems. Why not, actually? Just because of the higher returns?

IB's commission is $1 + 0.005 per share.

Here is the code to emulate:

    set_commission(commission.PerShare(cost=0.005, min_trade_cost=1))  
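To sanity-check what that setting implies: IB's per-share commission with a $1 minimum works out to max($1, $0.005 × shares), so small orders pay the minimum, which is a proportionally large cost on the tiny positions this algo takes. A quick standalone check:

```python
# IB-style per-share commission: $0.005/share with a $1 minimum per
# trade, mirroring the PerShare(cost=0.005, min_trade_cost=1) setting.
def ib_commission(shares, cost=0.005, min_trade_cost=1.0):
    return max(min_trade_cost, cost * abs(shares))

# A 100-share order pays the $1 minimum; 1,000 shares pays $5.
print(ib_commission(100))   # 1.0
print(ib_commission(1000))  # 5.0
```

On a ~$20 position, that $1 minimum is already a 5% hurdle per side, which is consistent with the backtest turning unprofitable once commissions are modeled.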

Thanks a lot Mohammad!
Any other configurations you see as needed in order to implement real money live trading at IB?

@Naor Porat: Go see the new thread "The King Forgot His Shorts".

This thread is getting a bit long in the tooth.
The Gold King And His Knights continues in the thread Robin Hood Throws A Party.
It is a slight modification of The King Forgot His Shorts.

I am now live trading with Robin Hood Throws A Party instead of the "million percent in 10 years" version.
The final results for that version which I started live trading August 10th is:
Live Trading Robinhood real money "The Gold King And His Knights - million percent in 10 years version"
Started Trading: market open Wednesday, 8/10/2016
Capital base: $1,650.61

As of market close Friday, 9/16/2016
-4.40% Returns / -0.20% Backtest / -2.20% Benchmark
($72.64) Dollar P/L -7.28 Sharpe $627.32 Long Exposure $950.65 Available Cash

One year ago I put Andrew Martin's version of The Gold King And His Knights into live paper trading.
Here is its out of sample paper trading performance:

Live Algorithm Settings
Status: Live Trading
Type: Quantopian paper trading
Started Trading: 7/29/2016, 9:03:53 AM
Capital base: $10,000.00

101.43% Returns / $10,142.57 Dollar P/L / 4.89 Sharpe
$8,724.99 Long Exposure / $0.00 Short Exposure / $11,417.58 Available Cash

@Vladimir, thanks for posting those results! I came across this thread the other day looking for gold trading strategies on Q and ran a forward-looking backtest and was impressed. Were you running the SPY-instead-of-IAU version or the HERO-less version? @Charles, you could have doubled your money in the past year! Great job with this one.

A lot of people have been talking about how inaccurate/inflated Q's backtest results are when it comes to low liquidity stocks. Running a backtest over the same time period:
RETURNS 92.7%
SHARPE 4.72
It appears paper trading matches the backtest, and if anything the backtest is more conservative. Can anybody speak to how paper-traded results compare to using an actual brokerage? Anybody have data from the same algo running in both paper trading and through a brokerage? How does paper trading handle volume, price direction, and bid/ask spread compared to the backtester? Or is it basically the same thing?

Quantopian paper trading uses the same order fill assumptions that backtesting does. This means the paper trade results won't match live trading for low-liquidity stocks. The reason is that Quantopian is not a brokerage and has no way to know which orders will or won't get filled.
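Those fill assumptions are volume-based: zipline's default `VolumeShareSlippage` caps fills at a fraction of each bar's traded volume (2.5% by default). A rough sketch of that cap, which shows why even the backtester's optimistic model can only partially fill orders in illiquid names (and a real brokerage may fill even less):

```python
# Sketch of a volume-limited fill cap like zipline's VolumeShareSlippage:
# at most volume_limit (default 2.5%) of a bar's traded volume can fill
# in that bar. Real brokerage fills in illiquid stocks can be far worse.
def max_fill(order_shares, bar_volume, volume_limit=0.025):
    return min(order_shares, int(bar_volume * volume_limit))

# Want 500 shares, but the bar traded only 4,000 shares:
print(max_fill(500, 4000))  # capped at 100 shares this bar
```

So an order in a stock trading a few thousand shares a day can take many bars to fill in simulation, and in live trading may simply never fill at the limit price at all.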

I have run a few of these algos live on IB, and the fills are totally different from the backtest or paper trade. I got about 50% of the fills live that I got in paper/backtest. When I found that, I stopped the algo and discarded penny-stock/illiquid-stock algos. My opinion after burning a few thou $$$: if it's not in the Q500/Q1500, it's probably not worth it, unless using ETFs, which are not always in the Q1500 but do have sufficient liquidity and low bid/ask spreads.

Chew/Peter/Blue KangarooFish

Do NOT get too excited. I am back live trading Robin Hood VIX Mix Extreme Vetting. It is still a work in progress. However, it does have a more reasonable feel to it.

The micro cap low volume stocks are gone, finito, fired, no longer a part of my family. The ExtremeVetting function uses Q500 worst performers.

There is a significant drop, apparently, in Max DD. This is much improved over the previous version.