Template: Limit Leverage in Your Algos

After finally digging into Quantopian, I found that many of the algorithms were so profitable thanks to effectively infinite borrowing. After dealing with this in my own code in various ways, I decided to create a standardized template that everyone is welcome to use. Besides limiting your percent invested, an added benefit is that I initialize a pandas DataFrame that stores the closing prices, so you can call the pandas pd.stats.moments functions in your algo instead of coding something up in the batch_transform. It's really just a few functions that hold everything together:

def update_portvals(context):
    """Updates account information on demand, as opposed to at the beginning of the frame."""
    context.total_equity = context.cash + context.portvalue
    context.pct_invested = context.portvalue / context.total_equity
    context.pct_cash = context.cash / context.total_equity

def generate_order(sym, size, context):
    # circuit breaker 1: enough cash for the transaction?
    if context.cash < size * context.df_prices[sym].ix[-1]:
        return
    # deduct the cost from cash and add it to position value
    context.cash -= size * context.df_prices[sym].ix[-1]
    context.portvalue += size * context.df_prices[sym].ix[-1]
    update_portvals(context)
    # circuit breaker 2: would this transaction push percent invested
    # above .95, before slippage and commissions?
    if context.pct_invested > .95:
        # breaker tripped: reset values to before the order was attempted
        context.cash += size * context.df_prices[sym].ix[-1]
        context.portvalue -= size * context.df_prices[sym].ix[-1]
        update_portvals(context)
        return
    order(sym, size)
    log.info(str(size) + ' ' + str(sym.symbol) + ' at ' + str(context.df_prices[sym].ix[-1]))

def liquidate_position(sym, context):
    """Sells off the entire position."""
    # portvals are deliberately not updated here: the sale will not fill
    # until the next bar, so updating now would let the algo overdraft
    # this bar on cash it only receives next bar
    if context.portfolio.positions[sym.sid].amount != 0:  # note: 'is not 0' compares identity, not value
        to_liquidate = -context.portfolio.positions[sym.sid].amount
        order(sym, to_liquidate)
        log.info(str(to_liquidate) + ' ' + str(sym.symbol) + ' at ' + str(context.df_prices[sym].ix[-1]))

@batch_transform(window_length=1, refresh_period=1)
def update_data(datapanel):
    """Gets the most recent bar of prices."""
    if len(datapanel.price.index) == 0:
        return
    return datapanel.price
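To see the two circuit breakers in isolation, here is a minimal stand-alone sketch. The `Context` class, the `try_order` name, and the default threshold are hypothetical stand-ins, not the Quantopian API; a real order would also be subject to slippage and commissions.

```python
# Hypothetical, self-contained model of the template's circuit-breaker logic.
class Context:
    def __init__(self, cash):
        self.cash = cash
        self.portvalue = 0.0
        self.update()

    def update(self):
        # mirrors update_portvals(): recompute equity and percent invested
        self.total_equity = self.cash + self.portvalue
        self.pct_invested = self.portvalue / self.total_equity

def try_order(context, price, size, max_pct=0.95):
    cost = size * price
    # breaker 1: enough cash for the transaction?
    if context.cash < cost:
        return False
    # tentatively book the trade, then check breaker 2
    context.cash -= cost
    context.portvalue += cost
    context.update()
    if context.pct_invested > max_pct:
        # roll back: the order would exceed the leverage cap
        context.cash += cost
        context.portvalue -= cost
        context.update()
        return False
    return True  # in the real template, order(sym, size) fires here

ctx = Context(cash=100_000.0)
print(try_order(ctx, price=50.0, size=1_000))  # True: only 50% invested
print(try_order(ctx, price=50.0, size=1_000))  # False: would hit 100% invested
print(round(ctx.pct_invested, 2))              # 0.5: the rejected order was rolled back
```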

Look at the backtest source to see how to implement it for yourself. I imagine many of the more sophisticated users out there have already dealt with this in their own way. Nevertheless, hopefully this can provide a stepping stone to get new pythonistas developing their own strategies.

23 responses

NOTE: I forgot to mention that you can edit the maximum percentage to invest in the generate_order function; just be careful, as at 100% you are likely to overdraft once transaction costs are considered.

def generate_order(sym, size, context):
    # circuit breaker 1: enough cash for the transaction?
    if context.cash < size * context.df_prices[sym].ix[-1]:
        return
    # deduct the cost from cash and add it to position value
    context.cash -= size * context.df_prices[sym].ix[-1]
    context.portvalue += size * context.df_prices[sym].ix[-1]
    update_portvals(context)
    # circuit breaker 2: would this transaction push percent invested
    # above .95, before slippage and commissions?
    if context.pct_invested > .95:
        # breaker tripped: reset values to before the order was attempted
        context.cash += size * context.df_prices[sym].ix[-1]
        context.portvalue -= size * context.df_prices[sym].ix[-1]
        update_portvals(context)
        return

@Brandon - awesome contribution. I'm working on a similar set of functions but my concern is centered more on addressing margin requirements for short positions. So, for example, if you open a $1,000,000 long position and a $1,000,000 short position the portvalue should be unchanged and your cash value should increase by $1,000,000. So, by opening two new positions your percent invested has decreased. In reality, one would need to keep a certain amount of cash from the short sale (e.g. the cash generated plus a 20% cushion).
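Daniel's arithmetic can be checked directly; all dollar figures below are illustrative, and the point is that a naive percent-invested measure ignores gross exposure:

```python
# Illustrative numbers: a $1M long plus a $1M short leaves net position
# value unchanged while short-sale proceeds raise the cash balance, so a
# naive pct_invested falls to zero even though gross exposure is $2M.
cash, portvalue = 2_000_000.0, 0.0

# open a $1,000,000 long: cash down, position value up
cash -= 1_000_000
portvalue += 1_000_000

# open a $1,000,000 short: proceeds raise cash, net position value falls
cash += 1_000_000
portvalue -= 1_000_000

pct_invested = portvalue / (cash + portvalue)
print(cash, portvalue, pct_invested)  # 2000000.0 0.0 0.0
```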

Daniel,

I should have also noted that this was in fact long only. I haven't gone about tackling the L/S methodology just yet, simply because the strategy I am working on doesn't rely on it. I am sure it wouldn't be to hard to extend it by adding a few margin variables. Off the top of my head I would add something like this to generate_order.

context.init_margin += short_proceeds * 1.50   # initial margin requirement
context.maint_margin += short_proceeds * 1.25  # minimum cash before a margin call
if context.cash > context.init_margin:
    context.cash -= context.init_margin
    update_portvals(context)
    # ...then accept the trade

Then if cash becomes less than maint_margin you would need to liquidate to meet those margin calls. maint_margin needs to be included in update_portvals so that after cash is reconciled each day you can also update your maintenance margin requirement and subtract it from cash.
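A runnable sketch of that bookkeeping, with two differences flagged: the 1.50/1.25 multipliers are treated as placeholder requirements (not a vetted margin model), and, per Daniel's point above, the short-sale proceeds are explicitly credited to cash before the margin lock-up is deducted. The `open_short` and `margin_call` names and the `state` dict are hypothetical:

```python
# Hypothetical margin bookkeeping; multipliers and names are placeholders.
INIT_MULT, MAINT_MULT = 1.50, 1.25

def open_short(state, proceeds):
    """Accept a short only if cash plus proceeds can cover the initial margin."""
    init_req = proceeds * INIT_MULT       # initial requirement: 150% of short value
    if state["cash"] + proceeds < init_req:
        return False                      # reject: not enough free cash
    state["cash"] += proceeds             # short-sale proceeds arrive as cash...
    state["cash"] -= init_req             # ...but the initial margin is set aside
    state["maint_margin"] += proceeds * MAINT_MULT
    return True

def margin_call(state):
    """True when free cash has fallen below the maintenance requirement."""
    return state["cash"] < state["maint_margin"]

state = {"cash": 200_000.0, "maint_margin": 0.0}
print(open_short(state, proceeds=100_000.0))  # True: 150k set aside, 150k cash left
print(margin_call(state))                     # False: 150k cash >= 125k maintenance
state["cash"] -= 40_000.0                     # hypothetical mark-to-market loss
print(margin_call(state))                     # True: 110k cash < 125k maintenance
```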

This is roughly how I would do it. I might be overlooking something, but hopefully it leads you in the right direction. If you get to it before me, please share with us. I probably won't be able to look at it until I get some other code into production, but this is definitely something the Quantopian community desperately needs.

Thanks Brandon, these types of frameworks are really useful. Due to this platform's dynamic nature, having things like your set of functions that can easily be implemented is important. I'll definitely find use for this.

Disclaimer

The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by Quantopian. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. No information contained herein should be regarded as a suggestion to engage in or refrain from any investment-related course of action as none of Quantopian nor any of its affiliates is undertaking to provide investment advice, act as an adviser to any plan or entity subject to the Employee Retirement Income Security Act of 1974, as amended, individual retirement account or individual retirement annuity, or give advice in a fiduciary capacity with respect to the materials presented herein. If you are an individual retirement or other investor, contact your financial advisor or other fiduciary unrelated to Quantopian about whether any given investment idea, strategy, product or service described herein may be appropriate for your circumstances. All investments involve risk, including loss of principal. Quantopian makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances.

Gus,

I am glad you find them useful. Quantopian has open-sourced their code for all of us, so I see no reason why simple infrastructure code like this shouldn't be shared with the community.

@Brandon - I'm a little confused about the use of context.df_prices. I'm decent at python but new to the pandas module. So...

In the initialize function you call

context.df_prices = pandas.DataFrame()  

which essentially initializes context.df_prices as an empty DataFrame. Then in handle_data you call

context.df_prices = pandas.concat([context.df_prices, update_data(data)])

so you concatenate context.df_prices with the frame returned by update_data (which I am assuming is built on Quantopian's data functions and returns a row of price data for... all symbols in the universe?). So the concatenation stacks all of these rows together? Here is where I get a little lost. And then in generate_order you call

if context.cash < size * context.df_prices[sym].ix[-1]:

Is ix a reserved attribute of the DataFrame for indexing along an axis (the rows?), with the -1 position grabbing the last entry?

I suppose I would just as soon use the data[sym].price feature to get the price. I'm not sure why the dataframe is required.

EDIT: and I see now context.cash and context.portfolio.cash / context.portvalue and context.portfolio.positions_value are created so that updates can be made on demand as opposed to once per frame.

Daniel,

you could certainly replace this line

if context.cash < size * context.df_prices[sym].ix[-1]:  

with data[sym].price

I used the dataframe solely out of comfort with pandas. ix is slicing along the index, so it is just getting the last price for that symbol.
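For anyone reading this today: `.ix` has since been removed from pandas, and the equivalent positional last-row lookup is `.iloc[-1]`. The frame below is made-up data standing in for context.df_prices:

```python
import pandas as pd

# Made-up price frame standing in for context.df_prices
df_prices = pd.DataFrame(
    {"AAPL": [500.0, 502.5, 499.0], "SPY": [145.0, 146.2, 145.8]},
    index=pd.to_datetime(["2013-01-04", "2013-01-07", "2013-01-08"]),
)

# .ix[-1] positionally grabbed the last row; .iloc[-1] is today's spelling
last_spy = df_prices["SPY"].iloc[-1]
print(last_spy)  # 145.8
```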

context.df_prices = pandas.DataFrame()  
context.df_prices = pandas.concat([context.df_prices,update_data(data)])  

These lines here are just appending the prices for the entire universe as they become available. Here's the output so you can see what's happening:
2013-01-03 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 1 entries, 2013-01-04 00:00:00+00:00 to 2013-01-04 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-06 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 2 entries, 2013-01-04 00:00:00+00:00 to 2013-01-07 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-07 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 3 entries, 2013-01-04 00:00:00+00:00 to 2013-01-08 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-08 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 4 entries, 2013-01-04 00:00:00+00:00 to 2013-01-09 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-09 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 5 entries, 2013-01-04 00:00:00+00:00 to 2013-01-10 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-10 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 6 entries, 2013-01-04 00:00:00+00:00 to 2013-01-11 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-13 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 7 entries, 2013-01-04 00:00:00+00:00 to 2013-01-14 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-14 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 8 entries, 2013-01-04 00:00:00+00:00 to 2013-01-15 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-15 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 9 entries, 2013-01-04 00:00:00+00:00 to 2013-01-16 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-16 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 10 entries, 2013-01-04 00:00:00+00:00 to 2013-01-17 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)
2013-01-17 PRINT <class 'pandas.core.frame.DataFrame'> DatetimeIndex: 11 entries, 2013-01-04 00:00:00+00:00 to 2013-01-18 00:00:00+00:00; Columns: 721 entries, 16385 to 18431; dtypes: float64(721)

I am curious about the log date versus the timestamp of the last entry; is Quantopian exposing possible postdictive errors?
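That bar-by-bar append can be reproduced in plain pandas. The symbols and prices below are invented one-row bars standing in for what update_data(data) returns each frame:

```python
import pandas as pd

# Invented one-row bars standing in for update_data(data)'s output;
# column names mimic the numeric sids seen in the log output
bars = [
    pd.DataFrame({"16385": [10.0], "18431": [20.0]},
                 index=pd.to_datetime(["2013-01-04"])),
    pd.DataFrame({"16385": [10.5], "18431": [19.5]},
                 index=pd.to_datetime(["2013-01-07"])),
]

df_prices = pd.DataFrame()
for bar in bars:
    # same pattern as the template: concatenate each new bar onto the frame
    df_prices = pd.concat([df_prices, bar])

print(df_prices.shape)  # (2, 2): one row per bar, one column per sid
```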

For my purposes it was important to have the dataframe, as I use pandas extensively in my research, so translating code from an IPython notebook to here is a walk in the park. Instead of working in the batch transform, I can use pandas methods anywhere in my code. A 30-day moving average becomes

pd.stats.moments.rolling_mean(context.df_prices, 30)

as well as all of these...
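A note for modern readers: `pd.stats.moments` was removed from pandas long ago, and the same moving average is now spelled with `.rolling()`. A toy series makes the equivalence concrete:

```python
import pandas as pd

# Toy price series; in the template this would be context.df_prices
prices = pd.Series(range(1, 61), dtype=float)

# old: pd.stats.moments.rolling_mean(prices, 30)
# new: the equivalent rolling-window mean
ma30 = prices.rolling(window=30).mean()
print(ma30.iloc[29], ma30.iloc[-1])  # 15.5 45.5
```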

EDIT: and I see now context.cash and context.portfolio.cash / context.portvalue and context.portfolio.positions_value are created so that updates can be made on demand as opposed to once per frame.

Exactly. It took me a while to figure out that cash isn't updated until the next frame, and that's why my if statements were failing to safeguard against margin.

Yeah, I would have imagined cash would be updated within the order function, but apparently not. Anyway, I'm understanding this much more clearly now.

I see your concern regarding the possible postdictive error. I'll look into this further but it may require someone from Quantopian's staff.

Yeah, the seeming postdictive error could just be a product of how the prices are timestamped. For instance, it is possible that the close for 2013-01-04 is timestamped at the following midnight, and thus shows as 2013-01-05 00:00:00. However, if I am wrong, this means the batch transform in every algorithm is plagued with look-ahead bias. Hopefully someone from the team can address this.
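The benign explanation can be illustrated with a small pandas example. The dates and prices below are hypothetical, chosen to match the January 2013 sessions in the log output; if closes are stamped at the following midnight, shifting the index back one day recovers the true session dates:

```python
import pandas as pd

# Hypothetical illustration (not Quantopian data): if each session's close is
# stamped at the *following* midnight, the bar labeled 2013-01-05 00:00 UTC
# actually holds 2013-01-04's close, so the log date lags the index by a day.
closes = pd.Series(
    [100.0, 101.5],
    index=pd.to_datetime(["2013-01-05", "2013-01-08"], utc=True),
)

# Shifting the index back one day recovers the true session dates
true_sessions = closes.index - pd.Timedelta(days=1)
print([d.isoformat() for d in true_sessions.date.tolist()])
# ['2013-01-04', '2013-01-07']
```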

Let me know if you have any more trouble working through my code.

Hmm... I think I correctly implemented your code but it would appear I somehow dipped into negative cash territory.

EDIT: YOUR program works just fine. MY function, pause, which buys and holds SPY so that the algorithm mirrors the benchmark, does not account for overdrawing the cash balance. So there is a minimal overdraw (negative cash balance), but it occurs before the algorithm "goes into effect". (Note that I use the pause function to let the algorithm mirror the benchmark until enough days of data are available, e.g. so that 15 days are available if the algo is based on a 15-day moving average.)

Strangely, I wasn't able to replicate your overdraft. I attached the backtest where I recorded both context.cash and the built-in context.portfolio.cash; neither seems to go below 0. I also logged pct_invested, and it seems to respect the 95%-of-total-equity restriction.

@Brandon - I just cloned the backtest you posted and reran it with no changes. On the first day I am seeing a purchase of $1,005,499.24 of SPY against $1,000,000 initial capital, and a cash balance of ($5,756.18). I see this under "Position Values" in the backtester. Note that the log entries come from generate_order only and not from the pause function, which calls order directly.

Ahh... I overlooked your pause function calling order directly, and I believe that to be the issue. Here's why: when you call the order method directly, context.cash is naive of the transaction and therefore doesn't trigger the circuit breaker. To fix the issue I simply placed my generate_order inside your pause function. Before, I had neglected to search through the position values, but after making the changes I went through every transaction and I don't believe there are any overdrafts. I find it odd that the previous backtest's overdraft wasn't visible from the recorded cash values... Nevertheless, see if this code works.

You are exactly correct Brandon. I had edited my previous post

EDIT: YOUR program works just fine. MY function, pause, which buys and holds SPY so that the algorithm mirrors the benchmark, does not account for overdrawing the cash balance. So there is a minimal overdraw (negative cash balance), but it occurs before the algorithm "goes into effect". (Note that I use the pause function to let the algorithm mirror the benchmark until enough days of data are available, e.g. so that 15 days are available if the algo is based on a 15-day moving average.)

The amended code you posted doesn't overdraw the account, true, but the pause function doesn't work either. That will be because pause is calling for more shares than generate_order will allow. Anyway, pause is merely a function to let the algorithm hug the benchmark early in the simulation, and it only calls order once regardless of how long the simulation is "paused". So even though there is a minimal overdraw, the algorithm does not borrow excessively and the performance of the simulation should be accurate.

EDIT: I'm not sure why your first backtest did not show the overdraw.

You can raise the pct threshold to .99 and gross down your order size by .99 and you still won't get an overdraft.

Brandon - I don't know how I have missed this to date. But this is indeed a fantastic contribution - thank you for posting it!


Hey Peter - I found that one a bit later on - thanks!

I will start a new thread on this when we have more progress, but Dan Dunn and I were just talking last night about a plan to add more curated lists of reusable code snippets like this (already on that list in my mind so far are examples for doing MOC orders, minimum-variance portfolios, and so on), something akin to the 'Top Shared Algos' list.

If you have other favorite posts that include reusable code, please send them my way. We're also thinking of another curated list with vetted strategy templates to start from: a sample pairs-trading strategy, a sample moving-average crossover, sample sector rotation and global market rotation strategies, and so on.

best, Jess

Hello Jess,

Both of those lists sound like a very good idea. A third suggestion would be the best-performing algos, but they would, of course, have to meet some basic criteria, i.e. trade multiple SIDs, trade across multiple years, and never have negative cash.

I would really like to see that third list include non-shared algos, to give us some idea of what is possible. Obviously these would just be the results and not the source, and we would never know what curve-fitting is happening behind the scenes.

P.

Absolutely - that is a whole other super exciting area we'll be able to start working on once we get to a critical mass with live trading! That is going to be a game changer in my opinion :)

Hey, so I know I for one would totally dig a curated list of re-useable code. Top contributors get some sort of gold star?? Something everyone in the community could strive for?