date-constrained algorithm execution

I wrote some simple code to illustrate how algorithm execution could be constrained to specified dates. In the example, I specify a list of dates on which to execute algorithm #1 and a list of dates on which to execute algorithm #2. The approach could be extended to an arbitrary number of algorithms and their associated dates.

For this trivial example, algorithm #1 submits buy orders and algorithm #2 submits sell orders. The idea is to insert non-trivial algorithms applicable to various historical dates.

I'd appreciate guidance on how to improve the framework. Specifying the dates is kind of clunky; perhaps there is a cleaner approach (e.g. reading them from a file)?

Also, perhaps it is straightforward to extend the approach down to the hour/minute? I welcome suggestions.
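For concreteness, here is a standalone sketch of the date-gating idea described above. It is plain Python rather than the full Quantopian API, and the date lists and algorithm functions are placeholders, not the original code:

```python
from datetime import date

# Hypothetical execution dates for each algorithm (placeholders).
BUY_DATES = {date(2012, 5, 4), date(2012, 6, 1)}
SELL_DATES = {date(2012, 5, 18), date(2012, 6, 15)}

def algo_1(today):
    return "buy"   # stand-in for submitting buy orders

def algo_2(today):
    return "sell"  # stand-in for submitting sell orders

def handle_data(today):
    """Dispatch to whichever algorithm is scheduled for today, if any."""
    if today in BUY_DATES:
        return algo_1(today)
    if today in SELL_DATES:
        return algo_2(today)
    return None  # no algorithm scheduled for this date

print(handle_data(date(2012, 5, 4)))   # buy
print(handle_data(date(2012, 5, 18)))  # sell
print(handle_data(date(2012, 5, 7)))   # None
```

Extending to more algorithms would just mean adding more (date set, function) pairs, or a single dict mapping dates to functions.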

7 responses

@Grant,

There is a class in the datetime module, also called datetime, with microsecond resolution, so you could specify a time down to the minute with:

import datetime

def initialize(context):  
    context.a_minute = datetime.datetime(2009, 12, 2, 13, 31)  

The question that pops to mind is: where would these dates come from? Is there a logic or rule to choosing them? If there is, it may make more sense to code the rule rather than statically specify the dates. If there isn't a rule, then I would caution you that hardcoding the dates carries extremely heavy lookahead bias. Choosing buy and sell dates is akin to choosing Apple for a long position: you're likely picking the historical buy/sell dates based on what you know now.

Could you tell us a bit more about the dates?


Hello John,

I should have mentioned it...the dates in the example above were pulled out of the air as an illustration.

I'm looking for guidance on the mechanics of using Quantopian to look for correlations between historical real-world and market events (e.g. price movements). If the correlations persist into the future, then trading algorithms could be developed around them (as you say, "code the rule"). The first step is to be able to provide Quantopian the external data for analysis. The simplest example I could cook up was to buy and sell on specified days. This enables backtesting a simple trade on "X" strategy (where "X" is news, weather, internet traffic, Google lookups, etc.). I don't have any "X" events to test yet, but my motivation to find some will go up if it is convenient to do the backtesting on Quantopian. Make sense?

By the way, here's a relevant link that may be of interest: https://www.recordedfuture.com/this-is-recorded-future/recorded-future-api/

Do the Quantopian backtest database "event" entries always occur at integer/whole minute values?

@Grant - that makes perfect sense.

Building a basic datasource to wrap Recorded Future into Zipline would be quite easy; Python has excellent JSON support. There may be some tricky performance problems, and I'm not clear on whether Recorded Future maintains a history of the events they mine. Do you know?
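As a rough illustration of the JSON-support point, raw events could be parsed and put in chronological order before being fed to a backtester. The field names below are hypothetical placeholders, not Recorded Future's actual schema:

```python
import json
from datetime import datetime

# Hypothetical external events as a JSON string (made-up schema).
raw = '''[
  {"timestamp": "2012-05-04T13:31:00", "type": "news", "score": 0.8},
  {"timestamp": "2012-05-03T14:02:00", "type": "news", "score": 0.2}
]'''

events = json.loads(raw)
for e in events:
    # Convert the ISO-style string into a datetime for sorting.
    e["timestamp"] = datetime.strptime(e["timestamp"], "%Y-%m-%dT%H:%M:%S")

# Zipline-style sources must emit events in chronological order.
events.sort(key=lambda e: e["timestamp"])
print([e["timestamp"].isoformat() for e in events])
# ['2012-05-03T14:02:00', '2012-05-04T13:31:00']
```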

The data in our database is minute bars. Our db has open, high, low, close, and total volume for each minute the exchanges are open. The time will always be the end of the bar, and should be an integer minute with no seconds or microseconds. So, the first bar of a trading day will be datetime-stamped 9:31 eastern. Also note that all times are quoted in UTC, which has no daylight saving time.
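To make the time-zone point concrete: 9:31 a.m. Eastern on a winter date is 14:31 UTC (EST is UTC-5), while the same wall-clock time in summer is 13:31 UTC (EDT is UTC-4). A quick check using the standard library's zoneinfo module (not part of the original thread, which predates it):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

eastern = ZoneInfo("America/New_York")
utc = ZoneInfo("UTC")

# First minute bar of a winter trading day (EST, UTC-5)...
winter = datetime(2012, 1, 4, 9, 31, tzinfo=eastern).astimezone(utc)
print(winter)  # 2012-01-04 14:31:00+00:00

# ...and of a summer trading day (EDT, UTC-4).
summer = datetime(2012, 7, 5, 9, 31, tzinfo=eastern).astimezone(utc)
print(summer)  # 2012-07-05 13:31:00+00:00
```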

Thanks John,

I just set up an account on Recorded Future to get a feel for how it works.

Can users access the open/high/low/close/volume for each minute? On your help page, I only see:

price is the price of this security as of this event.
volume is the volume of the last trade for this event.

I'll see if I can modify my code to constrain trading down to the second, but it'll be clunky. It would be better if the external data could somehow be merged/synced with the internal Quantopian backtest data. Any ideas?

@Grant,

We have the data, but unfortunately there is a bug in our api preventing algorithms that reference the high/low/open/close from passing validation. Actually, we have a frustrating hole in our internal unit tests. Whenever we add to the api, we need to add new tests. In this case I forgot to add test cases for those fields, and as a result our unit tests are 100% passing, because they never attempt to access the new fields. The bug is logged now, and we'll fix it soon.

Regarding your data sync: Zipline provides a component that serializes the data in chronological order across multiple sources. You just need to guarantee that your events are in chronological order as you generate them, and then Zipline will make sure they are subsequently ordered with respect to all other events.
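The serialization described above is essentially a k-way merge of sorted streams. In plain Python the same idea looks like this (hypothetical event tuples, not the actual Zipline component):

```python
import heapq
from datetime import datetime

# Two sources, each already in chronological order.
market_bars = [
    (datetime(2012, 5, 4, 13, 31), "bar", "AAPL"),
    (datetime(2012, 5, 4, 13, 33), "bar", "AAPL"),
]
external_events = [
    (datetime(2012, 5, 4, 13, 32), "news", "AAPL upgrade"),
]

# heapq.merge interleaves already-sorted iterables into one
# chronological stream, comparing on the leading datetime.
merged = list(heapq.merge(market_bars, external_events))
print([e[0].minute for e in merged])  # [31, 32, 33]
```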

You can safely assume that seconds and microseconds will always be zero for the datetime of events, so I don't think you need to lock the events down to the second.
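Given that convention, an external timestamp can be aligned to Quantopian's whole-minute bars by simply zeroing the sub-minute fields. A minimal helper (not from the original thread):

```python
from datetime import datetime

def floor_to_minute(dt):
    """Snap a timestamp to the start of its minute, matching the
    whole-minute convention of the backtest database."""
    return dt.replace(second=0, microsecond=0)

ts = datetime(2012, 5, 4, 14, 31, 12, 345678)
print(floor_to_minute(ts))  # 2012-05-04 14:31:00
```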

Am I answering your question?

Thanks John,

I'm clear on the open/high/low/close prices and total volume for each minute--you are working to make all of the values accessible for backtesting.

With regard to your Zipline data sync discussion, I'm not following all of the details. As I understand it, your current definition of an "event" for a given security is an entry in the database with a whole-minute UTC timestamp. The event is simply a historical record of the open/high/low/close prices and total volume for that security. Your backtester is "event-driven" in the sense that, for a given security, it only interacts with the market when there is an event indicating a market exists for the security (Dan Dunn provides a clear explanation here). So, what you call an event might be better defined as a "market trade event," to be distinguished from other database entries with timestamps ("extramarket events"). By definition, extramarket events don't indicate the existence of a market, so order submission and fulfillment can't be triggered off of them (although perhaps extramarket events could be used to schedule future order submissions triggered by market trade events).
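A sketch of that last idea, using an extramarket event only to queue an order, which is then submitted at the next market trade event. All class and method names here are hypothetical, not Quantopian or Zipline API:

```python
from datetime import datetime

class DeferredOrderBook:
    """Queue orders triggered by extramarket events and release them
    only when a market trade event confirms a market exists."""

    def __init__(self):
        self.pending = []
        self.submitted = []

    def on_extramarket_event(self, order):
        # No market is known to exist at this instant, so just queue.
        self.pending.append(order)

    def on_market_trade_event(self, bar_time):
        # A minute bar confirms a market: submit everything queued.
        for order in self.pending:
            self.submitted.append((bar_time, order))
        self.pending = []

book = DeferredOrderBook()
book.on_extramarket_event("buy 100 AAPL")  # e.g. triggered by a news event
book.on_market_trade_event(datetime(2012, 5, 4, 13, 31))
print(book.submitted)  # [(datetime(2012, 5, 4, 13, 31), 'buy 100 AAPL')]
```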