How to set the optimal value for a trigger signal?

I have an indicator that oscillates between -50 and +50.
Now I have to set an optimal signal value at which to go long or short.

I've thought of something like this:

    if indicator > (50 / indicator.mean()) * 1.25 and isIncreasing(indicator):
        pass  # go long!
    elif indicator < (-50 / indicator.mean()) * 0.85 and isDecreasing(indicator):
        pass  # go short!
    else:
        pass  # chillax

Therefore, to sum it up:

  • The mean is usually around -5 to +5, give or take.
  • If the indicator is 25% above the max/mean ratio and is increasing: go long.
  • If the indicator is 25% below the min/mean ratio and is decreasing: go short.
  • If none of the previous conditions is triggered, go grab a coffee. :)
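The rules above can be sketched as a small self-contained function. Note that `is_increasing`/`is_decreasing` are hypothetical helpers I'm assuming here (the original `isIncreasing`/`isDecreasing` were not shown), and the zero-mean case is simply skipped:

```python
import numpy as np

def is_increasing(values, lookback=3):
    # Hypothetical helper: True if the last `lookback` readings are strictly rising.
    recent = values[-lookback:]
    return all(a < b for a, b in zip(recent, recent[1:]))

def is_decreasing(values, lookback=3):
    # Hypothetical helper: True if the last `lookback` readings are strictly falling.
    recent = values[-lookback:]
    return all(a > b for a, b in zip(recent, recent[1:]))

def signal(history):
    # `history` holds the indicator's recent values; the last entry is the
    # current reading. This mirrors the ratio thresholds from the post.
    values = np.asarray(history, dtype=float)
    current = values[-1]
    mean = values.mean()
    if mean == 0:
        return "chillax"  # the ratio threshold is undefined when the mean is zero
    if current > (50 / mean) * 1.25 and is_increasing(values):
        return "long"
    if current < (-50 / mean) * 0.85 and is_decreasing(values):
        return "short"
    return "chillax"
```

For example, `signal([0, 5, 10, 20])` has mean 8.75, so the long threshold is (50 / 8.75) * 1.25 ≈ 7.1, and since the series is rising it returns `"long"`.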

The point is to stop before the indicator changes direction.
But, as you can see, the previous formula is merely intuitive. And what if the mean is zero? Then the division would be undefined...
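One common way to sidestep the zero-mean division (an alternative I'm suggesting, not something from the original post) is to normalize by the indicator's standard deviation instead, i.e. a z-score band; the `entry` parameter here is a hypothetical tuning knob:

```python
import numpy as np

def zscore_signal(history, entry=1.25):
    # Normalize the current reading by the series' own spread instead of its mean,
    # so a zero mean is no longer a problem (only a zero-variance series is).
    values = np.asarray(history, dtype=float)
    std = values.std()
    if std == 0:
        return "chillax"  # flat series: no meaningful z-score
    z = (values[-1] - values.mean()) / std
    rising = values[-1] > values[-2]
    if z > entry and rising:
        return "long"
    if z < -entry and not rising:
        return "short"
    return "chillax"
```

This keeps the "unusually high and rising / unusually low and falling" intent, but the threshold now adapts to the indicator's recent volatility rather than depending on its mean being safely away from zero.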

So, is there any formula to calculate the optimal value at which to place my signal, so that I don't go long right before my indicator drops? Maybe using some kind of pattern regression that predicts the next movement?
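As a baseline for that "pattern regression" idea (my sketch, not a Quantopian built-in), you could fit a least-squares line to the last few readings and extrapolate one step ahead, then only enter when the forecast agrees with the trade's direction:

```python
import numpy as np

def forecast_next(history, lookback=5):
    # Fit a straight line to the last `lookback` values (x = 0, 1, ..., n-1)
    # and extrapolate it one step ahead (x = n).
    values = np.asarray(history[-lookback:], dtype=float)
    x = np.arange(len(values))
    slope, intercept = np.polyfit(x, values, 1)
    return slope * len(values) + intercept
```

For instance, `forecast_next([1, 2, 3, 4, 5])` extrapolates to 6.0; if the forecast flips against the current trend, you could skip the entry to avoid buying right before a reversal. A one-step linear fit is a crude predictor, but it gives you a principled place to start before trying fancier models.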