I'm trying to write a really simple pipeline custom factor that returns the linear regression slope over a window of time for each security in the input.
My current code keeps giving back array index errors, and I feel there's probably a better/easier way to do this.
class LinearReg_slope(CustomFactor):
    # linear regression slope
    inputs = [USEquityPricing.close]
    window_length = 50

    def compute(self, today, asset_ids, out, Prices):
        # get the number of columns in the numpy input array
        rows, columns = Prices.shape
        # feed each column of the array to the talib function, which takes a 1D array
        for i in range(0, columns):
            out[i:] = talib.LINEARREG_SLOPE(Prices[i,:], rows-1)[-1]
The error I get is: IndexError: index 50 is out of bounds for axis 0 with size 50
When I simply do this:
    out[:] = talib.LINEARREG_SLOPE(Prices, 49)
I get an "AssertionError: real has wrong dimensions" on a completely different line of code, which is not very helpful.
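If I'm reading the shapes right, Prices comes in as (window_length, number of assets), so the IndexError is probably from indexing rows with the column counter (Prices[i,:] instead of Prices[:, i]) and from assigning to the slice out[i:] instead of the single element out[i]. Here's a sketch of what I think the corrected factor would look like (untested, and assuming the standard Quantopian pipeline imports):

    import talib
    from quantopian.pipeline import CustomFactor
    from quantopian.pipeline.data.builtin import USEquityPricing

    class LinearReg_slope(CustomFactor):
        # linear regression slope of close prices over the lookback window
        inputs = [USEquityPricing.close]
        window_length = 50

        def compute(self, today, asset_ids, out, prices):
            # prices has shape (window_length, num_assets): one column per asset
            rows, columns = prices.shape
            for i in range(columns):
                # pass column i (one asset's 1-D price history) to talib and keep
                # only the last value, i.e. the slope over the most recent window
                out[i] = talib.LINEARREG_SLOPE(prices[:, i], timeperiod=rows)[-1]

Even if that's right, I'd still like to know whether there's a cleaner, vectorized way to do this than looping over every column.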