Is there any method to get TTM data :)


Many of the ratios calculated on a TTM basis can be found in the Fundamentals dataset: link

However, should that not be enough, I recommend creating a CustomFactor with a window length of 365 (or whatever you're looking for) that takes in price data, fundamentals, etc. and calculates whatever metric you're after. Good luck!

Isn't there a risk that 365 days will not contain 4 quarters of fundamentals?

It seems feasible that announcement dates will shift around by a few days due to weekends, holidays, etc.

365 days will almost always contain 4 quarters of fundamentals, but not for the obvious reason. The first thing to note is that all days on the Quantopian platform (e.g. window_length in a factor) are trading days and NOT calendar days. There are roughly 252 trading days in a calendar year. However, US companies have 45 calendar days to report previous-quarter data (about 35 trading days), plus it takes a few days to record this data. Companies can also get an extension and file after 45 days, though they probably won't miss a whole quarter. So, to be on the safe side, maybe use a number bigger than 252 + 35 + 5 = 292 trading days, or to be on the 'paranoid' side use 252 + (252/4) = 315 trading days to capture 4 quarters of data.
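The window-length arithmetic above can be spelled out as a couple of constants (the names here are just illustrative):

```python
TRADING_DAYS_PER_YEAR = 252
FILING_LAG = 35    # ~45 calendar days allowed to file, in trading days
RECORDING_LAG = 5  # a few extra days to record the data

# 'safe' window: one year of trading days plus the reporting lags
safe_window = TRADING_DAYS_PER_YEAR + FILING_LAG + RECORDING_LAG
# 'paranoid' window: one year plus a full extra quarter
paranoid_window = TRADING_DAYS_PER_YEAR + TRADING_DAYS_PER_YEAR // 4

print(safe_window, paranoid_window)  # 292 315
```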

Back to the original question in this post "Is there any method to get TTM data" . Yes. Many of the FactSet data fields are available as Last Twelve Months (LTM) values which are the same as TTM. See the docs https://www.quantopian.com/docs/data-reference/factset_fundamentals#last-twelve-months-ltm.

For quarterly Morningstar fundamentals one needs to use a small custom factor to sum up the last 4 quarterly values. Doug Baldwin posted a very good vectorized version using numpy to do that in this post https://www.quantopian.com/posts/trailing-twelve-months-ttm-with-as-of-date.
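To sketch the idea (this is NOT the vectorized version from the linked post, just a simplified per-column version of the same technique, with illustrative names): the asof_date column changes exactly when a new quarter is reported, so taking the last row of each run of identical asof_dates yields one value per quarter, and the TTM value is the sum of the last four.

```python
import numpy as np

def ttm_sum(values, asof_dates):
    """Sum the last 4 reported quarterly values per column.

    values, asof_dates: 2D arrays of shape (window_length, assets),
    as passed to a CustomFactor's compute method.
    Returns nan for columns with fewer than 4 quarters in the window.
    """
    n_days, n_assets = values.shape
    out = np.full(n_assets, np.nan)
    for col in range(n_assets):
        dates = asof_dates[:, col]
        # Mark the last row of each run of identical asof_dates
        # (i.e. the most recent value recorded for each quarter)
        last_of_qtr = np.append(dates[1:] != dates[:-1], True)
        qtr_values = values[last_of_qtr, col]
        if qtr_values.size >= 4:
            out[col] = qtr_values[-4:].sum()
    return out
```

For example, a column whose asof_dates take 4 distinct values in the window gets the sum of the 4 corresponding values, while a column with only 3 distinct asof_dates gets nan.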

I also coded a pandas version to do the same thing. It's slower than the numpy version but took only 10 minutes to write. I find the ease of use of pandas makes it a great tool for initial coding. Unless time is critical, it often does the job and is faster to code.


import numpy as np
import pandas as pd
from quantopian.pipeline import CustomFactor

class TrailingTwelveMonths_Pandas(CustomFactor):
    """
    Gets the sum of the last 4 reported values using pandas.
    Uses a window length of ~5 quarters (252/4 x 5) to ensure we have at least 4 quarters.
    Requires 2 inputs: the dataset values and the associated asof_dates.
    """
    window_length = 315

    def compute(self, today, assets, out, values, asof_date):
        # Make a single multi-index dataframe with asof_date and values as columns
        # The days are index level=0 and the securities are level=1
        date_df = pd.DataFrame(asof_date).stack().rename('date')
        value_df = pd.DataFrame(values).stack().rename('value')
        df = pd.concat([date_df, value_df], axis=1)
        # Add a column with the asset to make drop_duplicates easier
        df['asset'] = df.index.get_level_values(level=1)

        # Drop duplicate dates, keeping only the latest value for each date
        df_dropped = df.drop_duplicates(['date', 'asset'], keep='last')
        # Take only the latest 4 dates, then filter out any groups without exactly 4 qtrs of data
        df_top_4 = df_dropped.groupby(level=1).tail(4)
        df_exactly_4 = df_top_4.groupby(level=1).filter(lambda group: group.date.size == 4)

        # Sum the values where there are exactly 4 unique asof_dates to get TTM
        value_sum = df_exactly_4.groupby(level=1).value.sum()
        # Make a series of nans to hold our output values (defaults to nan if no TTM value)
        output_sum = pd.Series(np.nan, index=range(len(assets)))
        # Put our TTM values into the series
        output_sum.loc[value_sum.index] = value_sum
        out[:] = output_sum
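For anyone who wants to check the logic outside the Quantopian research environment, the body of compute can be exercised as a plain function on synthetic arrays (the function name and sample data below are made up for illustration; the steps are the same as in the factor above):

```python
import numpy as np
import pandas as pd

def ttm_pandas(values, asof_date, n_assets):
    # Same steps as the compute method: stack into a multi-index frame,
    # keep the last row per (asof_date, asset), require exactly 4 quarters
    date_df = pd.DataFrame(asof_date).stack().rename('date')
    value_df = pd.DataFrame(values).stack().rename('value')
    df = pd.concat([date_df, value_df], axis=1)
    df['asset'] = df.index.get_level_values(level=1)
    df_dropped = df.drop_duplicates(['date', 'asset'], keep='last')
    df_top_4 = df_dropped.groupby(level=1).tail(4)
    df_exactly_4 = df_top_4.groupby(level=1).filter(lambda g: g.date.size == 4)
    value_sum = df_exactly_4.groupby(level=1).value.sum()
    output_sum = pd.Series(np.nan, index=range(n_assets))
    output_sum.loc[value_sum.index] = value_sum
    return output_sum.values

# Two assets: the first has 4 distinct asof_dates in the window, the second only 3
values = np.array([[10., 5.], [10., 5.], [20., 6.], [20., 6.], [30., 7.], [40., 7.]])
asof = np.array([[1, 1], [1, 1], [2, 2], [2, 2], [3, 3], [4, 3]])
result = ttm_pandas(values, asof, 2)
# result[0] is 10 + 20 + 30 + 40 = 100.0; result[1] is nan (only 3 quarters)
```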

Attached is a notebook showing both approaches.

Disclaimer

The material on this website is provided for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services by Quantopian. In addition, the material offers no opinion with respect to the suitability of any security or specific investment. No information contained herein should be regarded as a suggestion to engage in or refrain from any investment-related course of action as none of Quantopian nor any of its affiliates is undertaking to provide investment advice, act as an adviser to any plan or entity subject to the Employee Retirement Income Security Act of 1974, as amended, individual retirement account or individual retirement annuity, or give advice in a fiduciary capacity with respect to the materials presented herein. If you are an individual retirement or other investor, contact your financial advisor or other fiduciary unrelated to Quantopian about whether any given investment idea, strategy, product or service described herein may be appropriate for your circumstances. All investments involve risk, including loss of principal. Quantopian makes no guarantees as to the accuracy or completeness of the views expressed in the website. The views are subject to change, and may have become unreliable for various reasons, including changes in market conditions or economic circumstances.

Hi Dan, I am a newbie here, trying to get fundamental LTM/TTM data. Is there a reason for varying results for np vs pandas version?

I noticed a few differences in your notebook (numpy result vs pandas result):
Equity(1332 [CCEP]) 0.30 NaN
Equity(1419 [CERN]) 0.18 NaN
Equity(52529 [ACA]) 0.10 NaN
Equity(52594 [ETRN]) 0.86 NaN

@Jason Teo Good catch! I looked over the numpy version of the custom factor and realized it will return a value if there are fewer than 4 quarters of data. It just sums what it has. The pandas version will return nan. This probably isn't what one wants?

That's the reason for the differences above. The numpy version is returning the sum of the quarters it has while the pandas version returns nan. I've made a few changes to the notebook to show this including a column for the count of the quarters.

Again, good catch.