ConnectionError raised when calling get_fundamentals in research

Hi everyone,

I get a 'ConnectionError' when I run this code (copied from the lecture "Quantopian Lecture Series: Ranking Universes by Factors"):

end_date = '2015-01-01'
fundamentals = init_fundamentals()
pe_data = get_fundamentals(query(fundamentals.valuation_ratios.pe_ratio), end_date, range_specifier='36m')

The error is:


ConnectionError                           Traceback (most recent call last)
in ()
      5 fundamentals = init_fundamentals()
      6 # WARNING: The following line will take a while to run, as it is fetching a large amount of data.
----> 7 pe_data = get_fundamentals(query(fundamentals.valuation_ratios.pe_ratio), end_date, range_specifier='36m')
      8 #pe_data = pe_data['pe_ratio'].T.dropna()
      9

/home/qexec/src/qexec_repo/qexec/research/api.py in get_fundamentals(query, base_date, range_specifier, filter_ordered_nulls)
    104         base_date,
    105         range_specifier,
--> 106         filter_ordered_nulls,
    107     )
    108

/home/qexec/src/qexec_repo/qexec/research/api.py in get_fundamentals_with_namespace(ns, available_date_tbl, query, base_date, range_specifier, filter_ordered_nulls)
    126         filter_ordered_nulls,
    127     )
--> 128     return client.get_fundamentals(query_dict)
    129
    130

/home/qexec/src/qexec_repo/qexec/research/web/client.pyc in get_fundamentals(self, query_dict)
    281         query_dict['params'],
    282         query_dict['columns'],
--> 283         query_dict['return_panel'],
    284     )
    285

/home/qexec/src/qexec_repo/qexec/research/web/helpers.pyc in _method(*args, **kwargs)
    126     @wraps(f)
    127     def _method(*args, **kwargs):
--> 128         response = f(*args, **kwargs)
    129         response.raise_for_status()
    130

/home/qexec/src/qexec_repo/qexec/research/web/client.pyc in _make_get_fundamentals_request(self, query, params, columns, return_panel)
    267             ),
    268         ),
--> 269         headers={'Content-Type': 'application/json'},
    270     )
    271     return resp

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in post(url, data, json, **kwargs)
    107     """
    108
--> 109     return request('post', url, data=data, json=json, **kwargs)
    110
    111

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in request(method, url, **kwargs)
     48
     49     session = sessions.Session()
---> 50     response = session.request(method=method, url=url, **kwargs)
     51     # By explicitly closing the session, we avoid leaving sockets open which
     52     # can trigger a ResourceWarning in some cases, and look like a memory leak

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    463         }
    464         send_kwargs.update(settings)
--> 465         resp = self.send(prep, **send_kwargs)
    466
    467         return resp

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in send(self, request, **kwargs)
    571
    572         # Send the request
--> 573         r = adapter.send(request, **kwargs)
    574
    575         # Total elapsed time of the request (approximately)

/usr/local/lib/python2.7/dist-packages/requests/adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies)
    413
    414         except (ProtocolError, socket.error) as err:
--> 415             raise ConnectionError(err, request=request)
    416
    417         except MaxRetryError as e:

ConnectionError: ('Connection aborted.', BadStatusLine("''",))


Can anyone suggest what I should try in order to fix it?

Best,
Thomas

4 responses

Thomas, it looks like you're getting a timeout. That query clearly worked when the notebook was originally published, but higher load somewhere in the stack may now be preventing it from finishing. You could try to reduce the size of the query either by filtering down the number of companies returned (right now that query is unconstrained) or by reducing the number of data points from 36 months' worth to a smaller window.

Thanks
Josh
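[Editor's note] Josh's second suggestion can be sketched as splitting the 36-month lookback into smaller windows and issuing one get_fundamentals call per window. The window-splitting below is plain Python; the Quantopian call itself is left as a comment, since it only runs inside the research environment, and the 12-month chunk size is just an illustrative choice:

```python
from datetime import date

def month_windows(end, months, chunk):
    """Split a lookback of `months` months ending at `end` into
    (range_specifier, base_date) pairs of at most `chunk` months each.

    Assumes `end.day` is valid in every month stepped through
    (e.g. use the 1st of the month)."""
    windows = []
    remaining = months
    current_end = end
    while remaining > 0:
        step = min(chunk, remaining)
        windows.append(('%dm' % step, current_end.isoformat()))
        # Move the base date back by `step` months.
        month_index = current_end.year * 12 + (current_end.month - 1) - step
        current_end = current_end.replace(year=month_index // 12,
                                          month=month_index % 12 + 1)
        remaining -= step
    return windows

# Each window could then be fetched separately (inside research only):
# for spec, base in month_windows(date(2015, 1, 1), 36, 12):
#     chunk = get_fundamentals(query(fundamentals.valuation_ratios.pe_ratio),
#                              base, range_specifier=spec)

print(month_windows(date(2015, 1, 1), 36, 12))
# prints: [('12m', '2015-01-01'), ('12m', '2014-01-01'), ('12m', '2013-01-01')]
```

The per-window results would still need to be concatenated afterwards, but each individual request stays small enough to finish before the server drops the connection.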


Hi Josh Payne,

Thanks a lot for that very quick reply. By filtering the data, I can successfully query fundamentals.

Have a nice day,
Thomas

Josh, I don't think this is the whole story. It's not the amount of data; there's some issue with range_specifier.

As an example, see the attached notebook. I ask for 1 factor (net_income) for 1 stock. I set range_specifier = '2y'. This should be 2 datapoints, right? Still fails.

Has there been any progress on this? I am experiencing the same problem with data for only 10 quarters.
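[Editor's note] The BadStatusLine("''",) wrapped in the ConnectionError means the server closed the connection without sending any response line, which fits the timeout explanation above. When the failure is transient, simply retrying the call can work around it. Below is a generic retry sketch; the `retry` helper and `flaky` stub are illustrative, not part of the Quantopian API:

```python
import time

def retry(fn, attempts=3, delay=0.0, exceptions=(Exception,)):
    """Call fn(), retrying up to `attempts` times on the given exceptions,
    with a linearly growing pause between tries."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise  # out of attempts: re-raise the last error
            time.sleep(delay * attempt)

# Demonstration with a stub that fails twice, then succeeds.
calls = {'n': 0}
def flaky():
    calls['n'] += 1
    if calls['n'] < 3:
        raise ConnectionError('Connection aborted.')
    return 'ok'

print(retry(flaky, attempts=5))  # prints: ok
```

In the research environment the real call would be wrapped the same way, e.g. `retry(lambda: get_fundamentals(...), attempts=3)`; if every attempt times out, the query itself is too large and needs to be filtered or chunked as discussed above.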