Google Cloud Datalab: reloading a sample from BigQuery raises ResponseNotReady
Using Datalab for analysis, I need to sample data from BigQuery. When I refresh my sample, I sometimes get this error:

/usr/lib/python2.7/httplib.pyc in getresponse(self, buffering)
   1059         #
   1060         if self.__state != _CS_REQ_SENT or self.__response:
-> 1061             raise ResponseNotReady()
   1062
   1063         args = (self.sock,)
ResponseNotReady:

This behaviour happens randomly while iterating (100 iterations). Any clue?
That is probably the underlying socket closing. I've opened https://github.com/GoogleCloudPlatform/datalab/issues/707 to track. Thanks!
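Until the linked issue is resolved, one possible workaround for a transient socket-level failure like this is to retry the sampling call when the exception is raised. The sketch below is an assumption, not part of the original answer: `retry_on_error` and `flaky_sample` are hypothetical names, and the `IOError` stands in for the real `httplib.ResponseNotReady` so the snippet runs without Datalab or BigQuery.

```python
import time

def retry_on_error(fn, exceptions, attempts=3, delay=0.0):
    """Call fn(), retrying up to `attempts` times on the given exceptions."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise  # exhausted retries: re-raise the last error
            time.sleep(delay)

# Hypothetical stand-in for the BigQuery sampling call: fails twice,
# then succeeds, mimicking the intermittent ResponseNotReady.
calls = {'n': 0}
def flaky_sample():
    calls['n'] += 1
    if calls['n'] < 3:
        raise IOError("ResponseNotReady")  # stand-in for httplib.ResponseNotReady
    return "sampled rows"

result = retry_on_error(flaky_sample, (IOError,), attempts=5)
```

In a real notebook you would pass the actual sampling call (wrapped in a lambda) and catch `httplib.ResponseNotReady` instead of `IOError`.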
Datalab Notebook: answering y/N in a prompt
Error on deploy
How can I load my CSV from Google Datalab into a pandas DataFrame?
Recommended approach for installing and using new kernels?
How do you logout/switch accounts?
Running sklearn 0.17 in Google Cloud Datalab
Sharing the datalab notebooks
Is there a list of the datalab keyboard shortcuts?
In Datalab, is it possible to pass data into a chart from Python?
What is the best way to stop execution of a cell on a Google Datalab notebook?
How do I quickly get data out of a Google Cloud Datalab notebook?
Is it possible to use the discovery module from the Google apiclient in Cloud Datalab?
%%chart line graph in Datalab based on BigQuery data not rendering
Notebook - Keyboard Shortcuts: Ctrl-K, Ctrl-J (Not Working)
Files written to disk get deleted after a while
Using persistent disks with Google Datalab