Google Cloud Datalab: reloading a data set from BigQuery leads to ResponseNotReady
Using Datalab for analysis, I need to sample data from BigQuery. When I refresh my sample, I sometimes get this error:

/usr/lib/python2.7/httplib.pyc in getresponse(self, buffering)
   1059         #
   1060         if self.__state != _CS_REQ_SENT or self.__response:
-> 1061             raise ResponseNotReady()
   1062
   1063         args = (self.sock,)

ResponseNotReady:

This behaviour happens randomly while iterating (100 iterations). Any clue?
That is probably the underlying socket closing. I've opened https://github.com/GoogleCloudPlatform/datalab/issues/707 to track. Thanks!
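Since the traceback points at httplib's getresponse failing when the socket has already closed, a common workaround while the linked issue is open is to retry the sampling call. This is a minimal sketch, not part of Datalab's API; the helper name and the wrapped call are illustrative:

```python
import random
import time

try:
    from http.client import ResponseNotReady  # Python 3
except ImportError:
    from httplib import ResponseNotReady  # Python 2, as in the Datalab kernel

def retry_on_response_not_ready(fn, max_attempts=5, base_delay=1.0):
    """Call `fn`, retrying when the underlying HTTP connection was
    closed before a response could be read."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except ResponseNotReady:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            # back off exponentially, with jitter to avoid retry bursts
            time.sleep(base_delay * (2 ** (attempt - 1)) * (1 + random.random()))
```

Usage would look like `rows = retry_on_response_not_ready(lambda: table.sample())`, where `table.sample()` stands in for whichever Datalab/BigQuery call raises the error in your loop.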
How can I programmatically give Cloud ML access to a bucket?
Running cloud datalab kernel on my own server?
How can I get the Cloud ML service account programmatically in Python?
Google Datalab: how to import pickle
Is text the only content type for the %%storage magic function in Datalab?
Do I need to update gcloud components as indicated in docker startup
How to import user-written custom modules in google datalab?
Access to Google Cloud Datalab Fails with ssh Error
Cloud Datalab permissions - 403 on VM URL when sharing access
Do I need to manually specify the project on docker Datalab?
Datalab does not start correctly
Datalab notebook: answering y/N in a prompt
Error on deploy
How can I load my CSV from Google Datalab into a pandas DataFrame?
Recommended approach for installing and using new kernels?
How do you logout/switch accounts?