Dealing with large zip uploads and extracting using Google Cloud
I am trying to build a site that e-learning courses (zips of HTML/CSS/JS/media) can be uploaded to. I am using Go on Google App Engine with Google Cloud Storage to store the zips and the extracted courses. I will explain the development dead ends I have encountered.

My first thought was to use the resumable upload functionality of Cloud Storage to send the zip file, then read it with Go on App Engine, unzip the files, and write them back to Cloud Storage. It took a while to read and understand the documentation, but this worked perfectly for my 2MB test zip. It failed when I tried it with a modest 67MB zip: I had run into a hidden limitation when accessing Cloud Storage from App Engine. No matter which client I used, there was a 10MB/32MB limit. I tried both the old and new libraries as well as Blobstore. I also looked into writing a custom OAuth2-supporting client library on top of sockets, but hit too many dead ends.

Giving up on that approach, I thought that even though it would mean more uploading, extracting on the client (browser) side and then uploading each file with its own resumable upload would make the most sense. After exploring a few libraries I had in-browser extraction working and ready to upload. I wrote a handler that created the datastore entry for the upload, selected a location for the upload, and created all the upload URLs. When testing this I found that it took a long time to work through long lists of files (anything over 100), so I decided that, since I was using Go, it would make sense to make the requests concurrently. I spent a day or two getting that working. After dealing with some CORS issues that oddly had not shown up earlier, I had everything working. Then I started getting errors when stress testing my approach with a large (500MB) zip/course.
The uploads would fail, and I discovered that when trying to send 300+ files to generate upload URLs I was getting the following error:

    Post http://localhost:62394: dial tcp [::1]:62394: connectex: No connection could be made because the target machine actively refused it.

I have no idea how to diagnose this. I don't know if I am hitting a rate limit, and if I am, I don't know how to avoid it. This seems like it should be simple, but it is anything but. I have a few options I can pursue:

- Try to create the resumable uploads with a batch operation (https://cloud.google.com/storage/docs/json_api/v1/how-tos/batch); however, batch operations to /upload are not supported.
- Request each upload URL one by one with its own API call.
- Make requesting the URLs happen over the Channel API (https://cloud.google.com/appengine/docs/go/channel/reference).
- Spend the next week or more adding layers of retries and fallback error handling.
- Try another solution.

This should be simple. How should this be done?