Cloud Datalab permissions - 403 on VM URL when sharing access
I've successfully created & hosted a Cloud Datalab VM. I can access the VM's URL from my own account (Project Owner), but my collaborators get an HTTP 403 on the *.blogspot.com URL unless I grant them the "Project Owner" role. How do I properly share access to the Jupyter server (Cloud Datalab VM) with my team?
I am assuming you are referring to the appspot.com URL for the App Engine Flex VM (formerly Managed VM), such as main-dot-datalab-dot-.appspot.com.

The version you have requires a minimum of Editor permission on the project, because a Datalab user can perform tasks beyond those scoped for Reader permission. Your collaborators should not require Owner permission.

Separately, we released a beta refresh yesterday, so I would urge you to upgrade soon: it fixes issues as well as adds features. See https://cloud.google.com/datalab/docs/quickstarts/ to get started with the refresh.

Thanks,
Dinesh Kulkarni (Product Manager, Datalab)
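For reference, a minimal sketch of granting a collaborator the Editor role with the gcloud CLI; `PROJECT_ID` and the email address below are placeholders for your own values:

```shell
# Grant a collaborator the project-level Editor role so they can reach
# the Datalab (appspot.com) URL. Replace PROJECT_ID and the email with
# your own project ID and the collaborator's Google account.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:collaborator@example.com" \
    --role="roles/editor"
```

The same binding can also be added from the IAM page in the Cloud Console by adding the user's email with the Editor role.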