gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access Cloud Storage using a Service Account:

GCE> gcloud auth list
Credentialed accounts:
 - 1234567890-compute@developer.gserviceaccount.com (active)

I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also granted it the WRITE ACL on the bucket I would like it to copy a file to:

local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket

But then the following command fails:

GCE> gsutil cp test.txt gs://mybucket/logs

(I also made sure that "logs" is created under "mybucket").

The error message I get is:

Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission               0 B  

What am I missing?

One other thing to check is that you set up the appropriate access scopes when creating the GCE VM. Even if a VM has a service account attached, it must also be granted the devstorage scopes in order to access GCS.

For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.

See the section on Preparing an instance to use service accounts for details.

Note: the default Compute Engine service account is created with very limited scopes (including read-only access to GCS). This is done as a safeguard because the default service account has Project Editor IAM permissions. This is typically not a problem with user-created service accounts, since those get full scope access by default.

After adding necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
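A quick way to verify which scopes the VM actually received is to query the instance metadata server. Note this only works from inside the GCE instance; the commands below are a sketch of that check:

```shell
# Ask the metadata server which scopes were granted to this instance's
# service account at creation time.
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"

# If https://www.googleapis.com/auth/devstorage.read_write (or
# devstorage.full_control, or cloud-platform) is not in the list,
# writes will fail with 403 regardless of IAM or ACL permissions.

# After changing scopes, clear gsutil's cached credentials:
rm -rf ~/.gsutil
```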

An alternative is to authenticate gsutil with your own account's credentials:

gsutil config -b

Then open the URL it provides, click "Allow", copy the verification code, and paste it into the terminal.

You have to log in with an account that has the permissions you need for that project:

gcloud auth login
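After logging in, you can confirm which account is active before retrying the copy. The bucket path below simply reuses the one from the question:

```shell
# Confirm the newly authorized account is the active one:
gcloud auth list

# Then retry the copy; it should succeed if that account has
# write access to the bucket:
gsutil cp test.txt gs://mybucket/logs
```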

I have written an answer to this question since I cannot post comments:

This error can also occur if you're running the gsutil command with a sudo prefix in some cases.
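This happens because sudo typically resets $HOME, so gsutil looks for cached credentials under root's home directory (e.g. /root/.gsutil) instead of yours. A sketch of the symptom and one workaround, with the bucket name being just an example:

```shell
# Under sudo, gsutil reads state from root's $HOME, which may hold
# no credentials (or stale ones), producing a 403:
sudo gsutil ls gs://mybucket          # may fail with AccessDeniedException

# Workaround: preserve your own HOME so gsutil finds your credentials:
sudo env HOME=$HOME gsutil ls gs://mybucket
```

Alternatively, run gsutil config (or gcloud auth login) once as root so root has its own credentials.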

  1. Stop the VM.
  2. Go to VM instance details.
  3. Under "Cloud API access scopes", select "Allow full access to all Cloud APIs", then click "Save".
  4. Restart the VM and delete ~/.gsutil.
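The same steps can be done from the gcloud CLI; the instance name and zone below are placeholders for your own values, and scopes can only be changed while the instance is stopped:

```shell
# Stop the instance (scopes cannot be changed while it is running):
gcloud compute instances stop my-instance --zone us-central1-a

# Update the scopes; storage-rw grants devstorage.read_write.
# cloud-platform would grant full access to all Cloud APIs instead.
gcloud compute instances set-service-account my-instance \
  --zone us-central1-a \
  --scopes storage-rw

gcloud compute instances start my-instance --zone us-central1-a

# On the VM, drop gsutil's cached credentials so the new scopes take effect:
rm -rf ~/.gsutil
```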

Comments
  • Was the GCE VM created with full control or read/write GCS scope?
  • Thank you for pointing that out. I was not aware of that option indeed. I re-created the instance with the option enabled and it worked. If you could suggest to turn on the flag as an answer I would happily flag it.
  • I recreated my instance with the permissions and it all works now. Thanks
  • As of now, you can edit the scopes. Stop the machine, edit it, then change the Cloud API access scopes. I believe this has only been available for about a month now.
  • After adding necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again.
  • Thank you so much, mndrix!
  • @mndrix saves the day
  • Note this doesn’t work if you run gsutil from ssh.
  • It "doesn't work" in so far as not opening your browser. Instead, it provides a url for you to manually copy and paste.
  • This also works fine for everyone else if you drop the -b. That will not open a browser, but will simply spit out a url that can be opened outside of the shell.
  • Do not forget to issue gcloud auth revoke <email-account> after you are finished.
  • This is really a comment, not an answer. With a bit more rep, you will be able to post comments.
  • I have given the answer (+1) so that @TheLoneDeranger will be closer in reputation to the 'Post comment' privilege.