Set Google Storage Bucket's default cache control


Is there any way to set a bucket's default Cache-Control? I am trying to override the default of public, max-age=3600 at the bucket level, so that it applies every time a new object is created.

Something similar to defacl, but for setting Cache-Control.

It is possible to write a Google Cloud Storage trigger (a background Cloud Function).

The following function sets the Cache-Control metadata field for every new object in a bucket:

from google.cloud import storage

CACHE_CONTROL = "private"

def set_cache_control_private(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.
       This function changes Cache-Control meta data.

    Args:
        data (dict): The Cloud Functions event payload.
        context (google.cloud.functions.Context): Metadata of triggering event.
    Returns:
        None; the output is written to Stackdriver Logging
    """

    print('Setting Cache-Control to {} for: gs://{}/{}'.format(
            CACHE_CONTROL, data['bucket'], data['name']))
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.get_blob(data['name'])
    blob.cache_control = CACHE_CONTROL
    blob.patch()

You also need a requirements.txt file in the same directory so that the storage import resolves. It should list the google-cloud-storage package:

google-cloud-storage==1.10.0

You have to deploy the function to a specific bucket:

gcloud beta functions deploy set_cache_control_private \
    --runtime python37 \
    --trigger-resource gs://<your_bucket_name> \
    --trigger-event google.storage.object.finalize

For debugging purposes you can retrieve the function's logs with the gcloud command as well:

gcloud functions logs read --limit 50


If someone is still looking for an answer: you need to set the metadata while adding the blob (see the Python sketch after the gsutil command below). For those who want to update the metadata of all existing objects in the bucket, you can use setmeta from gsutil: https://cloud.google.com/storage/docs/gsutil/commands/setmeta

You just need to do the following:

gsutil setmeta -r -h "Cache-Control:public, max-age=12345" gs://bucket_name
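
To set the metadata while adding a new object from the Python client, a sketch along these lines should work (the bucket name, object path, and local file below are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-bucket-name")  # placeholder bucket name

# Setting cache_control before the upload sends the metadata together
# with the object itself, so no separate patch request is needed.
blob = bucket.blob("path/to/object.txt")  # placeholder object path
blob.cache_control = "public, max-age=12345"
blob.upload_from_filename("local-file.txt")  # placeholder local path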


There is no way to specify a default cache control. It must be set when creating the object.


I know that this is quite an old question and you're after a default action (which I'm not sure exists), but the below worked for me on a recent PHP project after much frustration:

$object = $bucket->upload($tempFile, [
    'predefinedAcl' => "PUBLICREAD",
    'name' => $destination,
    'metadata' => [
        // The value must contain only the directives, not the
        // "Cache-Control:" header name itself.
        'cacheControl' => 'private, max-age=0, no-transform',
    ]
]);

The same can be applied in Node (untested, though):

await storage.bucket(bucketName).upload(filename, {
  gzip: true,
  destination: destination,
  public: true,
  metadata: {
    // Again, only the directives belong here, not the header name.
    cacheControl: "private, max-age=0, no-transform"
  }
});
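
In either language you can verify what actually got stored with gsutil stat, which prints the object's metadata, including Cache-Control (the bucket and object names below are placeholders):

gsutil stat gs://bucket_name/path/to/object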


If you're running a Python App Engine app, you can use the default_expiration option in your app.yaml to set a global default value for the Cache-Control header of your app's static files: https://cloud.google.com/appengine/docs/standard/python/config/appref

For example:

runtime: python27   
api_version: 1   
threadsafe: yes

default_expiration: "30s"
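
If only certain paths need a different lifetime, individual static handlers in app.yaml can also override the global default with their own expiration element; the URL and directory below are just examples:

handlers:
- url: /images
  static_dir: static/images
  expiration: "1d"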


Comments
  • Make sure the IAM user has enough permissions, e.g. the "Storage Object Admin" role, or whatever level suits the occasion.
  • I wish Google would add this as a bucket option to set a global cache control for public items. The default of 1 hour is not always desired.