HTTP Triggering Cloud Function with Cloud Scheduler


I have a problem with a job in Cloud Scheduler for my Cloud Function. I created the job with the following parameters:

Target: HTTP

URL: my trigger url for cloud function

HTTP method: POST


 "expertsender": {
  "apiKey": "ExprtSender API key",
  "apiAddress": "ExpertSender APIv2 address",
  "date": "YYYY-MM-DD",
  "entities": [
        "entity": "Messages"
        "entity": "Activities",
 "bq": {
         "project_id": "YOUR GCP PROJECT",
         "dataset_id": "YOUR DATASET NAME",
         "location": "US"

The real values have been replaced in this body.

When I run this job I get an error, caused by processing the body of the POST request.

However, when I take this same body and use it as the triggering event in the Testing tab, I don't get any errors. So I think the problem is in how the body is represented for my job, but I have no idea how to fix it. I'd be very happy for any ideas.


Thank you @Dinesh for pointing towards the request headers as a solution! For all those who still wander and are lost, the code in Python 3.7.4:

import json

raw_request_data = request.data

# Luckily it's at least UTF-8 encoded...
string_request_data = raw_request_data.decode("utf-8")
request_json: dict = json.loads(string_request_data)

Totally agree, this is sub-par from a usability perspective. Having the testing utility pass JSON while Cloud Scheduler posts "application/octet-stream" is incredibly irresponsibly designed. You should, however, create a request handler if you want to invoke the function in a different way:

import json

def request_handler(request):
    # This works if the request comes in via e.g.
    # requests.post("cloud-function-url", json={"key": "value"})
    # or if the Cloud Functions test console was used
    request_json = request.get_json()
    if request_json:
        return request_json

    # That's the hard way, i.e. Google Cloud Scheduler sending its JSON payload as octet-stream
    if not request_json and request.headers.get("Content-Type") == "application/octet-stream":
        raw_request_data = request.data
        string_request_data = raw_request_data.decode("utf-8")
        request_json = json.loads(string_request_data)

    if request_json:
        return request_json

    # Error code is obviously up to you
    return "500"


Another way to solve the problem is this:

request_json = request.get_json(force=True)

It forces the parser to treat the payload as JSON, ignoring the mimetype. The reference to the Flask documentation is here.

I think this is a bit more concise than the other solutions proposed.


  • What is the error? What is the body content?
  • @DougStevenson The error occurred when I tried to get the body data inside the Cloud Function (I use Python). My function gets the body, but then I see in the logs: 'NoneType' object is not subscriptable. It means the function can't extract the parameters from the body properly because something is wrong with the body. However, when I trigger my function from the Testing interface of the Cloud Function with the same body, no errors occur. The body content I left in the question above.
  • You might want to edit your question to show your code and point out the line where the error occurs. You should also show what you expect the body content should be.
  • Can you also include the entire log message that contains " 'NoneType' object is not subscriptable"?
  • @SergeyKravchenko Are you able to update the question with more details?
  • Great work in hunting this down! Just a quick note for future visitors, while you can't set headers (yet) via Console, you CAN set headers if you create the Scheduler Job via gcloud (e.g. gcloud scheduler jobs create ... --headers Content-Type=application/json ...). The relevant docs are currently buried in a modal on this page.
  • @ChadKruse This is real good info. I have tried your solution and it works as well.…
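For future visitors, the error message discussed in these comments can be reproduced in isolation: when the POST body arrives as application/octet-stream, flask's request.get_json() yields None, and subscripting that None is exactly what the log shows.

```python
# What request.get_json() returns for a non-JSON Content-Type:
request_json = None

try:
    request_json["expertsender"]
except TypeError as exc:
    print(exc)  # 'NoneType' object is not subscriptable
```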