
Create Scheduled Query with Google Cloud API on New Scheduled Tasks from Pipedream API

Pipedream makes it easy to connect APIs for Google Cloud, Pipedream, and 2,400+ other apps remarkably fast.

Trigger workflow on: New Scheduled Tasks from the Pipedream API
Next, do this: Create Scheduled Query with the Google Cloud API

Getting Started

This integration creates a workflow with a Pipedream trigger and Google Cloud action. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Select this integration
  2. Configure the New Scheduled Tasks trigger
    1. Connect your Pipedream account
    2. Optional: Configure Secret
  3. Configure the Create Scheduled Query action
    1. Connect your Google Cloud account
    2. Select a Destination Dataset
    3. Optional: Select a Dataset Region
    4. Configure Display Name
    5. Configure Query
    6. Optional: Configure Schedule
    7. Optional: Select a Write Disposition
    8. Optional: Configure Destination Table Name Template
  4. Deploy the workflow
  5. Send a test event to validate your setup (see the example request after this list)
  6. Turn on the trigger
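
For step 5, you can send a test scheduling request to the trigger's HTTP endpoint from code. The sketch below uses Node 18's built-in fetch; the endpoint URL and secret are placeholders, so substitute the unique URL Pipedream generates for your deployed source and the Secret value you configured (if any).

// Hypothetical values: replace with your source's unique endpoint URL and configured secret
const endpoint = "https://YOUR-ENDPOINT.m.pipedream.net";

const response = await fetch(`${endpoint}/schedule`, {
  method: "POST",
  headers: {
    "content-type": "application/json",
    "x-pd-secret": "my-secret", // only needed if you configured the Secret prop
  },
  body: JSON.stringify({
    timestamp: new Date(Date.now() + 60_000).toISOString(), // ISO 8601 time at which to emit the event
    message: { hello: "world" }, // arbitrary payload emitted at that time
  }),
});

console.log(await response.json()); // e.g. { msg: "Successfully scheduled task", id: "<uuid>" }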

Details

This integration uses pre-built, source-available components from Pipedream's GitHub repo. These components are developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.

Trigger

Description: Exposes an HTTP API for scheduling messages to be emitted at a future time
Version: 0.3.1
Key: pipedream-new-scheduled-tasks

Pipedream Overview

Pipedream is an API that allows you to build applications that connect to
various data sources and process them in real time. You can use Pipedream to
create applications that perform ETL (Extract, Transform, and Load) tasks,
as well as data-driven workflows.

Some examples of applications you can build using the Pipedream API include:

  • An application that can extract data from a database, transform it, and then
    load it into another database.
  • An application that can monitor a data source for changes, and then trigger a
    workflow in response to those changes.
  • An application that can poll an API for new data, and then process that data
    in real-time.

Trigger Code

import pipedream from "../../pipedream.app.mjs";
import sampleEmit from "./test-event.mjs";
import { uuid } from "uuidv4";

export default {
  key: "pipedream-new-scheduled-tasks",
  name: "New Scheduled Tasks",
  type: "source",
  description:
    "Exposes an HTTP API for scheduling messages to be emitted at a future time",
  version: "0.3.1",
  dedupe: "unique", // Dedupe on a UUID generated for every scheduled task
  props: {
    pipedream,
    secret: {
      type: "string",
      secret: true,
      label: "Secret",
      optional: true,
      description:
        "**Optional but recommended**: if you enter a secret here, you must pass this value in the `x-pd-secret` HTTP header when making requests",
    },
    http: {
      label: "Endpoint",
      description: "The endpoint where you'll send task scheduler requests",
      type: "$.interface.http",
      customResponse: true,
    },
    db: "$.service.db",
  },
  methods: {
    // To schedule future emits, we emit to the selfChannel of the component
    selfChannel() {
      return "self";
    },
    // Queue for future emits that haven't yet been delivered
    queuedEventsChannel() {
      return "$in";
    },
    httpRespond({
      status, body,
    }) {
      this.http.respond({
        headers: {
          "content-type": "application/json",
        },
        status,
        body,
      });
    },
    async selfSubscribe() {
      // Subscribe the component to itself. We do this here because even in
      // the activate hook, the component isn't available to take subscriptions.
      // Scheduled tasks are sent to the self channel, which emits the message at
      // the specified delivery_ts to this component.
      const isSubscribedToSelf = this.db.get("isSubscribedToSelf");
      if (!isSubscribedToSelf) {
        const componentId = process.env.PD_COMPONENT;
        const selfChannel = this.selfChannel();
        console.log(`Subscribing to ${selfChannel} channel for event source`);
        console.log(
          await this.pipedream.subscribe(componentId, componentId, selfChannel),
        );
        this.db.set("isSubscribedToSelf", true);
      }
    },
    validateEventBody(event, operation) {
      const errors = [];

      // Secrets are optional, so we first check if the user configured
      // a secret, then check its value against the prop (validation below)
      if (this.secret && event.headers["x-pd-secret"] !== this.secret) {
        errors.push(
          "Secret on incoming request doesn't match the configured secret",
        );
      }

      if (operation === "schedule") {
        const {
          timestamp,
          message,
        } = event.body;
        // timestamp should be an ISO 8601 string. Parse and check for validity below.
        const epoch = Date.parse(timestamp);

        if (!timestamp) {
          errors.push(
            "No timestamp included in payload. Please provide an ISO8601 timestamp in the 'timestamp' field",
          );
        }
        if (timestamp && !epoch) {
          errors.push("Timestamp isn't a valid ISO 8601 string");
        }
        if (!message) {
          errors.push("No message passed in payload");
        }
      }

      return errors;
    },
    scheduleTask(event) {
      const errors = this.validateEventBody(event, "schedule");
      let status, body;

      if (errors.length) {
        console.log(errors);
        status = 400;
        body = {
          errors,
        };
      } else {
        const id = this.emitScheduleEvent(event.body, event.body.timestamp);
        status = 200;
        body = {
          msg: "Successfully scheduled task",
          id,
        };
      }

      this.httpRespond({
        status,
        body,
      });
    },
    emitScheduleEvent(event, timestamp) {
      const selfChannel = this.selfChannel();
      const epoch = Date.parse(timestamp);
      const $id = uuid();

      console.log(`Scheduled event to emit on: ${new Date(epoch)}`);

      this.$emit(
        {
          ...event,
          $channel: selfChannel,
          $id,
        },
        {
          name: selfChannel,
          id: $id,
          delivery_ts: epoch,
        },
      );

      return $id;
    },
    async cancelTask(event) {
      const errors = this.validateEventBody(event, "cancel");
      let status, msg;

      if (errors.length) {
        console.log(errors);
        status = 400;
        msg = "Secret on incoming request doesn't match the configured secret";
      } else {
        try {
          const id = event.body.id;
          const isCanceled = await this.deleteEvent(event);
          if (isCanceled) {
            status = 200;
            msg = `Cancelled scheduled task for event ${id}`;
          } else {
            status = 404;
            msg = `No event with ${id} found`;
          }
        } catch (error) {
          console.log(error);
          status = 500;
          msg = "Failed to schedule task. Please see the logs";
        }
      }

      this.httpRespond({
        status,
        body: {
          msg,
        },
      });
    },
    async deleteEvent(event) {
      const componentId = process.env.PD_COMPONENT;
      const inChannel = this.queuedEventsChannel();

      // The user must pass a scheduled event UUID they'd like to cancel
      // We lookup the event by ID and delete it
      const { id } = event.body;

      // List events in the $in channel - the queue of scheduled events, to be emitted in the future
      const events = await this.pipedream.listEvents(
        componentId,
        inChannel,
      );
      console.log("Events: ", events);

      // Find the event in the list by id
      const eventToCancel = events.data.find((e) => {
        const { metadata } = e;
        return metadata.id === id;
      });

      console.log("Event to cancel: ", eventToCancel);

      if (!eventToCancel) {
        console.log(`No event with ${id} found`);
        return false;
      }

      // Delete the event
      await this.pipedream.deleteEvent(
        componentId,
        eventToCancel.id,
        inChannel,
      );
      return true;
    },
    emitEvent(event, summary) {
      // Delete the channel name and id from the incoming event, which were used only as metadata
      const id = event.$id;
      delete event.$channel;
      delete event.$id;

      this.$emit(event, {
        summary: summary ?? JSON.stringify(event),
        id,
        ts: +new Date(),
      });
    },
  },
  async run(event) {
    await this.selfSubscribe();

    const { path } = event;
    if (path === "/schedule") {
      this.scheduleTask(event);
    } else if (path === "/cancel") {
      await this.cancelTask(event);
    } else if (event.$channel === this.selfChannel()) {
      this.emitEvent(event);
    }
  },
  sampleEmit,
};

Trigger Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label | Prop | Type | Description
Pipedream | pipedream | app | This component uses the Pipedream app.
Secret | secret | string | Optional but recommended: if you enter a secret here, you must pass this value in the x-pd-secret HTTP header when making requests.
N/A | http | $.interface.http | This component uses $.interface.http to generate a unique URL when the component is first instantiated. Each request to the URL will trigger the run() method of the component.
N/A | db | $.service.db | This component uses $.service.db to maintain state between executions.
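
To cancel a queued task, the same endpoint accepts a POST to /cancel with the id returned by a previous /schedule request (see cancelTask and deleteEvent in the trigger code). A minimal sketch, again with a placeholder URL and secret:

// Hypothetical values: replace with your source's unique endpoint URL and configured secret
const endpoint = "https://YOUR-ENDPOINT.m.pipedream.net";

const response = await fetch(`${endpoint}/cancel`, {
  method: "POST",
  headers: {
    "content-type": "application/json",
    "x-pd-secret": "my-secret", // must match the configured Secret, if one was set
  },
  body: JSON.stringify({
    id: "REPLACE-WITH-UUID-FROM-SCHEDULE-RESPONSE",
  }),
});

// 200 if the task was cancelled, 404 if no queued event with that id was found
console.log(response.status, await response.json());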

Trigger Authentication

Pipedream uses API keys for authentication. When you connect your Pipedream account, Pipedream securely stores the keys so you can easily authenticate to Pipedream APIs in both code and no-code steps.

About Pipedream

Integration platform for developers

Action

Description: Creates a scheduled query in Google Cloud. [See the documentation](https://cloud.google.com/bigquery/docs/scheduling-queries)
Version: 0.0.1
Key: google_cloud-create-scheduled-query

Google Cloud Overview

The Google Cloud API opens a world of possibilities for enhancing cloud operations and automating tasks. It empowers you to manage, scale, and fine-tune various services within the Google Cloud Platform (GCP) programmatically. With Pipedream, you can harness this power to create intricate workflows, trigger cloud functions based on events from other apps, manage resources, and analyze data, all in a serverless environment. The ability to interconnect GCP services with numerous other apps enriches automation, making it easier to synchronize data, streamline development workflows, and deploy applications efficiently.

Action Code

import { protos } from "@google-cloud/bigquery-data-transfer";
import googleCloud from "../../google_cloud.app.mjs";
import constants from "../../common/constants.mjs";
import regions from "../../common/regions.mjs";

const {
  CreateTransferConfigRequest,
  TransferConfig,
} = protos.google.cloud.bigquery.datatransfer.v1;

export default {
  key: "google_cloud-create-scheduled-query",
  name: "Create Scheduled Query",
  description: "Creates a scheduled query in Google Cloud. [See the documentation](https://cloud.google.com/bigquery/docs/scheduling-queries)",
  version: "0.0.1",
  type: "action",
  props: {
    googleCloud,
    destinationDatasetId: {
      label: "Destination Dataset",
      description: "The name of the dataset to create the table in. If the dataset does not exist, it will be created.",
      propDefinition: [
        googleCloud,
        "datasetId",
      ],
    },
    datasetRegion: {
      type: "string",
      label: "Dataset Region",
      description: "The geographic location where the dataset should reside. [See the documentation here](https://cloud.google.com/bigquery/docs/locations#specifying_your_location)",
      default: "us",
      options: regions,
      optional: true,
    },
    displayName: {
      type: "string",
      label: "Display Name",
      description: "The user-friendly display name for the transfer config.",
    },
    query: {
      type: "string",
      label: "Query",
      description: "The GoogleSQL query to execute. Eg. ``SELECT @run_time AS time, * FROM `my_dataset.my_table` LIMIT 1000``. [See the documentation here](https://cloud.google.com/bigquery/docs/scheduling-queries#query_string).",
    },
    schedule: {
      type: "string",
      label: "Schedule",
      description: "Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: `every 24 hours`, `1st,3rd monday of month 15:30`, `every wed,fri of jan,jun 13:15`, and `first sunday of quarter 00:00`. [See more explanation about the format here](https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format).",
      optional: true,
    },
    writeDisposition: {
      type: "string",
      label: "Write Disposition",
      description: "The write preference you select determines how your query results are written to an existing destination table. [See the documentation here](https://cloud.google.com/bigquery/docs/scheduling-queries#write_preference).",
      default: constants.WRITE_DISPOSITION.WRITE_TRUNCATE,
      options: Object.values(constants.WRITE_DISPOSITION),
      optional: true,
    },
    destinationTableNameTemplate: {
      type: "string",
      label: "Destination Table Name Template",
      description: "The destination table name template can contain template variables such as ``{run_date}`` or ``{run_time}``. [See the documentation here](https://cloud.google.com/bigquery/docs/scheduling-queries#templating-examples).",
      optional: true,
      default: "logs",
    },
  },
  methods: {
    getTransferConfig({
      query, writeDisposition, destinationTableNameTemplate, ...args
    } = {}) {
      return new TransferConfig({
        dataSourceId: constants.DATA_SOURCE_ID.SCHEDULED_QUERY,
        params: {
          fields: {
            query: {
              stringValue: query,
            },
            destination_table_name_template: {
              stringValue: destinationTableNameTemplate,
            },
            write_disposition: {
              stringValue: writeDisposition,
            },
          },
        },
        ...args,
      });
    },
    createTransferConfig(args = {}) {
      const {
        googleCloud,
        getTransferConfig,
      } = this;

      const {
        project_id: projectId,
        client_email: serviceAccountName,
      } = googleCloud.authKeyJson();

      const client = googleCloud.bigQueryDataTransferClient();
      const parent = client.projectPath(projectId);

      const request = new CreateTransferConfigRequest({
        parent,
        serviceAccountName,
        transferConfig: getTransferConfig(args),
      });

      return client.createTransferConfig(request);
    },
  },
  async run({ $ }) {
    const {
      createTransferConfig,
      ...props
    } = this;

    const [
      response,
    ] = await createTransferConfig(props);

    $.export("$summary", `Scheduled query created with name: \`${response.name}\``);

    return response;
  },
};

Action Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI.

Label | Prop | Type | Description
Google Cloud | googleCloud | app | This component uses the Google Cloud app.
Destination Dataset | destinationDatasetId | string | Select a value from the drop down menu.
Dataset Region | datasetRegion | string | Select a value from the drop down menu: us, europe, us-east1, us-east4, us-east5, us-east7, us-central1, us-central2, us-west1, us-west2, us-west3, us-west4, us-south1, asia-east1, asia-northeast1, asia-southeast1, asia-south1, asia-east2, asia-northeast2, asia-northeast3, asia-southeast2, asia-south2, europe-north1, europe-west1, europe-west2, europe-west3, europe-west4, europe-west6, europe-central2, europe-west9, europe-west8, europe-southwest1, europe-west10, europe-west12, northamerica-northeast1, northamerica-northeast2, southamerica-east1, southamerica-west1, australia-southeast1, australia-southeast2, me-central2, me-west1, me-central1, africa-south1, aws-us-east-1, aws-us-west-2, aws-ap-northeast-2, aws-eu-west-1, azure-eastus2
Display Name | displayName | string | The user-friendly display name for the transfer config.
Query | query | string | The GoogleSQL query to execute. Eg. SELECT @run_time AS time, * FROM `my_dataset.my_table` LIMIT 1000. See the documentation here.
Schedule | schedule | string | Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: every 24 hours, 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here.
Write Disposition | writeDisposition | string | Select a value from the drop down menu: WRITE_TRUNCATE, WRITE_APPEND
Destination Table Name Template | destinationTableNameTemplate | string | The destination table name template can contain template variables such as {run_date} or {run_time}. See the documentation here.
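
As a concrete illustration, the values below (the dataset, table, and display names are hypothetical) would schedule a daily query whose results overwrite a date-suffixed table; they correspond one-to-one to the props documented above.

// Hypothetical prop values for the Create Scheduled Query action
const exampleProps = {
  destinationDatasetId: "my_dataset",              // Destination Dataset
  datasetRegion: "us",                             // Dataset Region
  displayName: "Daily log rollup",                 // Display Name
  query: "SELECT @run_time AS time, * FROM `my_dataset.my_table` LIMIT 1000", // Query
  schedule: "every 24 hours",                      // Schedule (times are in UTC)
  writeDisposition: "WRITE_TRUNCATE",              // Write Disposition
  destinationTableNameTemplate: "logs_{run_date}", // Destination Table Name Template
};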

Action Authentication

Google Cloud uses API keys for authentication. When you connect your Google Cloud account, Pipedream securely stores the keys so you can easily authenticate to Google Cloud APIs in both code and no-code steps.

  1. Create a service account in GCP and set the permissions you need for Pipedream workflows.
  2. Generate a service account key.
  3. Download the key details in JSON format.
  4. Upload the key when connecting your Google Cloud account in Pipedream.
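
The downloaded key is a JSON document; the action reads project_id and client_email from it (see authKeyJson() in the action code above). A typical key file looks roughly like the following, with every value here being a placeholder:

{
  "type": "service_account",
  "project_id": "my-gcp-project",
  "private_key_id": "<key id>",
  "private_key": "-----BEGIN PRIVATE KEY-----\n<key material>\n-----END PRIVATE KEY-----\n",
  "client_email": "pipedream-sa@my-gcp-project.iam.gserviceaccount.com",
  "client_id": "<numeric client id>",
  "token_uri": "https://oauth2.googleapis.com/token"
}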

About Google Cloud

The Google Cloud Platform, including BigQuery

More Ways to Connect Google Cloud + Pipedream

  • Create a Subscription with Pipedream API on BigQuery - New Row from Google Cloud API
  • Create a Subscription with Pipedream API on BigQuery - Query Results from Google Cloud API
  • Create a Subscription with Pipedream API on New Pub/Sub Messages from Google Cloud API
  • Delete a Subscription with Pipedream API on BigQuery - New Row from Google Cloud API
  • Delete a Subscription with Pipedream API on BigQuery - Query Results from Google Cloud API
  • New Scheduled Tasks from the Pipedream API: Exposes an HTTP API for scheduling messages to be emitted at a future time.
  • New Upcoming Event Alert from the Pipedream API: Emit new event based on a time interval before an upcoming event in the calendar. This source uses Pipedream's Task Scheduler. See the documentation for more information and instructions for connecting your Pipedream account.
  • Card Due Date Reminder from the Pipedream API: Emit new event at a specified time before a card is due.
  • New Upcoming Calendar Event from the Pipedream API: Emit new event when a calendar event is upcoming; this source uses the reminderMinutesBeforeStart property of the event to determine when it should emit.
  • New Pub/Sub Messages from the Google Cloud API: Emit new events for messages published to a Pub/Sub topic in your GCP account.
  • Create a Subscription with the Pipedream API: Create a subscription. See the documentation.
  • Delete a Subscription with the Pipedream API: Delete a subscription. See the documentation.
  • Generate Component Code with the Pipedream API: Generate component code using AI.
  • Get Component with the Pipedream API: Get info for a published component. See the documentation.
  • Bigquery Insert Rows with the Google Cloud API: Inserts rows into a BigQuery table. See the docs for an example.

Explore Other Apps

1 - 24 of 2,400+ apps by most popular:

  • HTTP / Webhook: Get a unique URL where you can send HTTP or webhook requests
  • Node: Anything you can do with Node.js, you can do in a Pipedream workflow. This includes using most of npm's 400,000+ packages.
  • Python: Anything you can do in Python can be done in a Pipedream workflow. This includes using any of the 350,000+ PyPi packages available in your Python powered workflows.
  • OpenAI (ChatGPT): OpenAI is an AI research and deployment company with the mission to ensure that artificial general intelligence benefits all of humanity. They are the makers of popular models like ChatGPT, DALL-E, and Whisper.
  • Salesforce: Web services API for interacting with Salesforce
  • HubSpot: HubSpot's CRM platform contains the marketing, sales, service, operations, and website-building software you need to grow your business.
  • Zoho CRM: Zoho CRM is an online Sales CRM software that manages your sales, marketing, and support in one CRM platform.
  • Stripe: Stripe powers online and in-person payment processing and financial solutions for businesses of all sizes.
  • Shopify: Shopify is a complete commerce platform that lets anyone start, manage, and grow a business. You can use Shopify to build an online store, manage sales, market to customers, and accept payments in digital and physical locations.
  • WooCommerce: WooCommerce is the open-source ecommerce platform for WordPress.
  • Snowflake: A data warehouse built for the cloud
  • MongoDB: MongoDB is an open source NoSQL database management program.
  • Supabase: Supabase is an open source Firebase alternative.
  • MySQL: MySQL is an open-source relational database management system.
  • PostgreSQL: PostgreSQL is a free and open-source relational database management system emphasizing extensibility and SQL compliance.
  • AWS: Amazon Web Services (AWS) offers reliable, scalable, and inexpensive cloud computing services.
  • Twilio SendGrid: Send marketing and transactional email through the Twilio SendGrid platform with the Email API, proprietary mail transfer agent, and infrastructure for scalable delivery.
  • Amazon SES: Amazon SES is a cloud-based email service provider that can integrate into any application for high volume email automation
  • Klaviyo: Email Marketing and SMS Marketing Platform
  • Zendesk: Zendesk is award-winning customer service software trusted by 200K+ customers. Make customers happy via text, mobile, phone, email, live chat, social media.
  • Notion: Notion is a new tool that blends your everyday work apps into one. It's the all-in-one workspace for you and your team.
  • Slack: Slack is a channel-based messaging platform. With Slack, people can work together more effectively, connect all their software tools and services, and find the information they need to do their best work — all within a secure, enterprise-grade environment.
  • Microsoft Teams: Microsoft Teams has communities, events, chats, channels, meetings, storage, tasks, and calendars in one place.
  • Schedule: Trigger workflows on an interval or cron schedule.