
Bigquery Insert Rows with Google Cloud API

Pipedream makes it easy to connect APIs for Google Cloud and 1000+ other apps remarkably fast.

Trigger workflow on HTTP requests, schedules, and app events.
Next, do this: Bigquery Insert Rows with the Google Cloud API.

Trusted by 500,000+ developers from startups to Fortune 500 companies



Getting Started

Create a workflow to Bigquery Insert Rows with the Google Cloud API. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Configure the Bigquery Insert Rows action
    1. Connect your Google Cloud account
    2. Select a Dataset ID
    3. Select a Table Name
    4. Configure Rows
  2. Select a trigger to run your workflow on HTTP requests, schedules or app events
  3. Deploy the workflow
  4. Send a test event to validate your setup
  5. Turn on the trigger
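When configuring the Rows input in step 1.4, each entry is a JSON object with column names as keys. A minimal sketch of the expected shape (the `name` and `age` columns are made-up examples, not part of any real schema):

```javascript
// Each entry in the Rows input is a JSON string; keys are column names and
// values are the column values for that row. These columns are illustrative.
const rows = [
  '{"name": "John", "age": 20}',
  '{"name": "Jane", "age": 25}',
];

// The action parses each entry to an object before streaming it to BigQuery.
const parsed = rows.map((r) => JSON.parse(r));
```

The parsed objects must match the target table's schema, or BigQuery will reject the insert.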

Integrations

Bigquery Insert Rows with Google Cloud API on New Requests (Payload Only) from HTTP / Webhook API (HTTP / Webhook + Google Cloud)
Bigquery Insert Rows with Google Cloud API on New Message from Discord API (Discord + Google Cloud)
Bigquery Insert Rows with Google Cloud API on New Message In Channels from Slack API (Slack + Google Cloud)
Bigquery Insert Rows with Google Cloud API on New Message in Channel from Discord Bot API (Discord Bot + Google Cloud)
Bigquery Insert Rows with Google Cloud API on New Submission from Typeform API (Typeform + Google Cloud)

Details

This is a pre-built, source-available component from Pipedream's GitHub repo. The component is developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.

Bigquery Insert Rows on Google Cloud
Description:Inserts rows into a BigQuery table. [See the docs](https://github.com/googleapis/nodejs-bigquery) and [an example](https://github.com/googleapis/nodejs-bigquery/blob/main/samples/insertRowsAsStream.js).
Version:0.0.1
Key:google_cloud-bigquery-insert-rows

Code

import googleCloud from "../../google_cloud.app.mjs";
import utils from "../../common/utils.mjs";

export default {
  name: "Bigquery Insert Rows",
  version: "0.0.1",
  key: "google_cloud-bigquery-insert-rows",
  type: "action",
  description: "Inserts rows into a BigQuery table. [See the docs](https://github.com/googleapis/nodejs-bigquery) and for an example [here](https://github.com/googleapis/nodejs-bigquery/blob/main/samples/insertRowsAsStream.js).",
  props: {
    googleCloud,
    datasetId: {
      propDefinition: [
        googleCloud,
        "datasetId",
      ],
    },
    tableId: {
      propDefinition: [
        googleCloud,
        "tableId",
        ({ datasetId }) => ({
          datasetId,
        }),
      ],
    },
    rows: {
      type: "string[]",
      label: "Rows",
      description: "The rows to insert into the table. Each row is a JSON object with column names as keys and column values as values. E.g. `{\"name\": \"John\", \"age\": 20}`",
    },
  },
  async run({ $ }) {
    const {
      datasetId,
      tableId,
      rows,
    } = this;

    if (!Array.isArray(rows)) {
      throw new Error("Rows must be an array");
    }

    const rowsParsed = rows.map(utils.parse);

    try {
      const response = await this.googleCloud
        .getBigQueryClient()
        .dataset(datasetId)
        .table(tableId)
        .insert(rowsParsed);

      $.export("$summary", `Successfully inserted ${rows.length} rows into ${datasetId}.${tableId}`);

      return response;
    } catch (error) {
      console.log(JSON.stringify(error.errors, null, 2));
      throw error;
    }
  },
};
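The validation and parsing in `run()` can be sketched in isolation. `parse` below is a stand-in for `utils.parse` (an assumption based on its usage above: it accepts either JSON strings or already-parsed objects):

```javascript
// Stand-in for utils.parse: parse JSON strings, pass objects through.
const parse = (row) => (typeof row === "string" ? JSON.parse(row) : row);

// Mirrors the checks the action performs before calling table().insert():
// reject non-array input, then parse each row entry.
function prepareRows(rows) {
  if (!Array.isArray(rows)) {
    throw new Error("Rows must be an array");
  }
  return rows.map(parse);
}
```

Rows prepared this way are what the action passes to `table(tableId).insert(rowsParsed)`, which streams them into the BigQuery table.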

Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label        | Prop        | Type     | Description
Google Cloud | googleCloud | app      | This component uses the Google Cloud app.
Dataset ID   | datasetId   | string   | Select a value from the drop down menu.
Table Name   | tableId     | string   | Select a value from the drop down menu.
Rows         | rows        | string[] | The rows to insert into the table. Each row is a JSON object with column names as keys and column values as values. E.g. `{"name": "John", "age": 20}`

Authentication

Google Cloud uses service account keys for authentication. When you connect your Google Cloud account, Pipedream securely stores the key so you can easily authenticate to Google Cloud APIs in both code and no-code steps.

  1. Create a service account in GCP and set the permissions you need for Pipedream workflows.

  2. Generate a service account key

  3. Download the key details in JSON format

  4. Open the JSON in a text editor, and copy and paste its contents here.
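As a rough sketch, steps 1 through 3 can be done with the gcloud CLI. The service account name, project ID, and role below are placeholders to replace with your own:

```shell
# Placeholders: pipedream-sa, my-project, and the role shown are examples.
# Adjust for your project and the permissions your workflows need (step 1).
gcloud iam service-accounts create pipedream-sa \
  --project=my-project \
  --display-name="Pipedream workflows"

# Grant only the roles your workflows require, e.g. BigQuery data editing.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:pipedream-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

# Generate and download the key in JSON format (steps 2 and 3).
gcloud iam service-accounts keys create key.json \
  --iam-account=pipedream-sa@my-project.iam.gserviceaccount.com
```

The resulting key.json is what you open and paste in step 4.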

About Google Cloud

The Google Cloud Platform, including BigQuery

More Ways to Use Google Cloud

Triggers

New Pub/Sub Messages from the Google Cloud API
Emit new messages from a Pub/Sub topic in your GCP account. Messages published to that topic are emitted from the Pipedream source.

BigQuery - New Row from the Google Cloud API
Emit new events when a new row is added to a table.

BigQuery - Query Results from the Google Cloud API
Emit new events with the results of an arbitrary query.

Actions

Create Bucket with the Google Cloud API
Creates a bucket on Google Cloud Storage. See the docs.

Get Bucket Metadata with the Google Cloud API
Gets Google Cloud Storage bucket metadata. See the docs.

Get Object with the Google Cloud API
Downloads an object from a Google Cloud Storage bucket. See the docs.

List Buckets with the Google Cloud API
Lists Google Cloud Storage buckets. See the docs.

Logging - Write Log with the Google Cloud API
Writes log data to the Logging service. See the docs.