
New Records Returned by CloudWatch Logs Insights Query from AWS API

Pipedream makes it easy to connect APIs for AWS and other apps remarkably fast.

Trusted by 200,000+ developers from startups to Fortune 500 companies

Getting Started

Trigger a workflow on New Records Returned by CloudWatch Logs Insights Query with the AWS API. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Configure the New Records Returned by CloudWatch Logs Insights Query trigger
    1. Connect your AWS account
    2. Select an AWS Region
    3. Select one or more CloudWatch Log Groups
    4. Configure Logs Insights Query (see the example query after this list)
    5. Optional: Configure Emit query results as a single event
    6. Configure timer
  2. Add steps to connect to other APIs using code and no-code building blocks
  3. Deploy the workflow
  4. Send a test event to validate your setup
  5. Turn on the trigger
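
For example, a Logs Insights query like the one below (an illustrative query, not part of the component) returns the 20 most recent log messages from the selected log groups:

fields @timestamp, @message
| sort @timestamp desc
| limit 20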

Details

This is a pre-built, open source component from Pipedream's GitHub repo. The component is developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.

New Records Returned by CloudWatch Logs Insights Query on AWS
Description: Executes a CloudWatch Logs Insights query on a schedule, and emits the records as individual events (default) or in batch
Version: 0.0.3
Key: aws-new-records-returned-by-cloudwatch-logs-insights-query

Code

const aws = require("../../aws.app.js");

module.exports = {
  key: "aws-new-records-returned-by-cloudwatch-logs-insights-query",
  name: "New Records Returned by CloudWatch Logs Insights Query",
  description:
    "Executes a CloudWatch Logs Insights query on a schedule, and emits the records as invidual events (default) or in batch",
  version: "0.0.3",
  props: {
    aws,
    region: {
      propDefinition: [
        aws,
        "region",
      ],
    },
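    // $.service.db is a simple key-value store, used here to persist the end
    // time of the last query between runs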
    db: "$.service.db",
    logGroupNames: {
      label: "CloudWatch Log Groups",
      description: "The log groups you'd like to query",
      type: "string[]",
      async options({ prevContext }) {
        const prevToken = prevContext.nextToken;
        const {
          logGroups,
          nextToken,
        } = await this.aws.logsInsightsDescibeLogGroups(this.region, prevToken);
        const options = logGroups.map((group) => {
          return {
            label: group.logGroupName,
            value: group.logGroupName,
          };
        });
        return {
          options,
          context: {
            nextToken,
          },
        };
      },
    },
    queryString: {
      label: "Logs Insights Query",
      description:
        "The query you'd like to run. See [this AWS doc](https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_QuerySyntax.html) for help with query syntax",
      type: "string",
    },
    emitResultsInBatch: {
      type: "boolean",
      label: "Emit query results as a single event",
      description:
        "If `true`, all events are emitted as an array, within a single Pipedream event. If `false`, each row of results is emitted as its own event. Defaults to `true`",
      optional: true,
      default: true,
    },
    timer: {
      type: "$.interface.timer",
      default: {
        intervalSeconds: 5 * 60,
      },
    },
  },
  async run() {
    // CloudWatch Logs Insights StartQuery expects epoch times in seconds.
    // On the first run, default to querying the last hour of logs.
    const now = Math.floor(Date.now() / 1000);
    const startTime = this.db.get("startTime") || now - 60 * 60;

    const AWS = this.aws.sdk(this.region);
    const cloudwatchlogs = new AWS.CloudWatchLogs();

    // First, start the query
    const params = {
      queryString: this.queryString,
      startTime,
      endTime: now,
      logGroupNames: this.logGroupNames,
    };

    const { queryId } = await cloudwatchlogs.startQuery(params).promise();

    // Then poll for its status, emitting each record as its own event when completed
    async function sleep(ms) {
      return new Promise((r) => setTimeout(r, ms));
    }

    let result, res;
    do {
      await sleep(1000);
      res = await cloudwatchlogs.getQueryResults({
        queryId,
      }).promise();
      result = res.status;
    } while (result === "Running" || result === "Scheduled");

    if (result !== "Complete") {
      throw new Error(`Query ${queryId} failed with status ${result}`);
    }

    console.log(JSON.stringify(res, null, 2));
    const { results } = res;

    if (!results || !results.length) {
      console.log("No results, exiting");
      this.db.set("startTime", now);
      return;
    }

    if (this.emitResultsInBatch === true) {
      this.$emit(results, {
        summary: JSON.stringify(results),
      });
    } else {
      for (const item of results) {
        this.$emit(item, {
          summary: JSON.stringify(item),
        });
      }
    }

    // The next time this source runs, query for data starting now
    this.db.set("startTime", now);
  },
};
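
Each record returned by GetQueryResults is a list of field/value pairs, so with Emit query results as a single event disabled, an individual emitted event looks roughly like this sketch (illustrative values; the fields returned depend on your query):

[
  { "field": "@timestamp", "value": "2023-05-01 12:00:00.000" },
  { "field": "@message", "value": "example log line" },
  { "field": "@ptr", "value": "<opaque record pointer>" }
]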

Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label | Prop | Type | Description
AWS | aws | app | This component uses the AWS app.
AWS Region | region | string | Select a value from the drop down menu.
N/A | db | $.service.db | This component uses $.service.db to maintain state between component invocations.
CloudWatch Log Groups | logGroupNames | string[] | Select a value from the drop down menu.
Logs Insights Query | queryString | string | The query you'd like to run. See the AWS query syntax doc (https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_QuerySyntax.html) for help with query syntax.
Emit query results as a single event | emitResultsInBatch | boolean | If true, all events are emitted as an array, within a single Pipedream event. If false, each row of results is emitted as its own event. Defaults to true.
timer | timer | $.interface.timer |

Authentication

AWS uses API keys for authentication. When you connect your AWS account, Pipedream securely stores the keys so you can easily authenticate to AWS APIs in both code and no-code steps.

Follow the AWS Instructions for creating an IAM user with an associated access and secret key.

As a best practice, attach the minimum set of IAM permissions necessary to perform the specific task in Pipedream. If your workflow only needs to perform a single API call, you should create a user and associate an IAM group / policy with permission to do only that task. You can create as many linked AWS accounts in Pipedream as you'd like.
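
For this component, a minimal policy only needs the CloudWatch Logs actions the source actually calls. The policy below is an illustrative sketch (assumed to be sufficient for this source; scope the Resource down to your own log groups if you prefer):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:DescribeLogGroups",
        "logs:StartQuery",
        "logs:GetQueryResults"
      ],
      "Resource": "*"
    }
  ]
}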

Once done, enter your access and secret key in Pipedream.

About AWS

On-demand cloud computing platforms and APIs

About Pipedream

Stop writing boilerplate code, struggling with authentication and managing infrastructure. Start connecting APIs with code-level control when you need it — and no code when you don't.

Intro to Pipedream
Watch us build a workflow (4 min)
"The past few weeks, I truly feel like the clichéd 10x engineer."
@heyellieday
Powerful features that scale
Manage concurrency and execution rate

Queue up to 10,000 events per workflow and manage the concurrency and rate at which workflows are triggered.

Process large payloads up to 5 terabytes

Large file support enables you to trigger workflows with any data (e.g., large JSON files, images and videos) up to 5 terabytes.

Return custom responses to HTTP requests

Return any JSON-serializable response from an HTTP triggered workflow using $respond().
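
As a rough sketch, assuming an HTTP-triggered workflow and a Node.js code step, a custom response might look like this:

// Respond to the HTTP request that triggered this workflow
// (illustrative; requires the HTTP trigger to be configured for custom responses)
$respond({
  status: 200,
  headers: { "Content-Type": "application/json" },
  body: { message: "Request received" },
});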

Use most npm packages

To use any npm package, just require() it; there's no npm install or package.json required.

Maintain state between executions

Use $checkpoint to save state in one workflow invocation and read it the next time your workflow runs.
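
As a minimal sketch, assuming a Node.js code step, $checkpoint can be read and written like this:

// Count how many times this workflow has run, persisting the count across executions
if (typeof $checkpoint === "undefined") {
  $checkpoint = { runs: 0 };
}
$checkpoint.runs += 1;
console.log(`This workflow has run ${$checkpoint.runs} times`);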

Pass data between steps

Return data from any step to inspect it in a human-friendly way and reference the data in future steps via the steps object.
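
For example, assuming an earlier Node.js code step named get_user (a hypothetical step name), a later step could reference its returned data like this:

// In the step named "get_user": return data so later steps can use it
return { name: "Alice" };

// In a later step: reference the returned data via the steps object
console.log(steps.get_user.$return_value.name); // "Alice"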