Does Pipedream Support Integration with Observability Platforms to Trace All Request/Response Packages?

This topic was automatically generated from Slack. You can find the original thread here.

Howdy folks, I’m working for a client on building out a set of integrations from their ERP system to external cloud services. I’ve got some working demos up and running using pipedream driven by REST POSTs from their system. My handlers will be interacting with a variety of services, including sending populated data bodies back in the responses. Does pipedream integrate with any observability platforms where I can essentially follow full traces and see all the request / response packages from start to finish? For example, if I have a handler that purchases a shipment label from UPS and gets it printed on a label printer, there will be a variety of transactions involved in the happy path. I want to use something like hyperdx, honeycomb, datadog, or some other observability platform to see the trace of the whole session. Is that supported in any way?

Hi Chris, great question. This is a popular ask, especially for a native integration with Datadog.

You can still integrate with tracing services at the code level yourself, or simply choose to log only exceptions.

We offer special pub/sub channels to listen to all logs and/or all errors:

Here’s a short video that shows you how to integrate one of these channels into CloudWatch:

Yes, I realize that my handlers can integrate, but I want the original requests to be part of the observable data.

Got it, then you can create a custom source that wraps a standard HTTP webhook with your tracing:

can you show me a basic source component that exposes an HTTP webhook over `$.interface.http` and logs each request’s data to AWS CloudWatch?

Certainly! Below is an example of a Pipedream source component that exposes an HTTP webhook using `$.interface.http` and logs each request’s data to AWS CloudWatch. This component uses the AWS SDK to interact with CloudWatch Logs. Please ensure you have connected your AWS account to Pipedream and have the necessary permissions to write logs to CloudWatch.

```javascript
import AWS from "aws-sdk";

export default {
  name: "HTTP Webhook with CloudWatch Logging",
  version: "0.0.1",
  props: {
    aws: {
      type: "app",
      app: "aws",
    },
    http: {
      type: "$.interface.http",
      customResponse: true,
    },
  },
  async run(event) {
    const { accessKeyId, secretAccessKey } = this.aws.$auth;
    const cloudwatchlogs = new AWS.CloudWatchLogs({
      accessKeyId,
      secretAccessKey,
      region: "us-east-1", // Specify your AWS region
    });

    const logGroupName = "/aws/pipedream/http-requests"; // Your CloudWatch Log Group name
    const logStreamName = "http-requests-stream"; // Your CloudWatch Log Stream name

    // Prepare the log event from the incoming request
    const logEvents = [
      {
        timestamp: Date.now(), // Current time in milliseconds
        message: JSON.stringify(event.body), // Log the request body
      },
    ];

    // Write the request data to CloudWatch
    await cloudwatchlogs
      .putLogEvents({ logGroupName, logStreamName, logEvents })
      .promise();

    // Respond to the HTTP request
    this.http.respond({
      status: 200,
      body: { message: "Request logged to CloudWatch successfully" },
    });

    // Emit the event so workflows listening to this source receive it
    this.$emit(event, { summary: "New HTTP request" });
  },
};
```
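One detail worth noting if you buffer several requests per invocation: `putLogEvents` expects log events in chronological order. A small sketch of shaping a batch before the call, assuming a hypothetical `toLogEvents` helper and request shape:

```javascript
// Hypothetical helper: shape captured requests into CloudWatch log events.
// putLogEvents requires events sorted ascending by timestamp, so we sort here.
function toLogEvents(requests) {
  return requests
    .map((req) => ({
      timestamp: req.receivedAt ?? Date.now(),
      message: JSON.stringify({ path: req.path, body: req.body }),
    }))
    .sort((a, b) => a.timestamp - b.timestamp);
}

// Example: two captured requests, deliberately out of order
const events = toLogEvents([
  { receivedAt: 2000, path: "/label", body: { carrier: "UPS" } },
  { receivedAt: 1000, path: "/rate", body: { weight: 3 } },
]);
```

The sorted array can be passed directly as the `logEvents` parameter above.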

thanks, and can you show me how I can deploy this source to my Pipedream workspace?

To deploy a source to your Pipedream workspace, you can follow these steps:

1. Visit Pipedream Sources to find and deploy curated components directly from the UI.
2. If you have specific source code you want to deploy, you can do so by navigating to the appropriate URL pattern as described in the Pipedream documentation. The general approach to deploying code via the UI is outlined in the Pipedream Components API documentation.
3. If you’re deploying a custom component (such as a trigger or an action) that you’ve developed, ensure you’ve followed the guidelines for authoring components, including testing your component locally and adhering to Pipedream’s component structure and API.

For any further assistance, or if you have specific questions about your source code, feel free to join the Pipedream public Slack for community and support team insights.

Eh that answer isn’t the best, but that guide I linked to above shows how to deploy a custom source (a.k.a. trigger) to your account.

Hmmm, I really want to use opentelemetry as my trace collector. JavaScript | OpenTelemetry

Then I will have observability platform options other than cloudwatch.

Sure, that was just an example. You can use the OpenTelemetry npm packages instead and log your requests that way.

Can you show me a basic component that builds a custom HTTP handler and uses the OpenTelemetry libraries to collect trace data and send it to honeycomb.io?

To create a basic Pipedream component that acts as a custom HTTP handler and uses the OpenTelemetry libraries to collect trace data and send it to Honeycomb.io, you’ll need to follow these steps. Since Pipedream components are Node.js modules and OpenTelemetry has Node.js libraries, this example will be in Node.js. Please note that direct integration with OpenTelemetry libraries might require additional setup outside of Pipedream’s environment, but this example will give you a starting point. Honeycomb ingests OpenTelemetry data over OTLP, so this example uses the OTLP/HTTP trace exporter. First, ensure you have an account on Honeycomb.io and retrieve your API key.

```javascript
import { axios } from "@pipedream/platform";
// Import OpenTelemetry libraries
import { trace } from "@opentelemetry/api";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

export default defineComponent({
  async run({ steps, $ }) {
    // Initialize the OpenTelemetry tracer, exporting spans to Honeycomb over OTLP
    const exporter = new OTLPTraceExporter({
      url: "https://api.honeycomb.io/v1/traces",
      headers: {
        // Replace with your actual Honeycomb API key and dataset name
        "x-honeycomb-team": "YOUR_HONEYCOMB_API_KEY",
        "x-honeycomb-dataset": "YOUR_DATASET_NAME",
      },
    });
    const provider = new NodeTracerProvider();
    provider.addSpanProcessor(new SimpleSpanProcessor(exporter));
    provider.register();

    // Get a tracer and start a span
    const tracer = trace.getTracer("example-tracer");
    const span = tracer.startSpan("http-handler");
    try {
      // Your HTTP handler logic here
      // For example, making an HTTP request using axios
      const response = await axios($, {
        url: "https://example.com",
        method: "GET",
        returnFullResponse: true,
      });
      // You can add attributes to the span based on the request/response
      span.setAttribute("http.status_code", response.status);
      span.setAttribute("http.url", "https://example.com");
      // Return the response data
      return response.data;
    } catch (error) {
      // Set error attributes on the span
      span.setAttribute("error", true);
      span.setAttribute("error.message", error.message);
      throw error;
    } finally {
      // End the span and flush it to Honeycomb before the invocation exits
      span.end();
      await provider.shutdown();
    }
  },
});
```
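If the ERP system calling your webhook already participates in a trace, it may send a W3C `traceparent` header; continuing that trace ties the original request into the same session in Honeycomb. In OpenTelemetry you would normally let the W3C trace-context propagator handle this, but a minimal parser sketch (the `parseTraceparent` helper name is hypothetical) makes the mechanics visible:

```javascript
// Hypothetical helper: parse a W3C traceparent header of the form
// version-traceid-spanid-flags, e.g.
// "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01"
function parseTraceparent(header) {
  const m = /^([0-9a-f]{2})-([0-9a-f]{32})-([0-9a-f]{16})-([0-9a-f]{2})$/.exec(
    header ?? ""
  );
  if (!m) return null;
  const [, version, traceId, spanId, flags] = m;
  // All-zero trace or span IDs are invalid per the spec
  if (/^0+$/.test(traceId) || /^0+$/.test(spanId)) return null;
  return {
    version,
    traceId,
    spanId,
    sampled: (parseInt(flags, 16) & 1) === 1,
  };
}

const ctx = parseTraceparent(
  "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01"
);
```

The extracted `traceId` and `spanId` are what the propagator uses to parent your handler’s spans under the caller’s trace.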

This example looks like it would trace the requests that happen after the initial incoming request to pipedream. I want the requests and responses to my webhook handler in pipedream to also be sent to honeycomb.io.

To send requests and responses from your Pipedream webhook handler to Honeycomb.io, you can modify the webhook handler to include an HTTP request to the Honeycomb.io API. This involves capturing the request and response data within your Pipedream workflow and then forwarding this data to Honeycomb.io. Here’s how you might achieve this using a Pipedream HTTP source and a Node.js code step:

1. Create an HTTP source in Pipedream. This will serve as your webhook handler. You can find instructions on setting up an HTTP source in the Pipedream documentation: HTTP sources.
2. Capture the request and response data. The HTTP source automatically captures incoming requests. You can access this data in subsequent steps of your workflow.
3. Forward the data to Honeycomb.io. Add a Node.js code step after your HTTP source trigger to send the captured data to Honeycomb.io. You can use the axios library to make an HTTP request to the Honeycomb.io Events API.

Here’s an example Node.js code step that sends data to Honeycomb.io:

```javascript
import { axios } from "@pipedream/platform";

export default defineComponent({
  async run({ steps, $ }) {
    const honeycombApiKey = "YOUR_HONEYCOMB_API_KEY"; // Replace with your Honeycomb API key
    const datasetName = "YOUR_DATASET_NAME"; // Replace with your Honeycomb dataset name

    // Example payload - customize this based on the data you want to send
    const payload = {
      data: {
        request: steps.trigger.event.body, // Example: capturing the request body
        // Add more data here as needed
      },
    };

    // Send the data to Honeycomb's Events API
    await axios($, {
      method: "POST",
      url: `https://api.honeycomb.io/1/events/${datasetName}`,
      headers: {
        "X-Honeycomb-Team": honeycombApiKey,
        "Content-Type": "application/json",
      },
      data: payload,
    });
  },
});
```

Please replace 'YOUR_HONEYCOMB_API_KEY' and 'YOUR_DATASET_NAME' with your actual Honeycomb API key and dataset name. Customize the payload object to include the request and response data you want to send to Honeycomb.io.
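One practical note: Honeycomb queries work best over flat key/value fields, so rather than nesting the whole request object you may want to flatten it before sending. A small sketch, with a hypothetical `flattenEvent` helper:

```javascript
// Hypothetical helper: flatten a nested request/response object into
// dotted keys, which Honeycomb events query more naturally than deep nesting.
function flattenEvent(obj, prefix = "", out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      flattenEvent(value, path, out); // recurse into nested objects
    } else {
      out[path] = value; // leaf value: store under its dotted path
    }
  }
  return out;
}

const event = flattenEvent({
  request: { method: "POST", body: { carrier: "UPS" } },
  response: { status: 200 },
});
```

The flattened object can be sent as the event body in the code step above, giving you fields like `request.body.carrier` to filter and group by in Honeycomb.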