Each event that triggers your workflow is exposed in the `steps.trigger.event` object.
The shape of the event is specific to the source. For example, RSS sources produce events with `url` and `title` properties representing the data provided by new items from a feed, while Google Calendar sources produce events with a meeting title, start date, and so on.
You can configure workflows to use a custom domain, e.g. `https://endpoint.yourdomain.com`, instead of the default `*.m.pipedream.net` domain. Each request sent to your endpoint is parsed and exposed as the `event` object, accessible in any code or action step.
You can send requests to your endpoint using any standard HTTP method: `GET`, `POST`, `HEAD`, and more.
We default to generating HTTPS URLs in the UI, but will accept HTTP requests against the same endpoint URL.
You can send data to any path on this host, with any query string parameters. You can access the full URL in the `event` object if you’d like to write code that interprets requests with different URLs differently.
You can send data of any Media Type in the body of your request.
The primary limit we impose is on the size of the request body: we’ll issue a `413 Payload Too Large` status when the body exceeds our specified limit.
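For example, here’s a minimal sketch of a code step that interprets requests differently based on the request path and query string. The `path`, `query`, and `method` properties follow the HTTP trigger event shape described later in this doc; the `/hello` route and `name` parameter are just illustrative:

```javascript
export default defineComponent({
  async run({ steps }) {
    // The full request details are available on the trigger event
    const { path, query, method } = steps.trigger.event;

    // Branch on the path the client requested
    if (method === "GET" && path === "/hello") {
      return `Hello, ${query.name ?? "world"}!`;
    }
    return { path, query };
  },
});
```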
To authorize inbound requests, you can require that clients pass a token as a Bearer token in the `Authorization` header. For example, to validate JSON Web Tokens (JWTs), import the `jsonwebtoken` package at the start of your workflow.
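As a rough sketch, a code step could verify a JWT passed as a Bearer token like this. The `JWT_SECRET` environment variable name and the error handling are illustrative, not part of any Pipedream-provided interface:

```javascript
import jwt from "jsonwebtoken";

export default defineComponent({
  async run({ steps, $ }) {
    // Pull the Bearer token out of the (lowercased) Authorization header
    const auth = steps.trigger.event.headers.authorization ?? "";
    const token = auth.replace(/^Bearer\s+/i, "");

    try {
      // JWT_SECRET is an example environment variable holding your shared secret
      return jwt.verify(token, process.env.JWT_SECRET);
    } catch (err) {
      // End the workflow early if verification fails
      return $.flow.exit(`Invalid token: ${err.message}`);
    }
  },
});
```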
To use a custom domain, e.g. `endpoint.yourdomain.com`, instead of the default `*.m.pipedream.net` domain, see the custom domains docs.
The data from the HTTP request is exposed to your workflow as part of the `steps` object.
In the Inspector, we present `steps.trigger.event` cleanly, indenting nested properties, to make the payload easy to read. Since `steps.trigger.event` is a JavaScript object, it’s easy to reference and manipulate properties of the payload using dot-notation.
multipart/form-data
When an HTTP request is sent with a `Content-Type` of `multipart/form-data`, Pipedream parses the payload and converts it to a JavaScript object, `event.body`, with a property per form field.
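For example, a request sent with two hypothetical form fields, `name` and `title`, would produce an `event.body` with the following shape (the values are illustrative):

```json
{
  "name": "Leia",
  "title": "General"
}
```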
HTTP request headers are available at `steps.trigger.event.headers` in the `steps` export in your downstream steps. Pipedream will automatically lowercase header keys for consistency.
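For instance, a code step referencing a header should use the lowercased key. A minimal sketch:

```javascript
export default defineComponent({
  async run({ steps }) {
    // Header keys are lowercased, e.g. "Content-Type" becomes "content-type"
    return steps.trigger.event.headers["content-type"];
  },
});
```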
Set the `x-pd-nostore` header to `1` to prevent logging any data for this execution. Pipedream will execute all steps of the workflow, but no data will be logged to Pipedream. No event will show up in the inspector or the Event History UI.
If you need to disable logging for all requests, use the workflow’s Data Retention controls, instead.
Set the `x-pd-notrigger` header to `1` to send an event to the workflow for testing. Pipedream will not trigger the production version of the workflow, but will display the event in the list of test events on the HTTP trigger.
To send a larger payload, add the `pipedream_upload_body=1` query string or an `x-pd-upload-body: 1` HTTP header to your request. Pipedream saves the raw payload in a file and exposes its URL at `steps.trigger.event.body.raw_body_url`.
Within your workflow, you can download the contents of this payload to the `/tmp` directory. If you use the pre-built action to retrieve the large payload, the downloaded contents are exported at `steps.retrieve_large_payload.$return_value`. You can also download the payload to the `/tmp` directory in a code step.
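For example, here’s a minimal sketch that streams the raw payload to a file in `/tmp`. The file name and the use of axios are illustrative; any HTTP client works:

```javascript
import { createWriteStream } from "fs";
import { pipeline } from "stream/promises";
import axios from "axios";

export default defineComponent({
  async run({ steps }) {
    const path = "/tmp/raw_body";
    // Stream the uploaded payload from its private URL to local disk
    const res = await axios.get(steps.trigger.event.body.raw_body_url, {
      responseType: "stream",
    });
    await pipeline(res.data, createWriteStream(path));
    return path;
  },
});
```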
To send a large file, make a `multipart/form-data` HTTP request with the file as one of the form parts. Pipedream saves that file to a Pipedream-owned Amazon S3 bucket, generating a signed URL that allows you to access that file for up to 30 minutes. After 30 minutes, the signed URL will be invalidated, and the file will be deleted.
In workflows, these file URLs are provided in the `steps.trigger.event.body` variable, so you can download the file using the URL within your workflow, or pass the URL on to another third-party system for it to process. For example, you can download the file to the `/tmp` directory.
Example: upload a file with cURL
To upload an image to your endpoint with cURL, you can run a command like `curl -F 'image=@my_image.png' https://myendpoint.m.pipedream.net` (the endpoint URL and file name here are placeholders). The `-F` flag tells cURL we’re sending form data, with a single “part”: a field named `image`, with the content of the image as the value (the `@` allows cURL to reference a local file).
When you send this image to a workflow, Pipedream parses the form data and converts it to a JavaScript object, `event.body`. Select the event from the inspector, and you’ll see the `image` property under `event.body`.
If you expand the `image` property of `event.body`, you’ll see the signed URL in the `url` property, along with the `filename` and `mimetype` of the file. Within your workflow, you can download the file, pass the URL to a third-party system to handle, and more.
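Expanded, the `image` part looks roughly like this (the values are placeholders; the `url` is a signed S3 URL that expires after 30 minutes):

```json
{
  "image": {
    "filename": "my_image.png",
    "mimetype": "image/png",
    "url": "https://<signed S3 URL, valid for 30 minutes>"
  }
}
```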
Example: download the file to the `/tmp` directory
This example downloads the file passed in the `image` field of the form request, saving it to the `/tmp` directory.
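Here’s a minimal sketch of such a code step. It assumes the form field was named `image`, as in the cURL example above:

```javascript
import { writeFile } from "fs/promises";

export default defineComponent({
  async run({ steps }) {
    const { filename, url } = steps.trigger.event.body.image;
    const path = `/tmp/${filename}`;
    // Fetch the file from its signed URL and write it to /tmp
    const res = await fetch(url);
    await writeFile(path, Buffer.from(await res.arrayBuffer()));
    return path;
  },
});
```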
Since these requests are sent with a `Content-Type` of `multipart/form-data`, the limits that apply to form data also apply here.
The content of the file itself does not contribute to the HTTP payload limit imposed for forms. You can upload files up to 5TB in size. However, files that large may trigger other Pipedream limits. Please reach out with any specific questions or issues.
Pipedream responds to HTTP `OPTIONS` requests with a `200 OK` status code and headers that permit cross-origin requests.
To issue a custom HTTP response, you can use the `$.respond()` function in a Code or Action step. The pre-built HTTP response action uses `$.respond()` and will always respond immediately when called in your workflow. A response error will still occur if your workflow throws an Error before this action runs.
Using `$.respond()`
You can also customize the HTTP response from a code step by calling the `$.respond()` function.
`$.respond()` takes a single argument: an object with properties that specify the body, headers, and HTTP status code you’d like to respond with:
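For example (the header and body values here are just placeholders):

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    await $.respond({
      status: 200,
      headers: { "x-my-custom-header": "value" },
      body: { message: "My custom response" },
    });
  },
});
```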
The `body` property can be either a string, object, Buffer (binary data), or a Readable stream. Attempting to return any other data may yield an error.
In the case where you return a Readable stream, you must `await` the `$.respond` function (`await $.respond({ ... })`), and you cannot use the `immediate: true` property of `$.respond`.
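A sketch of a streaming response, assuming a file already exists in `/tmp` (the file name and content type are illustrative):

```javascript
import { createReadStream } from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    // When body is a Readable stream, $.respond() must be awaited
    await $.respond({
      status: 200,
      headers: { "Content-Type": "text/csv" },
      body: createReadStream("/tmp/report.csv"),
    });
  },
});
```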
Timing of `$.respond()` execution
You may notice a delay before you receive the response issued by `$.respond()` in your HTTP client. By default, `$.respond()` is called at the end of your workflow, after all other code is done executing, so it may take some time to issue the response back.
If you need to issue an HTTP response in the middle of a workflow, see the section on returning a response immediately.
Returning a response immediately
To issue the HTTP response as soon as `$.respond()` runs, set its `immediate` property to `true`.
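For example, a minimal sketch:

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    // Respond to the client right away...
    await $.respond({
      immediate: true,
      status: 200,
      body: "Received",
    });
    // ...then continue doing work after the response has been issued
    console.log("This runs after the HTTP response is sent");
  },
});
```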
Passing `immediate: true` tells `$.respond()` to issue a response back to the client at this point in the workflow. After the HTTP response has been issued, the remaining code in your workflow runs.
This can be helpful, for example, when you’re building a Slack bot. When you send a message to a bot, Slack requires a `200 OK` response be issued immediately to confirm receipt. Using `immediate: true`, you can acknowledge the request right away and run the rest of your code after the HTTP response is issued.
If you use `$.respond()` in a workflow, you must always make sure `$.respond()` is called in your code. If you make an HTTP request to a workflow, and run code where `$.respond()` is not called, your endpoint URL will issue a `400 Bad Request` error.
This can happen if you call `$.respond()` conditionally, where it does not run under certain conditions, or if you pass `$.respond()` a `body` property that isn’t a string, object, or Buffer. If you can’t handle the `400 Bad Request` error in the application calling your workflow, you can implement `try` / `finally` logic to ensure `$.respond()` always gets called with some default message. For example:
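A minimal sketch of that pattern; the default message and echoed body are arbitrary:

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    let body = { message: "Default response" };
    try {
      // Your own logic here; it may throw or skip setting a custom body
      body = { echo: steps.trigger.event.body };
    } finally {
      // Always issue some response, even if the code above throws
      await $.respond({ status: 200, body });
    }
  },
});
```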
Pipedream issues a `413 Payload Too Large` status code when the body of your request exceeds the payload size limit. In this case, the request will still appear in the inspector, with information on the error.
Your endpoint is identified by the host portion of its URL, e.g. `eniqtww30717` in `eniqtww30717.m.pipedream.net`. If you attempt to send a request to an endpoint that does not exist, we’ll return a `404 Not Found` error. We’ll also issue a 404 response for workflows with an HTTP trigger that have been disabled.
If you exceed the rate limit on incoming requests, Pipedream returns a `429 Too Many Requests` response. Review our limits to understand the conditions where you might be throttled. You can also reach out to inquire about raising this rate limit. If you control the application sending requests, you should implement a backoff strategy to temporarily slow the rate of events.
For example, the cron expression `0 0 * * *` will run the job every day at midnight. Cron expressions can be tied to any timezone.
If your code includes `console.log()` or other functions that print output, you should see the logs appear directly below the step where the code ran.
Actions and Destinations also show execution details relevant to the specific Action or Destination. For example, when you use the HTTP Destination to make an HTTP request, you’ll see the HTTP request and response details tied to that Destination step.
Pipedream parses inbound emails into the `steps.trigger.event` variable that you can access within your workflow. This transformation can take a few seconds to perform. Once done, Pipedream will immediately trigger your workflow with the transformed payload.
Read more about the shape of the email trigger event.
You can send emails up to {EMAIL_PAYLOAD_SIZE_LIMIT} in size by sending them to `[YOUR EMAIL ENDPOINT]@upload.pipedream.net`. If your workflow-specific email address is `endpoint@pipedream.net`, your “large email address” is `endpoint@upload.pipedream.net`.
Emails delivered to this address are uploaded to a private URL you have access to within your workflow, at the variable `steps.trigger.event.mail.content_url`. You can download and parse the email within your workflow using that URL. This content contains the raw email. Unlike the standard email interface, you must parse this email on your own; see the examples below.
One approach is to download and parse the email using the `mailparser` library.
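A minimal sketch of that approach, assuming axios for the download (any HTTP client works):

```javascript
import axios from "axios";
import { simpleParser } from "mailparser";

export default defineComponent({
  async run({ steps }) {
    // Fetch the raw email from the private URL Pipedream provides
    const { data } = await axios.get(steps.trigger.event.mail.content_url, {
      responseType: "text",
    });
    // Parse the raw message into structured fields
    const parsed = await simpleParser(data);
    return { subject: parsed.subject, text: parsed.text };
  },
});
```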
Download the email to the `/tmp` directory, then read and parse it
Alternatively, you can download the email to the `/tmp` directory, then read the same file back and parse it using the `mailparser` library.
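A sketch of that approach; the file name in `/tmp` is arbitrary:

```javascript
import { createReadStream, createWriteStream } from "fs";
import { pipeline } from "stream/promises";
import axios from "axios";
import { simpleParser } from "mailparser";

export default defineComponent({
  async run({ steps }) {
    const path = "/tmp/raw_email.eml";
    // Download the raw email to /tmp
    const res = await axios.get(steps.trigger.event.mail.content_url, {
      responseType: "stream",
    });
    await pipeline(res.data, createWriteStream(path));
    // Read the file back and parse it with mailparser
    return await simpleParser(createReadStream(path));
  },
});
```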
Email attachments are exposed in `steps.trigger.event.attachments`, which provides an array of attachment objects. Each attachment in that array exposes key properties:
- `contentUrl`: a URL that hosts your attachment. You can download this file to the `/tmp` directory and process it in your workflow.
- `content`: if the attachment contains text-based content, Pipedream renders the attachment in `content`, up to 10,000 bytes.
- `contentTruncated`: `true` if the attachment contained text-based content larger than 10,000 bytes. If `true`, the data in `content` will be truncated, and you should fetch the full attachment from `contentUrl`.
Appending metadata to the incoming email address with `+data`
You can append metadata to incoming emails by adding a `+` sign to the incoming email key, followed by any arbitrary string, e.g. `myemailaddr+anything@pipedream.net`. Any email sent to an address with this format resolves to the same workflow, no matter the data you add after the `+` sign. Sending an email to either `myemailaddr@pipedream.net` or `myemailaddr+anything@pipedream.net` triggers the workflow with the address `myemailaddr@pipedream.net`.
Expired Token error when trying to read an email attachment
If your workflow uses `$.flow.delay` or `$.flow.rerun`, which introduces a gap of time between steps in your workflow, there’s a chance the email attachment’s URL will expire. To overcome this, we suggest uploading your email attachments to your Project’s File Store for persistent storage.
The data for each event that triggers your workflow is exposed in the `steps.trigger.event` variable. You can reference this variable in any step of your workflow.
You can reference your event data in any code or action step. See those docs or the general docs on passing data between steps for more information.
The specific shape of `steps.trigger.event` depends on the trigger type:
For HTTP triggers:

| Property | Description |
| --- | --- |
| `body` | A string or object representation of the HTTP payload |
| `client_ip` | IP address of the client that made the request |
| `headers` | HTTP headers, represented as an object |
| `method` | HTTP method |
| `path` | HTTP request path |
| `query` | Query string |
| `url` | Request host + path |
For Schedule triggers:

| Property | Description |
| --- | --- |
| `interval_seconds` | The number of seconds between scheduled executions |
| `cron` | When you’ve configured a custom cron schedule, the cron string |
| `timestamp` | The epoch timestamp when the workflow ran |
| `timezone_configured` | An object with formatted datetime data for the given execution, tied to the schedule’s timezone |
| `timezone_utc` | An object with formatted datetime data for the given execution, tied to the UTC timezone |
steps.trigger.context
`steps.trigger.event` contains your event’s data, while `steps.trigger.context` contains metadata about the workflow and the execution tied to this event. You can use the data in `steps.trigger.context` to uniquely identify the Pipedream event ID, the timestamp at which the event invoked the workflow, and more:
| Property | Description |
| --- | --- |
| `deployment_id` | A globally-unique string representing the current version of the workflow |
| `emitter_id` | The ID of the workflow trigger that emitted this event, e.g. the event source ID |
| `id` | A unique, Pipedream-provided identifier for the event that triggered this workflow |
| `owner_id` | The Pipedream-assigned workspace ID that owns the workflow |
| `platform_version` | The version of the Pipedream execution environment this event ran on |
| `replay` | A boolean: whether the event was replayed via the UI |
| `trace_id` | Holds the same value for all executions tied to an original event. See below for more details |
| `ts` | The ISO 8601 timestamp at which the event invoked the workflow |
| `workflow_id` | The workflow ID |
| `workflow_name` | The workflow name |
`steps.trigger.context.id` should be unique for every execution of a workflow. `steps.trigger.context.trace_id` will hold the same value for all executions tied to the same original event. For example, if you have auto-retry enabled and it retries a workflow three times, the `id` will change on each retry, but the `trace_id` will remain the same. Similarly, if you call `$.flow.suspend()` on a workflow, we run a new execution after the suspend, so you’d see two total executions: the `id` will be unique before and after the suspend, but the `trace_id` will be the same.
You may notice other properties in `context`. These are used internally by Pipedream, and are subject to change.