Does Pipedream Provide a Facility for Temporarily Hosting a File at a Public URL for Upload to Airtable?

This topic was automatically generated from Slack. You can find the original thread here.

does pipedream have a facility for temporarily hosting a file at a public URL? I want to upload a file to Airtable, but that requires the file to be hosted at a publicly accessible URL. Right now, I’m receiving the PDF back from Browserless in the body of an API response.

Pipedream does not have a built-in facility for temporarily hosting files at a public URL. However, you can use a third-party service like Amazon S3, Google Cloud Storage, or any other file hosting service to temporarily store your file and generate a public URL for it. Since you’re working with Browserless, you can use the following example component to generate a screenshot, which you can then store in a third-party service:

```
import puppeteer from "puppeteer-core@14.1.0";

export default defineComponent({
  props: {
    browserless: {
      type: "app",
      app: "browserless",
    },
  },
  async run({ steps, $ }) {
    const browser = await puppeteer.connect({
      browserWSEndpoint: `wss://chrome.browserless.io?token=${this.browserless.$auth.api_key}`,
    });
    const page = await browser.newPage();
    const url = "https://example.com";
    const type = "png";
    await page.goto(url);
    const screenshot = await page.screenshot({ type });
    // Save the screenshot to a third-party service and generate a public URL,
    // then use the public URL to upload the file to Airtable
    await browser.close();
    return screenshot;
  },
});
```
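For the “store it in S3 and generate a public URL” part, here’s a minimal sketch using the AWS SDK v3 and a presigned GET URL. Airtable downloads attachment URLs when it ingests the record, so a short expiry is enough. The bucket name, the `steps.browserless` reference, and the `this.aws.$auth` field names are assumptions here; adjust them to your own setup:

```
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

export default defineComponent({
  props: {
    aws: { type: "app", app: "aws" },
  },
  async run({ steps, $ }) {
    const s3 = new S3Client({
      region: "us-east-1",
      credentials: {
        accessKeyId: this.aws.$auth.accessKeyId,
        secretAccessKey: this.aws.$auth.secretAccessKey,
      },
    });
    // Hypothetical bucket and key; the PDF bytes come from an earlier step
    const Bucket = "my-temp-bucket";
    const Key = `tmp/${Date.now()}.pdf`;
    await s3.send(new PutObjectCommand({
      Bucket,
      Key,
      Body: Buffer.from(steps.browserless.$return_value, "base64"),
      ContentType: "application/pdf",
    }));
    // Return a presigned GET URL that expires in 15 minutes
    return await getSignedUrl(s3, new GetObjectCommand({ Bucket, Key }), {
      expiresIn: 900,
    });
  },
});
```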

you can also add an HTTP trigger to your workflow and issue an HTTP response with the data
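To illustrate, here’s a minimal sketch of that response step using `$.respond()`, assuming a hypothetical earlier step `steps.get_file` that returns the PDF as a base64 string. Note the HTTP trigger also needs to be configured to return a custom response from the workflow:

```
export default defineComponent({
  async run({ steps, $ }) {
    // steps.get_file is hypothetical: an earlier step that fetched the base64-encoded PDF
    await $.respond({
      status: 200,
      headers: { "Content-Type": "application/pdf" },
      body: Buffer.from(steps.get_file.$return_value, "base64"),
    });
  },
});
```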

ahh, adding an additional trigger to the same workflow? The original workflow is triggered via email – will state be shared with the HTTP invocation?

not between executions, so you would need to persist the file somewhere. But since the workflow can have multiple triggers, you can do something like this:

  1. Email → write file
  2. HTTP → get file → issue HTTP response

Technically you can write the file to `/tmp`, but that’s local to the container and the data may be cleared across executions.

I was thinking I could stuff the file in the data store temporarily

yeah if it’s JSON-serializable data, that’s completely appropriate

if it’s a file, I’d recommend base64 encoding it
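Putting the two-workflow pattern above together with the data store and base64 encoding, here’s a hedged sketch of both halves. The key name, the `steps.download_pdf` reference, and the prop names are assumptions, and data store records have size limits, so this works best for reasonably small files. First, a step in the email-triggered workflow:

```
export default defineComponent({
  props: {
    data: { type: "data_store" },
  },
  async run({ steps, $ }) {
    // steps.download_pdf is hypothetical: the step that received the PDF bytes
    const pdfBase64 = Buffer.from(steps.download_pdf.$return_value).toString("base64");
    await this.data.set("pending-pdf", pdfBase64);
  },
});
```

Then a step in the HTTP-triggered workflow reads it back and serves it:

```
export default defineComponent({
  props: {
    data: { type: "data_store" },
  },
  async run({ steps, $ }) {
    const pdfBase64 = await this.data.get("pending-pdf");
    await $.respond({
      status: 200,
      headers: { "Content-Type": "application/pdf" },
      body: Buffer.from(pdfBase64, "base64"),
    });
  },
});
```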

yup. What’s the most ergonomic pattern for handling multiple triggers in the same workflow like that?

In the past I remember trying to use a context variable to switch between logical branches based on which trigger invoked the workflow; it felt messy

Might be cleaner to have separate workflows in one project?

yeah I agree, it may be cleaner to separate them. Technically we expose the trigger in `steps.trigger.context.emitter_id`, but as you mentioned, it’s an opaque ID, so it’s a little awkward to drive the conditional logic from it

Oh – is that a unique ID for the trigger?

Like if the workflow has three triggers, `emitter_id` will always be one of three values?

yes exactly
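In that case, branching inside a single multi-trigger workflow could look like this sketch; the two ID constants are hypothetical placeholders you’d replace with the real `emitter_id` values from your deployed triggers’ events:

```
export default defineComponent({
  async run({ steps, $ }) {
    // Placeholder IDs: copy the real emitter_id values from test events
    const EMAIL_TRIGGER_ID = "dc_abc123";
    const HTTP_TRIGGER_ID = "dc_def456";

    switch (steps.trigger.context.emitter_id) {
      case EMAIL_TRIGGER_ID:
        // handle the email path, e.g. write the file to the data store
        break;
      case HTTP_TRIGGER_ID:
        // handle the HTTP path, e.g. read the file and $.respond()
        break;
      default:
        throw new Error(`Unknown trigger: ${steps.trigger.context.emitter_id}`);
    }
  },
});
```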

Oh sweet; I think I am using some other heuristic in a past workflow; I’ll be able to clean that up now

yes, we added that recently; it was definitely messier before! You had to duck-type from the event shape and other weird magic :laughing: