Upload email attachments to S3
@dylan
data: private · last updated: 5 years ago
steps.trigger
Email
Deploy to generate a unique email address
This workflow runs on Pipedream's servers and is triggered when an email is received.
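Pipedream passes the parsed email to the workflow as event. A minimal sketch of the assumed payload shape, inferred only from the fields the steps below read (subject, plus fileName and content_b64 on each attachment); the real trigger event carries more fields:

// Assumed shape of the email trigger event (illustrative values only)
const event = {
  subject: "Quarterly report",
  attachments: [
    {
      fileName: "report.pdf",      // original attachment filename
      content_b64: "JVBERi0xLjQK", // attachment bytes, base64-encoded
    },
  ],
}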
steps.CONSTANTS
code
Write any Node.js code and use any npm package. You can also export data for use in later steps via return or this.key = 'value', pass input data to your code via params, and maintain state across executions with $checkpoint.
async (event, steps) => {
  this.TMP_DIR = "/tmp"
}
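Exports made with this.key = value become available to later steps under the step's name. For example, a downstream code step (step name hypothetical) reads the constant like this:

async (event, steps) => {
  // steps.CONSTANTS holds everything the CONSTANTS step exported via `this`
  const { TMP_DIR } = steps.CONSTANTS
  console.log(`Temp files live in ${TMP_DIR}`)
}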
steps.save_attachments_to_temp_storage
code
async (event, steps) => {
  const fs = require("fs")

  const { TMP_DIR } = steps.CONSTANTS

  for (const a of event.attachments) {
    const fileData = Buffer.from(a.content_b64, 'base64')
    // Save to disk using the same filename as the attachment
    fs.writeFileSync(`${TMP_DIR}/${a.fileName}`, fileData)
    console.log(`Writing ${a.fileName} to ${TMP_DIR}`)
  }

  const tmpFiles = fs.readdirSync(TMP_DIR)
  console.log(`Saved ${tmpFiles.length} files`)
  this.tmpFiles = tmpFiles
}
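Note that a.fileName comes straight from the inbound email, so a malicious sender could embed path separators in it. A hardening sketch (not part of the original template) that strips directory components with path.basename before writing:

const path = require("path")

// "../../etc/passwd" becomes "passwd", keeping writes inside TMP_DIR
const safeName = path.basename(a.fileName)
fs.writeFileSync(`${TMP_DIR}/${safeName}`, fileData)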
steps.save_to_s3
auth
(auths.aws)
params
Bucket: string · params.bucket
code
async (event, steps, params, auths) => {
  // NOTE: we could have just sent the base64-encoded data to S3 in this
  // example, but the workflow is meant to show how to save a file to disk
  // and send it to an external service.
  const fs = require("fs")
  const AWS = require("aws-sdk")

  const { TMP_DIR } = steps.CONSTANTS
  const { bucket } = params
  const { accessKeyId, secretAccessKey } = auths.aws

  const s3 = new AWS.S3({
    accessKeyId,
    secretAccessKey,
  })

  this.S3Responses = []
  for (const file of fs.readdirSync(TMP_DIR)) {
    const fileData = fs.readFileSync(`${TMP_DIR}/${file}`)
    // Store at a prefix named for the subject + current epoch
    const Key = `${event.subject.replace(/[/ ]/g, '_')}-${+new Date()}/${file}`
    const uploadParams = { Bucket: bucket, Key, Body: fileData }
    this.S3Responses.push(await s3.upload(uploadParams).promise())
    console.log(`Uploaded ${file} to S3!`)
  }
}
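In the v2 aws-sdk, s3.upload(...).promise() resolves with the stored object's metadata (Location, ETag, Bucket, Key), so the S3Responses export can drive later steps. A small sketch of a follow-on step that logs where each attachment landed:

async (event, steps) => {
  // Each entry is the response from s3.upload(...).promise() above
  for (const res of steps.save_to_s3.S3Responses) {
    console.log(`Stored ${res.Key} at ${res.Location}`)
  }
}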
steps.remove_files_from_temp_storage
code
async (event, steps) => {
  // The files stored on disk may persist in our VM for a few executions
  // of our workflow, so we remove the files we just processed
  // to make sure we don't process them again on the next run.
  const fs = require("fs")

  const { TMP_DIR } = steps.CONSTANTS

  for (const file of fs.readdirSync(TMP_DIR)) {
    fs.unlinkSync(`${TMP_DIR}/${file}`)
    console.log(`Deleted ${file}`)
  }
}