This topic was automatically generated from Slack. You can find the original thread here.
Walid : Hello, I hope everyone is well! I am working on a workflow that is triggered by email and takes a large attachment (following the instructions for upload.pipedream.net). I then parse the email and upload the base64-encoded output to S3.
It works on some runs and then fails with the exact same inputs (PDF file, email, etc.), giving me an Out of Memory error.
Steps dealing with the content_url: Trigger: email with a large uploaded attachment (around 5 MB)
- Capture content:
const stream = require("stream");
const { promisify } = require("util");
const fs = require("fs");
const got = require("got");
const pipeline = promisify(stream.pipeline);
// Stream the raw email from content_url straight to a file in /tmp,
// so the download itself never buffers the whole message in memory
await pipeline(
  got.stream(steps.trigger.event.mail.content_url),
  fs.createWriteStream(`/tmp/raw_email`)
);
// Now read the file and parse its contents into the `parsed` variable
// See https://nodemailer.com/extras/mailparser/ for parsing options
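As a sanity check on the download itself, logging the size of the file the pipeline wrote would show whether a truncated download is involved (a small, untested debugging sketch):
const fs = require("fs");
// Log the on-disk size so a short or failed download is visible in the logs
const { size } = fs.statSync(`/tmp/raw_email`);
console.log(`Downloaded raw email: ${size} bytes`);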
- Parse content:
const simpleParser = require("mailparser").simpleParser;
const fs = require("fs");
// readFileSync pulls the entire raw email into memory as one Buffer
const f = fs.readFileSync(`/tmp/raw_email`);
this.parsed = await simpleParser(f);
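Since readFileSync loads the whole file at once, a streaming parse might help here: the mailparser docs say simpleParser also accepts a stream as its source (untested sketch):
const simpleParser = require("mailparser").simpleParser;
const fs = require("fs");
// Hand simpleParser a read stream so the raw email is consumed
// incrementally instead of being loaded into one large Buffer first
this.parsed = await simpleParser(fs.createReadStream(`/tmp/raw_email`));
Note that the parsed attachments themselves still end up as Buffers in memory, so this only avoids the extra copy of the raw message.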
The next steps upload the parsed attachment to S3.
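One thing I am wondering about for that upload: base64-encoding the attachment first inflates it by about a third in memory, so sending the Buffer from the parsed email directly might be lighter (a minimal sketch, assuming a hypothetical bucket name, step name, and region, with AWS credentials available in the environment):
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
// Assumes AWS credentials are available in the environment and that the
// parse step above is named parse_content (hypothetical step name)
const s3 = new S3Client({ region: "us-east-1" }); // hypothetical region
for (const attachment of steps.parse_content.parsed.attachments) {
  await s3.send(new PutObjectCommand({
    Bucket: "my-bucket", // hypothetical bucket name
    Key: attachment.filename,
    Body: attachment.content, // Buffer straight from mailparser, no base64
    ContentType: attachment.contentType,
  }));
}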
Finally, I attempted to use a "remove files from storage" code step, but it does not seem to be working:
// The files stored on disk may persist in our VM for a few executions
// of our workflow. So we need to remove the files we just processed
// to make sure we don't process them again on the next run.
const fs = require("fs")
fs.unlinkSync(`/tmp/raw_email`)
console.log(`Deleted /tmp/raw_email`)
Can you please let me know what's the best way to solve the out of memory error?