I tried copying the openly available “email to s3 workflow” to upload files that I receive regularly by email.
Problem 1.
After I got it set up, customized, and tested, I tried sending emails and kept getting notifications that the send had failed. This turned out to be because the email, including attachments, was too large.
Solution 1. Looking through the documentation, I learned about the alternative email address we need to use for anything in the MB range, and found sample code for getting attachments from that email.
Problem 2. However, the new flow requires new code, because the email content now comes from a file at a URL rather than from the event object.
From the Triggers - Pipedream docs, I got a code sample that requires these imports:
const stream = require("stream");
const { promisify } = require("util");
const fs = require("fs");
const got = require("got");
const simpleParser = require("mailparser").simpleParser;
However, that sample code fails with this error:
Error: Must use import to load ES Module: /opt/ee/node_modules/got/dist/source/index.js. require() of ES modules is not supported. require() of /opt/ee/node_modules/got/dist/source/index.js from /opt/ee/c_jGfv7x3/index.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module", which defines all .js files in that package scope as ES modules. Instead rename /opt/ee/node_modules/got/dist/source/index.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from /opt/ee/node_modules/got/package.json.
I think the got library may have been upgraded to an ESM-only release since that code was posted, so it is no longer compatible with require().
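The error message itself points at one workaround: CommonJS code can still load an ESM-only package with a dynamic import(). A minimal sketch of the pattern (the got line is commented out and left as an assumption about got v12+'s default export; the runnable line uses a built-in module so the demo works even without got installed):

```javascript
(async () => {
  // For got v12+ (ESM-only), loaded from CommonJS code:
  // const { default: got } = await import("got");

  // Same pattern with a built-in ESM-loadable module, just to show it runs:
  const timers = await import("timers/promises");
  console.log(typeof timers.setTimeout); // "function"
})();
```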
So now I’m at the point where I either have to learn about the current state of the JS ecosystem or at least find a way to bypass got to get my attachments to S3.