How to Create a Local CSV File in /tmp/ Directory and Upload it to AWS S3 using NodeJS?

This topic was automatically generated from Slack. You can find the original thread here.

I would like to create a local CSV file in the /tmp/ directory and then upload it to AWS S3. I am using Node.js. I have seen an example in the Pipedream docs (Working with the filesystem in Node.js), but I am not sure how I should populate the following constants:

```javascript
const { path, cleanup } = await file();
```

To create a local CSV file in the /tmp/ directory and then upload it to AWS S3, you can follow these steps:

1. First, create the CSV file in the /tmp/ directory using the `fs` module. Here's an example of how to create a CSV file with some sample data:

```javascript
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    const csvData = "column1,column2\nvalue1,value2\nvalue3,value4";
    await fs.promises.writeFile("/tmp/myfile.csv", csvData);
  },
});
```

2. Next, use the AWS S3 - Upload File - /tmp action to upload the file to your S3 bucket. Here's how to use the action in your workflow:

- Search for the AWS app from the step menu
- Select the S3 - Upload File - /tmp action
- Connect your AWS account
- Select an AWS Region or pass a string from a previous step
- Select an S3 Bucket Name or pass a string from a previous step
- In the "File Path" field, enter `/tmp/myfile.csv` or pass a string from a previous step
- Optionally, you can specify a "Folder Path" or "S3 Filename Key" if needed

Once you've set up the action, it will upload the CSV file you created in the /tmp/ directory to your specified S3 bucket.

Regarding the `const { path, cleanup } = await file();` line you mentioned: it's from an example using the `tmp-promise` package. In this case, you don't need that package, since you're directly creating and uploading the file in the /tmp/ directory.
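One caveat when assembling CSV text by hand: values that contain commas, quotes, or newlines must be quoted, or downstream parsers will misread the file. A minimal escaping sketch (the `escapeCsv` helper and the sample rows are illustrative, not part of Pipedream or the action above):

```javascript
import fs from "fs";

// Quote a CSV field when it contains a comma, double quote, or
// newline, doubling embedded double quotes (RFC 4180 style).
function escapeCsv(value) {
  const str = String(value);
  if (/[",\n]/.test(str)) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}

// Build the file contents row by row, then write once.
const rows = [
  ["column1", "column2"],
  ["plain", 'has "quotes", and commas'],
];
const csv = rows.map((row) => row.map(escapeCsv).join(",")).join("\n");
fs.writeFileSync("/tmp/myfile.csv", csv);
```

If your data is at all irregular, a CSV library (for example `csv-stringify`) handles this escaping for you.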

Thank you! I will test with your recommendations.

You’re welcome! If you have any questions or need further assistance, feel free to ask. Good luck with your implementation!

I have the following error message:

Warning
Code was still running when the step ended. Make sure to await all Promises, or promisify callback functions. May be a false positive

Any clue about what I'm doing wrong?
Here is my code:

export default defineComponent({
  async run({ steps, $ }) {
    const csvHeader = "ID,Title,Description,Availability,Condition,Price,Link,Image_link,Brand,Google_product_category\n";
    await fs.promises.writeFile('/tmp/myfile.csv', csvHeader);

    steps.Parse_JSONL_response_and_create_object.$return_value.forEach(async function (value, i) {
      await fs.promises.appendFile('/tmp/myfile.csv', value + "\n");
    });
    return "ok";
  }
});

The issue is that the asynchronous operations inside the `forEach` loop are not awaited: `forEach` does not wait for the promises returned by its callback, so the step can end while file writes are still pending. Use a `for...of` loop instead, so that each operation is awaited before moving to the next one. Here's the updated code:

```javascript
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    const csvHeader = "ID,Title,Description,Availability,Condition,Price,Link,Image_link,Brand,Google_product_category\n";
    await fs.promises.writeFile("/tmp/myfile.csv", csvHeader);

    for (const value of steps.Parse_JSONL_response_and_create_object.$return_value) {
      await fs.promises.appendFile("/tmp/myfile.csv", value + "\n");
    }
    return "ok";
  },
});
```
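Another option, if the list is not huge, is to build the whole file in memory and write it once, so there is only a single write to await. A sketch with placeholder rows (in the real workflow the rows would come from `steps.Parse_JSONL_response_and_create_object.$return_value`):

```javascript
import fs from "fs";

const csvHeader = "ID,Title,Description\n"; // shortened header for this sketch
const rows = ["1,Widget,Blue widget", "2,Gadget,Red gadget"]; // placeholder data

// Join everything first, then write the file in one call,
// so there is no per-row append to forget to await.
const contents = csvHeader + rows.join("\n") + "\n";
fs.writeFileSync("/tmp/products.csv", contents);
```

This also reduces the number of filesystem calls from one per row to one total.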

Hi Pi! I would like to apply a tag to my file in AWS. Can I do it in step 2 (the AWS S3 - Upload File - /tmp action that uploads the file to my S3 bucket), or should I use a new step for that?

Yes, you can apply a tag to your file while using the AWS S3 - Upload File - /tmp action. To do this, you can add a "Tags" field in the action configuration. The tags should be in the format `key1=value1&key2=value2`. Here's how to add tags in the S3 - Upload File - /tmp action:

1. Search for the AWS app from the step menu
2. Select the S3 - Upload File - /tmp action
3. Connect your AWS account
4. Select an AWS Region or pass a string from a previous step
5. Select an S3 Bucket Name or pass a string from a previous step
6. In the "File Path" field, enter `/tmp/myfile.csv` or pass a string from a previous step
7. In the "Tags" field, enter your tags in the format `key1=value1&key2=value2`

Once you've set up the action, it will upload the CSV file you created in the /tmp/ directory to your specified S3 bucket and apply the specified tags.

I do not have the “Tags” field in S3 - Upload File - /tmp action

I apologize for the confusion earlier. It seems that the "Tags" field is not available in the S3 - Upload File - /tmp action. In this case, you can use the AWS SDK in a code step to upload the file and apply the tags. Here's an example using the `aws-sdk` package:

```javascript
import AWS from "aws-sdk";

export default defineComponent({
  props: {
    bucketName: {
      type: "string",
      label: "S3 Bucket Name",
    },
    tags: {
      type: "string",
      label: "Tags (format: key1=value1&key2=value2)",
    },
  },
  async run({ steps, $ }) {
    const s3 = new AWS.S3();
    const fileContent = await fs.promises.readFile("/tmp/myfile.csv");
    const params = {
      Bucket: this.bucketName,
      Key: "myfile.csv",
      Body: fileContent,
      Tagging: this.tags,
    };
    return new Promise((resolve, reject) => {
      s3.upload(params, (err, data) => {
        if (err) {
          reject(err);
        } else {
          resolve(data);
        }
      });
    });
  },
});
```

Replace the S3 - Upload File - /tmp action with this code step. Make sure to provide the `bucketName` and `tags` props when configuring the step. This code will upload the file to the specified S3 bucket and apply the tags you provide.

Thank you. I now have the following error message:
ReferenceError: fs is not defined

I apologize for the oversight. I missed adding the import statement for the `fs` module. Please add `import fs from "fs";` at the beginning of the code. Here's the updated code with the import statement:

```javascript
import fs from "fs";
import AWS from "aws-sdk";

export default defineComponent({
  props: {
    bucketName: {
      type: "string",
      label: "S3 Bucket Name",
    },
    tags: {
      type: "string",
      label: "Tags (format: key1=value1&key2=value2)",
    },
  },
  async run({ steps, $ }) {
    const s3 = new AWS.S3();
    const fileContent = await fs.promises.readFile("/tmp/myfile.csv");
    const params = {
      Bucket: this.bucketName,
      Key: "myfile.csv",
      Body: fileContent,
      Tagging: this.tags,
    };
    return new Promise((resolve, reject) => {
      s3.upload(params, (err, data) => {
        if (err) {
          reject(err);
        } else {
          resolve(data);
        }
      });
    });
  },
});
```

Thanks. I now have the following error message:
AccessDenied
Access Denied
DETAILS

    at Request.extractError (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/services/s3.js:711:35)
    at Request.callListeners (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
    at Request.emit (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
    at Request.emit (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/request.js:686:14)
    at Request.transition (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/request.js:22:10)
    at AcceptorStateMachine.runTo (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/state_machine.js:14:12)
    at null.null (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/state_machine.js:26:10)
    at Request.null (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/request.js:38:9)
    at Request.null (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/request.js:688:12)
    at Request.callListeners (/tmp/__pdg__/dist/code/8dfc83c1c59e58c55eb2a91a6d1a155a3c3f9a72529db9a1d959960437fbd56f/node_modules/.pnpm/aws-sdk@2.1454.0/node_modules/aws-sdk/lib/sequential_executor.js:116:18)

Detailed/code mode

- AWS Account: AWS - user-pipedream
- S3 Bucket Name: airnest-public-data
- Tags (format: key1=value1&key2=value2): public-data=yes

It looks like you're encountering an "Access Denied" error when trying to interact with the S3 bucket. This error is typically caused by insufficient permissions on the AWS IAM user associated with the access key and secret key you're using in your Pipedream component. To resolve this, make sure the IAM user is allowed to perform the operations your step uses on the specified bucket: here, `s3:PutObject` (and `s3:PutObjectTagging`, since you're applying tags). You can follow the AWS instructions to create an IAM user with the appropriate permissions. If you're still encountering issues after verifying the IAM user's permissions, please let me know so I can help you further.
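For reference, a minimal IAM policy sketch that would allow uploading and tagging objects (the bucket name matches the configuration shown above; adjust the resource ARN to your own bucket, and note this is an illustrative policy, not one taken from your account):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectTagging"],
      "Resource": "arn:aws:s3:::airnest-public-data/*"
    }
  ]
}
```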

I had no issue when using the previous action “AWS S3 - Upload File - /tmp action”.
Perhaps we should add the fields “AWS Account” and “AWS Region” like for the action “AWS S3 - Upload File - /tmp action” ?

It's great to hear that you had no issues with the "AWS S3 - Upload File - /tmp action". If you want to use that action with the "AWS Account" and "AWS Region" fields, follow these steps:

1. Search for the AWS app from the step menu.
2. Select the "S3 - Upload File - /tmp" action.
3. Connect your AWS account.
4. Select an AWS Region or pass a string from a previous step.
5. Select an S3 Bucket Name or pass a string from a previous step.

You can also configure optional fields like "S3 Filename Key", "File Path", and "Folder Path" if needed. Once you've set up the action, click "Test" to run the action and test the configuration. If you still encounter any issues or need further assistance, please let me know.

In “AWS S3 - Upload File - /tmp action” I cannot use any tag for my file

You're right, the "AWS S3 - Upload File - /tmp action" does not have a built-in option for adding tags to the uploaded file. However, you can achieve this by adding a custom code step after the "AWS S3 - Upload File - /tmp action" step to tag the uploaded file. First, follow the steps to use the "AWS S3 - Upload File - /tmp action" as mentioned in the documentation. Next, add a custom code step and use the AWS SDK to add tags to the uploaded file. Here's an example code snippet:

```javascript
import { S3 } from "aws-sdk";

const s3 = new S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: "your-region", // Replace with the appropriate AWS region
});

const params = {
  Bucket: "your-bucket-name", // Replace with your S3 bucket name
  Key: "your-s3-key", // Replace with the S3 key of the uploaded file
  Tagging: {
    TagSet: [
      {
        Key: "your-tag-key", // Replace with your desired tag key
        Value: "your-tag-value", // Replace with your desired tag value
      },
    ],
  },
};

await s3.putObjectTagging(params).promise();
```
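One thing to keep in mind: the SDK uses two different tag formats. `putObject` accepts a query-string `Tagging` parameter (`key1=value1&key2=value2`), while `putObjectTagging` expects a `TagSet` array of `{ Key, Value }` objects. A small conversion sketch (the helper name is mine, not part of the SDK):

```javascript
// Convert a "key1=value1&key2=value2" tag string (the format used by
// putObject's Tagging parameter) into the TagSet array that
// putObjectTagging expects.
function tagStringToTagSet(tagString) {
  return tagString.split("&").map((pair) => {
    const [key, value] = pair.split("=");
    return { Key: key, Value: value };
  });
}
```

For example, `tagStringToTagSet("public-data=yes")` yields `[{ Key: "public-data", Value: "yes" }]`.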

Thanks, I have the following error message now:
SyntaxError
Named export 'S3' not found. The requested module 'aws-sdk' is a CommonJS module, which may not support all module.exports as named exports. CommonJS modules can always be imported via the default export, for example using: import pkg from 'aws-sdk'; const { S3 } = pkg;
It is probably related to the AWS SDK installation, which is unclear to me.

I apologize for the confusion. Since `aws-sdk` is a CommonJS module, you'll need to use a different import syntax. Please update your import statement as follows:

```javascript
import AWS from "aws-sdk";

const { S3 } = AWS;
```