Why does fs.readdirSync("/tmp") sometimes fail to find files in a ClickUp task upload step after successfully downloading attachments to the /tmp folder?

This topic was automatically generated from Slack. You can find the original thread here.

Hi people! Sorry for asking about the same problem I raised a few days ago, but I couldn’t solve it. I’m trying to download attachments into the /tmp folder and, in the next step, upload them to a ClickUp task. The thing is that even when the download step succeeds, the upload step sometimes can’t find any files when doing fs.readdirSync("/tmp"). I’ve tried adding a delay with the control flow actions, but it didn’t work. Here is the code:

• download attachment step:

// decode the base64 attachment data and write it to /tmp
const fileData = Buffer.from(attachmentData.data, "base64");
fs.writeFileSync(`${TMP_DIR}/${attachment.filename}`, fileData);
console.log(fs.readdirSync(TMP_DIR));

This step always succeeds, and I can verify that the files are being saved in the /tmp folder.

• upload attachment step:

const { TMP_DIR } = steps.define_constants.$return_value;
const taskId = steps.create_task.$return_value;
const files = fs.readdirSync(TMP_DIR).filter(file => file !== "__pdg__" && file !== "__pds__");
console.log("Files in TMP_DIR:", files);

if (files.length) {
  for (const file of files) {
    const filePath = `${TMP_DIR}/${file}`;
    const fileStream = fs.createReadStream(filePath);
    const formData = new FormData();
    formData.append("attachment", fileStream);

Sometimes, the “files” variable doesn’t contain the files that were previously saved.

• remove files step:

const files = fs.readdirSync(TMP_DIR).filter(file => file !== "__pdg__" && file !== "__pds__");
for (const file of files) {
  const filePath = `${TMP_DIR}/${file}`;
  if (fs.existsSync(filePath)) {
    await fs.promises.unlink(filePath);
    console.log(`Deleted file ${file}`);
  }
}
The /tmp directory can be empty due either to the workflow going “cold” or to new workers being spun up to meet your concurrency needs.
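If it helps narrow that down, you could log a worker identifier from both the download and upload steps and compare them; if the values differ, the two steps did not share the same /tmp. A rough debugging sketch (this is just the standard Node.js code-step scaffold, nothing specific to your workflow):

import fs from "fs";
import os from "os";

export default defineComponent({
  async run({ steps, $ }) {
    // Compare this output between the download and upload steps:
    // a different hostname/pid means a different worker (and a different /tmp).
    console.log("worker:", os.hostname(), "pid:", process.pid);
    console.log("/tmp contents:", fs.readdirSync("/tmp"));
  },
});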

The File Store system helps with this by storing files in a separate, permanent store outside of the workflow environment, so they won’t be deleted the way files in /tmp can be.
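As a rough sketch of that approach (step bodies shown without the defineComponent scaffold, like your snippets; it assumes the File Store’s $.files.open(...).fromFile(...) method and a toFile() counterpart, and the upstream step name is made up):

// download step: write each attachment to /tmp, then persist it in the File Store
const attachments = steps.get_attachments.$return_value; // hypothetical upstream step
const saved = [];
for (const attachment of attachments) {
  const localPath = `${TMP_DIR}/${attachment.filename}`;
  fs.writeFileSync(localPath, Buffer.from(attachment.data, "base64"));
  await $.files.open(attachment.filename).fromFile(localPath);
  saved.push(attachment.filename);
}
return saved;

// upload step: pull each file back out of the File Store onto whatever worker
// this step runs on, instead of trusting that /tmp still has it
for (const name of steps.download_attachments.$return_value) {
  const localPath = `/tmp/${name}`;
  await $.files.open(name).toFile(localPath); // assumed counterpart to fromFile()
  const fileStream = fs.createReadStream(localPath);
  // ...append to the FormData and upload to ClickUp as before
}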

Sometimes the problem isn’t that /tmp is empty: even when readdirSync does find files in the tmp directory, I get errors creating the ReadStream:

console.log("Files in TMP_DIR:", files);
const filePath = `${TMP_DIR}/${file}`;
const fileStream = fs.createReadStream(filePath);

Also, to upload files I’ll still have to use the tmp directory, since I’m working with email attachments: first I have to download them with the Gmail API, save them to /tmp, and then push them to the File Store with this line:

const file = await $.files.open('recording.mp3').fromFile('/tmp/recording.mp3')

So, I might have the same problem as can be seen in the image

For the ReadStream issue, maybe the “remove files step” runs before the “upload attachment step” has finished, so the file is deleted before it can be read. Ideally, the “upload attachment step” should return a Promise that resolves only after the file uploads have completed.
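Something along these lines in the upload step would make sure the step’s promise only resolves once every request has finished (a sketch only: it assumes axios plus the form-data package and the ClickUp v2 task-attachment endpoint, and the token line is a placeholder for however you authenticate):

import fs from "fs";
import axios from "axios";
import FormData from "form-data";

const { TMP_DIR } = steps.define_constants.$return_value;
const taskId = steps.create_task.$return_value;
const CLICKUP_TOKEN = process.env.CLICKUP_TOKEN; // placeholder for your auth
const files = fs.readdirSync(TMP_DIR).filter(f => f !== "__pdg__" && f !== "__pds__");

for (const file of files) {
  const formData = new FormData();
  formData.append("attachment", fs.createReadStream(`${TMP_DIR}/${file}`), file);
  // awaiting each request means the step doesn't finish until the uploads do,
  // so a later "remove files" step can't delete the files too early
  await axios.post(
    `https://api.clickup.com/api/v2/task/${taskId}/attachment`,
    formData,
    { headers: { Authorization: CLICKUP_TOKEN, ...formData.getHeaders() } },
  );
}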

It’s not ideal, but to narrow down the issue, does the workflow succeed if you put all the code in the same step?
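If you do try that, a condensed version of all three phases in a single step could look roughly like this (same caveats as the sketch above; the attachment source and token are placeholders):

import fs from "fs";
import axios from "axios";
import FormData from "form-data";

export default defineComponent({
  async run({ steps, $ }) {
    const TMP_DIR = "/tmp";
    const taskId = steps.create_task.$return_value;
    const CLICKUP_TOKEN = process.env.CLICKUP_TOKEN; // placeholder for your auth
    const attachments = steps.get_attachments.$return_value; // placeholder source

    // 1) download: decode and write every attachment to /tmp
    for (const attachment of attachments) {
      fs.writeFileSync(
        `${TMP_DIR}/${attachment.filename}`,
        Buffer.from(attachment.data, "base64"),
      );
    }

    // 2) upload: stream each file to ClickUp and await the request
    const files = fs.readdirSync(TMP_DIR).filter(f => f !== "__pdg__" && f !== "__pds__");
    for (const file of files) {
      const formData = new FormData();
      formData.append("attachment", fs.createReadStream(`${TMP_DIR}/${file}`), file);
      await axios.post(
        `https://api.clickup.com/api/v2/task/${taskId}/attachment`,
        formData,
        { headers: { Authorization: CLICKUP_TOKEN, ...formData.getHeaders() } },
      );
    }

    // 3) cleanup: only runs after every upload above has resolved
    for (const file of files) {
      await fs.promises.unlink(`${TMP_DIR}/${file}`);
    }
  },
});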