Flow B (triggered multiple times by A, once for each file):
Processes the specific file and posts it to its final destination.
If I manually test Flow A, then manually select the new event in Flow B and test Flow B, everything works fine.
If, however, I manually test Flow A after deploying Flow B, it doesn’t work: Flow B is successfully triggered and can see the correct filename, but it cannot find the file.
First off, welcome to Pipedream! Happy to have you!
Per the Pipedream /tmp dir doc here, the /tmp dir is meant to be used within a single workflow; sharing it across multiple workflows is not supported. So if you’re using it across workflows, Pipedream can’t guarantee it will work as you expect. Please read the document to understand how the Pipedream /tmp dir works.
For your use case, I think you can upload the file to a cloud storage provider such as Google Drive, Dropbox, etc., then pass the download link between the workflows.
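For example, here’s a minimal sketch of that handoff as a Pipedream Node.js code step in Flow A, using S3 as the storage provider. The bucket name, object key, and file path are placeholders, and AWS credentials are assumed to be configured as environment variables:

```typescript
// Hypothetical Pipedream Node.js code step in Flow A: upload the file that
// was written to /tmp, then return a time-limited download link for Flow B.
import fs from "fs";
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

export default defineComponent({
  async run({ steps, $ }) {
    // Placeholder bucket/key/path.
    const bucket = "my-handoff-bucket";
    const key = "transfers/myfile.csv";
    const s3 = new S3Client({ region: "us-east-1" });

    // Upload the file Flow A wrote to its own /tmp.
    await s3.send(new PutObjectCommand({
      Bucket: bucket,
      Key: key,
      Body: fs.createReadStream("/tmp/myfile.csv"),
    }));

    // Presign a GET link (valid for 1 hour) that Flow B can download from.
    const downloadUrl = await getSignedUrl(
      s3,
      new GetObjectCommand({ Bucket: bucket, Key: key }),
      { expiresIn: 3600 },
    );
    return { downloadUrl }; // include this in the event you send to Flow B
  },
});
```

Flow B then fetches `downloadUrl` into its own /tmp instead of reading from Flow A’s.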
Thank you so much for your helpful reply. Prompted by your suggestion, I rewrote Flow A to pass the file contents rather than the file name to Flow B (so all the file handling is contained within a single flow), and all is now well.
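In case it helps anyone else, a rough sketch of what that looks like, assuming Flow B has an HTTP trigger (the trigger URL and file name below are placeholders):

```typescript
// Hypothetical Flow A step: POST each file's contents (base64-encoded) to
// Flow B's HTTP trigger, so Flow B never has to read Flow A's /tmp.
import fs from "fs";
import { axios } from "@pipedream/platform";

export default defineComponent({
  async run({ steps, $ }) {
    const path = "/tmp/myfile.csv"; // placeholder: file written earlier in Flow A
    const contents = fs.readFileSync(path).toString("base64");

    await axios($, {
      method: "POST",
      url: "https://xxxxxxxx.m.pipedream.net", // placeholder: Flow B's trigger URL
      data: { filename: "myfile.csv", contents },
    });
  },
});
```

Flow B then decodes `contents` from the event body and, if it needs a file on disk, writes it to its own /tmp.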
However, I’m still surprised by the problem I encountered: according to the documentation it’s certainly true that ‘EEs [execution environments] can be destroyed at any time’, but it also adds ‘for example after about 10 minutes of receiving no events’. It seems odd that the files persist for quite some time when testing manually, but seem to vanish instantly once fully deployed. If any expert out there has insight into this, it would be useful for future reference.
I think @pierce from the Pipedream team might have some insight into this. My initial thought, though, is that the /tmp dir is not designed for multi-workflow use cases, so any behavior you observed across workflows is just a side effect that might be changed or removed at any time.
This means that once your workflows are deployed, each one has its own separate /tmp directory.
Since you already have FTP as the source of truth for your files, I suggest that you make your modifications to the files, then reupload them to FTP if possible.
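A minimal sketch of that pattern using the basic-ftp package (the host, credentials, paths, and the transform itself are all placeholders):

```typescript
// A sketch of "FTP as the source of truth": download, modify, re-upload.
import fs from "fs";
import { Client } from "basic-ftp";

export default defineComponent({
  async run({ steps, $ }) {
    const client = new Client();
    try {
      await client.access({
        host: "ftp.example.com", // placeholder
        user: process.env.FTP_USER,
        password: process.env.FTP_PASSWORD,
        secure: true,
      });

      // Pull the file down into this workflow's own /tmp.
      await client.downloadTo("/tmp/input.csv", "/inbox/input.csv");

      // Make your modifications locally (placeholder transform).
      const modified = fs.readFileSync("/tmp/input.csv", "utf8").toUpperCase();
      fs.writeFileSync("/tmp/output.csv", modified);

      // Push the result back to FTP.
      await client.uploadFrom("/tmp/output.csv", "/outbox/output.csv");
    } finally {
      client.close();
    }
  },
});
```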
Alternatively, consider running a single FTP list command in Workflow A, iterating over the file paths, and triggering Workflow B once per path; Workflow B then downloads, processes, and pushes that file to the other service.
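Here’s what that fan-out could look like, again with basic-ftp and assuming Workflow B has an HTTP trigger (the trigger URL, host, and directory are placeholders):

```typescript
// A sketch of the fan-out: Workflow A lists the directory once, then
// triggers Workflow B once per file path. Workflow B downloads, processes,
// and pushes the file itself, so it only ever uses its own /tmp.
import { Client } from "basic-ftp";
import { axios } from "@pipedream/platform";

export default defineComponent({
  async run({ steps, $ }) {
    const client = new Client();
    await client.access({
      host: "ftp.example.com", // placeholder
      user: process.env.FTP_USER,
      password: process.env.FTP_PASSWORD,
    });
    const files = await client.list("/inbox"); // the single FTP list command
    client.close();

    for (const file of files) {
      await axios($, {
        method: "POST",
        url: "https://yyyyyyyy.m.pipedream.net", // placeholder: Workflow B's trigger URL
        data: { path: `/inbox/${file.name}` },
      });
    }
  },
});
```

Since each run of Workflow B fetches its file directly from FTP, it never depends on another workflow’s /tmp.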