This topic was automatically generated from Slack. You can find the original thread here.
Hey, I’ve got an extraction workflow that’s iterating over ~800 items, each of which requires two sequential calls to the vendor’s API to pull out the relevant information. I have a 100-requests-per-minute rate limit (yikes) and need to find the best way to execute 100 requests, wait a minute, then execute the next 100, and so on. I had a look at the delay function, but that only runs at the end of a workflow step, and I’ve currently got a loop working through all 800 items inside one step. If I use setTimeout, it’s going to bill my credits for the time spent waiting. I’m fast falling down a rabbit hole of odd solutions, like getting to the end of the workflow and re-calling the workflow with a page/offset value to paginate (this didn’t seem to work), or finding a way to make workflow steps jump back and repeat within the workflow, which would then let the delay function work. I’m wondering if anyone has experience implementing this in Pipedream, and what the right approach is that balances scalability and credit usage.
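For context, the pacing the question implies can be worked out with a quick back-of-envelope calculation (the numbers below are the ones stated above; nothing here is a Pipedream API):

```javascript
// Back-of-envelope pacing for the workload described above.
const ITEMS = 800;          // items to extract
const CALLS_PER_ITEM = 2;   // two vendor API calls per item
const LIMIT_PER_MIN = 100;  // vendor rate limit, requests per minute

const totalRequests = ITEMS * CALLS_PER_ITEM;                      // 1600 requests
const itemsPerMinute = Math.floor(LIMIT_PER_MIN / CALLS_PER_ITEM); // 50 items per batch
const minutesNeeded = Math.ceil(totalRequests / LIMIT_PER_MIN);    // 16 one-minute batches

console.log({ totalRequests, itemsPerMinute, minutesNeeded });
```

So the job is really 16 batches of 50 items, with roughly a minute between batches.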
this is extremely helpful. I think the fastest way through is likely using `$.flow.rerun` and pushing the state to a data store at the end of each batch.
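A minimal sketch of that approach, assuming the items and step names shown here (the batch math is a plain helper; the commented section shows roughly where Pipedream’s `$.flow.rerun` and a data store prop would plug in — `$.flow.rerun` suspends the step rather than sleeping on the clock, so the wait itself isn’t billed as compute):

```javascript
// Pure helper: given the total item count and the saved offset, compute
// the window of items to process this pass and whether another pass is
// needed. Sized so one pass stays inside the 100-requests/minute limit.
const CALLS_PER_ITEM = 2;   // two vendor API calls per item
const LIMIT_PER_MIN = 100;  // vendor rate limit

function nextBatch(total, offset) {
  const batchSize = Math.floor(LIMIT_PER_MIN / CALLS_PER_ITEM); // 50 items/pass
  const end = Math.min(offset + batchSize, total);
  return { start: offset, end, done: end >= total };
}

// Inside a Pipedream Node.js code step (with a data store prop named
// `data`), this would drive reruns roughly like:
//
//   const offset = (await this.data.get("offset")) ?? 0;
//   const { start, end, done } = nextBatch(items.length, offset);
//   for (const item of items.slice(start, end)) {
//     // ...two vendor API calls per item...
//   }
//   if (!done) {
//     await this.data.set("offset", end);
//     $.flow.rerun(60_000, null, 20); // suspend ~60s, allow up to 20 reruns
//   } else {
//     await this.data.set("offset", 0); // reset for the next trigger
//   }
```

Each resumed run picks up the cursor from the data store, so a crash or timeout mid-way only loses the current batch, not the whole 800 items.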
That said, I really want to look into the dispatcher/worker approach. It sounds like it would be more readable in the Pipedream UI for the next person who finds this.
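For reference, a hedged sketch of that dispatcher/worker split: the dispatcher chunks the items into rate-sized batches and hands each batch to a separate worker workflow. The worker URL below is a placeholder, and the pacing relies on Pipedream’s workflow-level execution controls (the worker can be throttled to roughly one run per minute in its settings) rather than any sleeping in code:

```javascript
// Dispatcher helper: split items into fixed-size batches.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// In the dispatcher's code step, each batch is POSTed to the worker
// workflow's HTTP trigger (URL is a placeholder):
//
//   const WORKER_URL = "https://xxxx.m.pipedream.net";
//   for (const batch of chunk(items, 50)) { // 50 items x 2 calls = 100 req
//     await fetch(WORKER_URL, {
//       method: "POST",
//       headers: { "Content-Type": "application/json" },
//       body: JSON.stringify({ batch }),
//     });
//   }
```

The upside is visibility: each worker run shows up as its own execution in the Pipedream UI, so a failed batch is easy to spot and replay on its own.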
Thank you! This is awesome, I really appreciate it