What is the optimal rate limiting for a workflow sending 1200 records to a processing workflow limited to 1 API request per second?

This topic was automatically generated from Slack.

I have a workflow that pulls 1200 records from Google Sheets as an array and then sends them individually to a processing workflow, which looks each one up on another API that's rate limited to 1 request per second. The first time I ran it I applied no rate limiting on the sending workflow, and the receiving workflow's server/webhook returned a 429 error ("statusText: Too Many Requests"). So I set the sending workflow to only send 25 per second in the rate limiting settings, and now it's timing out. The processing workflow is set to process only 1 per second. Is this just a balancing act that I've got wrong, or is something more fundamentally flawed with my architecture?

I think one method to solve this is to have workflow A store the data in a data store, then have workflow B read the data from the data store, process it, and clear the data store.
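
Something like this for the producer side, as a rough sketch (assuming Pipedream's Node.js data store prop; the step name `google_sheets` and the key scheme are placeholders, not from the original thread):

```javascript
// Workflow A: a Node.js code step that writes the Google Sheets rows
// into a Pipedream data store instead of POSTing them to workflow B.
export default defineComponent({
  props: {
    // Attach a data store to this step in the Pipedream UI
    data: { type: "data_store" },
  },
  async run({ steps }) {
    // Assumed: the previous step returns the array of 1200 records;
    // adjust the reference to match your actual step name.
    const rows = steps.google_sheets.$return_value;
    for (const [i, row] of rows.entries()) {
      await this.data.set(`record:${i}`, row);
    }
    return { queued: rows.length };
  },
});
```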

Basically, you'd be changing from synchronous processing (via HTTP calls) to asynchronous processing, which is more fault-tolerant.
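
On the consumer side, workflow B could then run on a schedule and drain the store at roughly one lookup per second, along these lines (the lookup URL, `record.id`, and the batch size are illustrative placeholders):

```javascript
// Workflow B: a schedule-triggered Node.js step that drains the data
// store, calling the rate-limited API at ~1 request per second.
export default defineComponent({
  props: {
    data: { type: "data_store" },
  },
  async run() {
    const keys = await this.data.keys();
    // Process a bounded batch per run so the step stays well under the
    // workflow execution timeout; 50 keys ≈ 50 seconds at 1 req/s.
    for (const key of keys.slice(0, 50)) {
      const record = await this.data.get(key);
      // Placeholder for the rate-limited lookup; replace with your API call.
      await fetch(`https://api.example.com/lookup?id=${record.id}`);
      await this.data.delete(key);
      // Wait 1 second between calls to respect the 1 req/s limit.
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }
  },
});
```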

Great, I thought it might revolve around data stores; thanks for confirming. To further my understanding: if I kept doing it the way I was, I would need a timeout value on workflow A of about 20 minutes (1200 records at 1 per second = 1200 seconds, i.e. 1200/60 minutes)? And workflow A isn't capable of dumping all 1200 entries into workflow B's queue (currently set to 5000) at once?

Yeah, if you keep doing it the way it is now, you will need a throttle mechanism on workflow A so that it doesn't hit the Pipedream webhook rate limit.
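
For example, if you do keep the HTTP fan-out, the simplest throttle is a single loop that POSTs one record per second, matching the downstream 1 req/s limit and accepting the ~20-minute run (a sketch; the webhook URL and step name are placeholders):

```javascript
// Workflow A with a simple in-loop throttle: one POST per second to
// workflow B's webhook, matching the downstream 1 req/s limit.
export default defineComponent({
  async run({ steps }) {
    const rows = steps.google_sheets.$return_value; // assumed step name
    for (const row of rows) {
      await fetch("https://eoXXXXXXXX.m.pipedream.net", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(row),
      });
      // 1200 rows × 1 s ≈ 20 minutes, so the workflow's execution
      // timeout must be raised to allow the full run.
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }
  },
});
```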