How to send API requests from a big data file?

The intended goal is to send data from a fairly large CSV file to an external API without running into rate limits or throttling. To be precise: every API POST request needs to “simulate” live traffic, as if each submission came not from the bulk file but from a live website. I’m trying to figure out how to implement this “mimic real user behavior” requirement.

can anyone please help me?

Hi @pillcollin! The good thing about this task is that the data is in CSV format, which means it can be streamed line by line. Here’s an approach I’d explore:

  • Upload the file to the workflow as an attachment.
  • Verify the rate limits per API key and per account, and account for any other automations/apps using the same API key.

Trigger: runs on a timer

  1. Read the next line from the CSV file
  2. Send a POST request to the external API
  3. Update $checkpoint with latest line number

Repeat up to a maximum number of lines per run that won’t hit rate limits or throttling. Each time the workflow runs, it should read $checkpoint to find the line number to resume from.
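The loop above can be sketched in Python. This is a minimal illustration, not platform-specific code: `process_batch`, `send`, and `CSV_DATA` are hypothetical names, and the `send` callable stands in for the real POST request; in the actual workflow, the returned value would be written back to $checkpoint.

```python
import csv
import io

def process_batch(csv_text, checkpoint, batch_size, send):
    """Send up to batch_size rows starting at `checkpoint` (a 0-based
    row index); return the new checkpoint to persist for the next run."""
    start = checkpoint
    reader = csv.DictReader(io.StringIO(csv_text))
    for i, row in enumerate(reader):
        if i < start:
            continue                    # already sent on a previous run
        if i >= start + batch_size:
            break                       # stop before hitting the rate limit
        send(row)                       # stand-in for the real API POST
        checkpoint = i + 1              # advance only after a successful send
    return checkpoint

# Demo with an in-memory CSV and a stub sender (no real HTTP calls).
CSV_DATA = "name,email\nada,a@x.io\nbob,b@x.io\ncho,c@x.io\n"
sent = []
cp = 0
cp = process_batch(CSV_DATA, cp, batch_size=2, send=sent.append)  # rows 0-1
cp = process_batch(CSV_DATA, cp, batch_size=2, send=sent.append)  # row 2
```

Advancing the checkpoint only after each successful send means a failed run simply retries the same rows next time, rather than skipping them.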

If you’re really concerned about processing too many lines, you can process them one at a time. This would make the workflow timer the singular controller of when submissions are sent to the external API.

Once all the lines in the CSV have been processed, it could be helpful to send yourself an email. I’m guessing you’ll only want to read the CSV file once; if so, include a reminder in that email to disable the workflow.
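Detecting that final run is just a comparison between the saved checkpoint and the row count. A tiny sketch, assuming the same 0-based checkpoint convention as above (`remaining_rows` and `CSV_DATA` are hypothetical names):

```python
import csv
import io

def remaining_rows(csv_text, checkpoint):
    """How many data rows are still unsent, given the saved checkpoint.
    When this hits 0, send the completion email and disable the workflow."""
    total = sum(1 for _ in csv.DictReader(io.StringIO(csv_text)))
    return max(total - checkpoint, 0)

CSV_DATA = "name,email\nada,a@x.io\nbob,b@x.io\n"
remaining_rows(CSV_DATA, 2)  # 0 means every row has been submitted
```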