user-1 (User 1) | February 11, 2025, 11:27pm | #1
This topic was automatically generated from Slack. You can find the original thread here.
Hi
I am working on a .csv file (12 MB) with 120,000 lines of transactions.
I need to loop through all of them and send them as bulk HTTP requests with 20 records in each.
What is the best way to handle this?
My idea is to parse the CSV and store it in /tmp, but I am afraid of running into size limits…
Please advise
user-1 (User 1) | February 11, 2025, 11:27pm | #2
There is plenty of room in the /tmp folder!
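For the parsing side, here is a minimal sketch of a Node.js code step, assuming the CSV has already been downloaded to /tmp. The file name is a placeholder, and the naive newline splitting stands in for a real CSV parser such as csv-parse:

```javascript
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    // Read the whole 12 MB file from /tmp; that fits comfortably in memory.
    const raw = fs.readFileSync("/tmp/transactions.csv", "utf8");

    // Drop the header row and empty lines; keep one string per transaction.
    const [header, ...rows] = raw.split("\n").filter((line) => line.trim() !== "");

    // Return the rows so later steps can batch them 20 at a time.
    return rows;
  },
});
```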
user-1 (User 1) | February 11, 2025, 11:27pm | #4
You could also use the file store for persistent storage in between executions.
user-1 (User 1) | February 11, 2025, 11:27pm | #5
So you could have one single workflow loop over all of your records for ~10 minutes, save whatever is left to process in the file store, and then call $.flow.rerun() to reset its timeout timer and continue processing.
This way, that one workflow could continue processing everything for hours (if it takes that long) without timing out.
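Roughly, that pattern could look like the sketch below in a single Node.js code step. It is written under assumptions: the upstream step name (parse_csv), the endpoint URL, the 10-minute timeout budget, and the /tmp-based persistence (which the file store would replace) are all placeholders, and the exact $.flow.rerun() arguments are worth checking against the Pipedream docs.

```javascript
import fs from "fs";

// Placeholder persistence in /tmp; for state that must reliably survive
// between executions, swap this helper for the Pipedream File Store.
const STATE = "/tmp/remaining.json";
const save = (rows) => fs.writeFileSync(STATE, JSON.stringify(rows));

export default defineComponent({
  async run({ steps, $ }) {
    // First run: take the full list from an earlier step (step name is hypothetical).
    // Later runs: pick up whatever was left unsent last time.
    const remaining = fs.existsSync(STATE)
      ? JSON.parse(fs.readFileSync(STATE, "utf8"))
      : steps.parse_csv.$return_value;

    // Stop roughly a minute before a ~10 minute workflow timeout.
    const deadline = Date.now() + 9 * 60 * 1000;

    while (remaining.length > 0 && Date.now() < deadline) {
      // Send the next batch of 20 rows (the endpoint URL is a placeholder).
      await fetch("https://1234567890.m.pipedream.net", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(remaining.splice(0, 20)),
      });
    }

    save(remaining);

    if (remaining.length > 0) {
      // Anything left? Rerun this workflow so its timeout timer starts over.
      $.flow.rerun(1000, null);
    }
    return { remaining: remaining.length };
  },
});
```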
user-1 (User 1) | February 11, 2025, 11:27pm | #6
Otherwise, you could also trigger another workflow for each batch of 20 (but that will probably use more credits in total).
That said, we have a use case where we do just that, and the code is pretty simple:
```javascript
// Send the list in batches of 20; splice() removes each batch from the front.
while (list.length > 0) {
  await fetch("https://1234567890.m.pipedream.net", {
    method: "POST",
    headers: { "Content-Type": "application/json" }, // tell the receiver the body is JSON
    body: JSON.stringify(list.splice(0, 20)),
  });
}
```
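On the receiving side, the workflow behind that HTTP endpoint would get each batch of 20 in its trigger event. A minimal sketch of its code step, where the per-row processing is a placeholder:

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    // With a JSON Content-Type, the HTTP trigger exposes the parsed batch here.
    const batch = steps.trigger.event.body;

    for (const row of batch) {
      // Placeholder: process one transaction row.
      console.log(row);
    }

    return { processed: batch.length };
  },
});
```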