I have a workflow that processes a relatively large amount of data. I query a Notion database and expect between 400 and 500 results. I then use those results to make two HTTP requests to our reporting system to get each query result's 1) spend from yesterday and 2) lifetime spend, which generates another 400-500 results for each of the two requests. I then take the results from those two HTTP requests and PATCH the values back into Notion for each of the 400-500 results from the initial Notion query step. Essentially, I want to automate updating our Notion system with the spend for each result in the query.
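Roughly, the loop I'm describing looks like this. This is just a sketch to show the shape of it, not the actual workflow code; the helper names (`get_spend`, `patch_page`) stand in for the real Notion and reporting-system API calls, and the batch size is arbitrary:

```python
def chunked(items, size):
    """Yield successive slices so each batch stays small in memory."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def sync_spend(pages, get_spend, patch_page, batch_size=50):
    """For each Notion page, look up yesterday's and lifetime spend,
    then PATCH both values back. `get_spend` and `patch_page` are
    placeholders for the real HTTP calls. Processing in batches keeps
    only one slice of results in memory at a time."""
    updated = 0
    for batch in chunked(pages, batch_size):
        for page in batch:
            daily = get_spend(page["id"], period="yesterday")
            lifetime = get_spend(page["id"], period="lifetime")
            patch_page(page["id"], {"daily_spend": daily,
                                    "lifetime_spend": lifetime})
            updated += 1
    return updated
```

With 400-500 pages that's roughly 900-1,000 outbound HTTP calls per run, which is presumably where most of the runtime goes.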
When testing the workflow, it takes quite a while to complete. I'm wondering if there is somewhere in the system where I can see how much memory a workflow actually uses. Or do I just have to keep increasing the workflow's memory limit until it no longer runs out of memory?
Also, I’ve seen the documentation about streaming to the temp folder, but is that something I can only do when downloading a file, or would I be able to stream the results of my Notion query and reporting system HTTP requests to a temp file?
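What I had in mind was something like spilling each result to a temp file as it arrives (one JSON object per line) and then reading them back one at a time, so the full 400-500-item result set never sits in memory at once. A minimal sketch of that idea, assuming the platform's temp directory is writable:

```python
import json

def write_results_jsonl(results, path):
    """Append each result to the file as one JSON line as it arrives,
    instead of accumulating the whole result set in memory."""
    with open(path, "w") as f:
        for item in results:
            f.write(json.dumps(item) + "\n")

def read_results_jsonl(path):
    """Stream results back one at a time from the temp file."""
    with open(path) as f:
        for line in f:
            yield json.loads(line)
```

I don't know whether the platform's streaming support is limited to file downloads, though, which is really the core of my question.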
Makes total sense. I was trying to see how much I could accomplish within a single workflow, but given the size of the data and the different APIs being called, I'll just split it out like I have done in other situations. Thanks for the feedback and all the help.