This topic was automatically generated from Slack. You can find the original thread here.
So for an average workflow - 10 steps, a few Node.js steps that format an array of arrays of approximately 600 objects and send it to Google Sheets:
How much memory should I allocate to the workflow? I can't see in the event history how much memory the workflow consumes, and I need it to run reliably. I have had one run fail on memory in about 20 runs at the base 256 MB.
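For context, here is a minimal sketch of the kind of Node.js step described, written in Pipedream's code-step style. The upstream step name (`fetch_records`) and the field names are assumptions, not taken from the original workflow:

```javascript
// Pipedream-style Node.js code step (sketch; step and field names are hypothetical)
export default defineComponent({
  async run({ steps, $ }) {
    // ~600 records from a hypothetical upstream step
    const records = steps.fetch_records.$return_value;

    // Format into an array of arrays, one inner array per spreadsheet row
    const rows = records.map((r) => [r.id, r.name, r.amount, r.created_at]);

    // Expose the rows for a downstream Google Sheets "Add Multiple Rows" action
    $.export("rows", rows);
  },
});
```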
It depends on how deterministic your workflow’s memory usage is. We’ve noticed that AI-based workflows’ memory usage fluctuates a lot, for example. Is your workflow deterministic in this sense, by any chance?
We run the workflows in isolated Lambda function containers, so a small amount of the allocated memory is overhead for constructing the container, and the rest is available at runtime.
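Since the event history doesn't surface per-run memory consumption, one rough way to gauge it yourself is to log `process.memoryUsage()` from inside a Node.js step near the end of the workflow. This is just a sketch for getting a ballpark figure, not an official metric:

```javascript
// Log approximate memory usage from inside a Pipedream Node.js step (sketch)
export default defineComponent({
  async run({ steps, $ }) {
    const { rss, heapUsed, heapTotal } = process.memoryUsage();
    const toMB = (bytes) => Math.round(bytes / 1024 / 1024);

    // rss is the resident set size of the process; heap numbers cover V8's heap
    console.log(`rss: ${toMB(rss)} MB, heap: ${toMB(heapUsed)}/${toMB(heapTotal)} MB`);
  },
});
```

Comparing that reading against the configured limit (e.g. the base 256 MB) gives a sense of how much headroom the runs that succeed actually have.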