How Much Memory Should I Allocate for a 10-Step Workflow with NodeJS Steps and Google Sheets Integration?

This topic was automatically generated from Slack. You can find the original thread here.

So, for an average workflow: 10 steps, with a few Node.js steps that format an array of arrays of approximately 600 objects and send it to Google Sheets.

How much memory should I allocate to the workflow? I can’t see in the event history how much memory the workflow consumes, and I need it to run reliably. I have had one run out of about 20 fail on memory at the base 256MB.

It’s more of an art than a science. I’d say try 512MB and keep an eye out for errors.

I’m not too fussed about spending the credits, but I would think it IS a science. How is the memory allocated? Can we see what the memory usage is?

It depends on how deterministic your workflow’s memory usage is. We’ve noticed that AI-based workflows’ memory usage fluctuates a lot, for example. Is your workflow deterministic in this sense, by any chance?

We run the workflows in isolated Lambda function containers, so there is a little memory overhead allocated for container construction, and the rest is available at runtime.
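
There’s no per-execution memory metric in the UI yet (see the feature request linked at the end of this thread), but as a rough workaround you can log Node’s own view of the process from inside a code step. Here’s a minimal sketch using the standard `process.memoryUsage()` API; RSS is the closest proxy for what counts against the container’s limit, and it will include the runtime overhead mentioned above:

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    // Snapshot the Node.js process's memory usage (all values in bytes)
    const { rss, heapUsed, heapTotal } = process.memoryUsage();
    const toMB = (bytes) => (bytes / 1024 / 1024).toFixed(1);

    // RSS (resident set size) is the closest proxy for what counts
    // against the workflow's configured memory limit
    console.log(
      `rss=${toMB(rss)}MB heapUsed=${toMB(heapUsed)}MB heapTotal=${toMB(heapTotal)}MB`
    );
  },
});
```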

No, it’s fairly simple:

  1. Zoom event trigger
  2. Node.js step to check a value and decide whether to continue the workflow
  3. Get webinar participant data
  4. Node.js step to convert the data to an array of arrays (see the sketch after this list)
  5. Google Sheets population in various formats
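
For step 4, here’s a sketch of what that conversion might look like. The upstream step name (`steps.get_participants`) and the participant field names (`name`, `email`, `join_time`, `duration`) are hypothetical, since the actual Zoom payload wasn’t shared:

```javascript
export default defineComponent({
  async run({ steps, $ }) {
    // Hypothetical upstream step and field names -- adjust to the real Zoom payload
    const participants =
      steps.get_participants.$return_value.participants ?? [];

    // Header row first, then one row per participant: Google Sheets
    // actions expect an array of arrays (rows of cells)
    return [
      ["Name", "Email", "Join Time", "Duration (min)"],
      ...participants.map((p) => [
        p.name,
        p.email,
        p.join_time,
        Math.round((p.duration ?? 0) / 60),
      ]),
    ];
  },
});
```

At ~600 rows of short strings, the data itself is tiny (well under a megabyte), so at this size most of the memory typically goes to the runtime and any SDKs loaded, which is why the base 256MB can sit close to the line.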

Also, if you increase the memory, there will be a proportional increase in CPU, as mentioned in this guide.

That is quite helpful, thanks. I’ll bump the critical workflows to 700+MB, as 3 credits are fine for peace of mind.
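
(For the arithmetic, assuming the usual rule that credit usage scales with allocated memory as ceil(memory / 256MB): anything in the 700MB range bills ceil(700 / 256) = 3 credits per execution interval, the same as a full 768MB.)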

FYI this is in our backlog: [FEATURE] I'd like to see the memory a given execution consumes · Issue #5329 · PipedreamHQ/pipedream · GitHub