How to Debug Out of Memory Errors in a Workflow Where Files are Moved via /tmp?

This topic was automatically generated from Slack. You can find the original thread here.

How can I debug Out of Memory errors?

I am constantly hitting that error on a workflow. All files are being moved via /tmp, and the documentation does not give much help beyond that. Note that the error is raised before even running the trigger, which is even more baffling.

Are you using built-in actions? Or custom code?

Are you able to check if you are streaming the data instead of loading it all into memory at once?

I have a couple of built-in actions and three code blocks.
The most memory-intensive thing going on is PyPDF extracting PDF text, but the whole PDF is 300 KB, so it can't possibly be taking up 256 MB!
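For reference, a typical extraction step of this kind looks roughly like the sketch below. This is not the poster's actual code; it assumes the newer pypdf package (PyPDF2 has an equivalent API) and a hypothetical path /tmp/attachment.pdf. Extracting page by page keeps only one page's text in memory at a time, though a 300 KB PDF should be far below 256 MB either way.

```python
# Minimal sketch: extract text from a PDF in /tmp page by page.
# Assumes the "pypdf" package; /tmp/attachment.pdf is a placeholder path.
from pypdf import PdfReader

def extract_text(path="/tmp/attachment.pdf"):
    reader = PdfReader(path)              # reads from the file handle lazily
    parts = []
    for page in reader.pages:
        parts.append(page.extract_text() or "")
    return "\n".join(parts)

if __name__ == "__main__":
    print(len(extract_text()))
```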

Got it. I know it doesn't look like that could be the issue, but there could be some memory leakage there.

See this similar issue: [BUG] Microsoft OneDrive Upload File action not streaming · Issue #15845 · PipedreamHQ/pipedream · GitHub

In our testing, a 20 MB attachment still errored in a 2 GB-memory workflow with the OneDrive action, while the same file in Google Drive (which streams the upload) worked fine in a 256 MB-memory workflow.
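The difference that comparison is pointing at is roughly the following. This is a minimal Python sketch using the requests library; the URL and file path are placeholders, not the actual OneDrive or Google Drive action code.

```python
# Illustration of buffered vs. streamed uploads (placeholder URL and path).
import requests

FILE_PATH = "/tmp/attachment.pdf"            # hypothetical file in /tmp
UPLOAD_URL = "https://example.com/upload"    # placeholder endpoint

# Buffered: the whole file is read into memory before the request is sent,
# so peak memory grows with file size (plus any intermediate copies).
def upload_buffered():
    with open(FILE_PATH, "rb") as f:
        data = f.read()
    return requests.put(UPLOAD_URL, data=data)

# Streamed: passing the open file object lets requests read and send it in
# chunks, so memory stays roughly constant regardless of file size.
def upload_streamed():
    with open(FILE_PATH, "rb") as f:
        return requests.put(UPLOAD_URL, data=f)
```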

We gotta add this to our docs

How can I dig deeper, or is there any possible workaround? I currently have zero clues about where to patch things, as the OOM error is raised right at the start.

Still, even with a memory leak: this is an incoming email with a 200–300 KB attachment that is written to /tmp, read from Python, and then read again from JavaScript to submit via an API call.

I would think I'd need a few hundred runs to exhaust 256 MB!
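One way to narrow down which step is responsible is to have each code step log its own peak memory usage. This is not a Pipedream feature, just a sketch using Python's standard resource module; ru_maxrss is reported in kilobytes on Linux (bytes on macOS). If the steps share the same execution process, the difference between readings points at the step where memory grows.

```python
# Sketch: print the process's peak resident memory from inside a code step.
import resource

def log_peak_memory(label=""):
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"[{label}] peak RSS: {peak_kb / 1024:.1f} MB")

log_peak_memory("before PDF extraction")
# ... do the memory-intensive work here ...
log_peak_memory("after PDF extraction")
```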

Yeah, it's currently hard to debug, but FYI you won't be charged credits when testing in the builder.

But feel free to submit a support ticket if you need further help

Is it possible to see how much memory each step is using while in the builder?

There isn't a view for that, but we noticed this bug happening with the OneDrive Upload File action even with a file of a few KB, due to a memory leak. After changing the action to stream the file instead of loading it all into memory, the issue was fixed. See this commit: