This topic was automatically generated from Slack. You can find the original thread here.
When I’m testing a workflow, I have a step that runs and then stops, but doesn’t return a value or an error. It just stops. My hunch is that it can’t handle the amount of data being returned, because if I limit the size of the query I’m making from Monday, it responds okay. It’s about 4,000 rows, each with around 15 column values. Is there an upper limit to how much data a single step can handle?
It sounds like you might be encountering the Function Payload Limit Exceeded issue in Pipedream. The total size of `console.log()` statements, step exports, and the original event data sent to workflows cannot exceed a combined limit, which is easy to hit when handling large amounts of data, such as your 4,000 rows with 15 column values each. To address this, you can try the following:

1. **Write data to the `/tmp` directory**: instead of passing large data between steps using step exports, write the data to the `/tmp` directory in one step and read it back in another. This approach avoids exceeding the payload limit.
2. **Optimize your query**: if possible, limit the amount of data retrieved by filtering or paginating the results.

For more detailed guidance on handling large data in Pipedream, you can refer to the Pipedream documentation. If you continue to experience issues, please reach out to Pipedream Support for further assistance.
You can find documentation on the use of the `/tmp` directory in Pipedream workflows at the following links:

- Pipedream Documentation
- Pipedream Code Documentation
- Node.js Code Documentation

The `/tmp` directory is available for storing files during the execution of a workflow. Files stored in `/tmp` are only guaranteed to be available for the duration of that execution and are not shared across workers. You have full read-write access to this directory within a workflow, subject to a storage size limit. If you have any specific questions about using the `/tmp` directory, feel free to ask!
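The write-in-one-step, read-in-another pattern can be sketched as a plain Node.js script. This is not Pipedream's step API, just the same idea with ordinary functions: `producerStep` and `consumerStep` are hypothetical names, `os.tmpdir()` stands in for the workflow's `/tmp`, and the generated rows stand in for your Monday query result. The key point is that the step export is only a small pointer (`{ file, count }`), never the 4,000-row payload itself.

```javascript
import fs from "fs/promises";
import os from "os";
import path from "path";

// "Step 1": write the large payload to a file instead of exporting it.
async function producerStep(rows, dir) {
  const file = path.join(dir, "rows.json");
  await fs.writeFile(file, JSON.stringify(rows));
  // Export only a small pointer, not the data itself.
  return { file, count: rows.length };
}

// "Step 2": read the payload back using the pointer from step 1.
async function consumerStep({ file }) {
  return JSON.parse(await fs.readFile(file, "utf8"));
}

// Stand-in for ~4,000 rows returned by a Monday.com query.
const rows = Array.from({ length: 4000 }, (_, i) => ({ id: i, name: `row ${i}` }));

const pointer = await producerStep(rows, os.tmpdir());
const restored = await consumerStep(pointer);
console.log(pointer.count, restored.length); // 4000 4000
```

In a real workflow you would hard-code `/tmp` as the directory and return the pointer object from the first step's `run` method, then read `steps.<step_name>.$return_value.file` in the next step.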
That’s correct: if a step export is too large, it can be elided, since rendering it would crash the web page. Is there a way you can return a limited number of rows and then paginate when needed?
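A generic cursor-style pagination loop for that suggestion might look like the sketch below. This is not Monday.com's actual API: `fetchPage` is a hypothetical function, here backed by an in-memory array so the loop is runnable end to end.

```javascript
// Fetch all items by requesting one bounded page at a time.
// fetchPage is any async function taking { limit, cursor } and
// returning { items, nextCursor } (nextCursor = null when done).
async function fetchAll(fetchPage, pageSize = 500) {
  const all = [];
  let cursor = null;
  do {
    const { items, nextCursor } = await fetchPage({ limit: pageSize, cursor });
    all.push(...items);
    cursor = nextCursor;
  } while (cursor !== null);
  return all;
}

// Fake data source standing in for the remote API: 4,000 rows served in pages.
const rows = Array.from({ length: 4000 }, (_, i) => ({ id: i }));
const fetchPage = async ({ limit, cursor }) => {
  const start = cursor ?? 0;
  const items = rows.slice(start, start + limit);
  const next = start + limit;
  return { items, nextCursor: next < rows.length ? next : null };
};

const all = await fetchAll(fetchPage);
console.log(all.length); // 4000
```

Each individual page stays small enough to inspect in the UI, while the combined result can still be written to `/tmp` if it is too large to export.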
I tell you what, I ended up following the guides to use a /tmp file, with tmp-promise for cleanup, and it resolved my issue. Not to mention the performance was notably improved when testing: response speed was excellent when using a tmp file to pass large payloads from step to step.