How to Clear Logs for Workflow Once Limit is Exceeded on Pipedream?

This topic was automatically generated from Slack. You can find the original thread here.

I am unclear as to how to clear the logs for my workflow based on this notification:

You’ve exceeded the limit on the logs and data stored for this execution of your workflow. See Troubleshooting Common Issues - Pipedream

Please help. Thank you

Hi, it looks like your workflow is using steps that return a large amount of data. Would you mind sharing which steps you're using?

Without knowing your flow, one suggestion is to write the data to /tmp and then, in later steps, read back only the data you need from /tmp instead of returning it from the step.

What I believe happened: I ran 3 DALL-E image generation steps that returned the images as Base64 JSON. That's what was executing when the warning popped up. I have since switched back to a URL and turned off data retention, which isn't ideal.
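For context, a custom Node.js code step could call the image API itself, decode the Base64 payload, and write it straight to /tmp, so the large blob never lands in the step's return value (which is what counts against the log/data limit). A rough sketch, assuming the standard OpenAI images endpoint and an API key stored in an environment variable named `OPENAI_API_KEY` (adjust to however you store credentials):

```javascript
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    // Call the OpenAI image generation endpoint, asking for Base64 output
    const res = await fetch("https://api.openai.com/v1/images/generations", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: "dall-e-3",
        prompt: "a watercolor fox", // hypothetical prompt
        response_format: "b64_json",
      }),
    });
    const { data } = await res.json();

    // Decode the Base64 image and write it to /tmp instead of returning it
    const filePath = "/tmp/image-1.png";
    fs.writeFileSync(filePath, Buffer.from(data[0].b64_json, "base64"));

    // Return only the small path string, not the image itself,
    // so the data stored for this step stays tiny
    return filePath;
  },
});
```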

@U06E5V9KHQU I am unfamiliar with using /tmp. I am running 26 steps in the workflow, of which 13 are OpenAI calls.

I imagine I can only save to /tmp in custom steps I write, as opposed to the pre-defined steps?

You can save to /tmp in custom Node.js steps.
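Since /tmp is shared across steps within a single execution, a later step can read the file back. A minimal sketch, assuming an earlier step (here given the hypothetical name `write_image`) returned the /tmp path:

```javascript
import fs from "fs";

export default defineComponent({
  async run({ steps, $ }) {
    // Read the file a previous step wrote to /tmp.
    // "write_image" is a hypothetical step name; use your own.
    const filePath = steps.write_image.$return_value;
    const image = fs.readFileSync(filePath);

    // Work with the buffer here (upload it, transform it, etc.)
    // and return only small metadata, not the payload itself.
    return { path: filePath, bytes: image.length };
  },
});
```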

There are also pre-built components, like the Google Drive ones, where you can specify /tmp as the path, and then use another Google Drive component to download those files from /tmp.

You will need to make sure to clear /tmp eventually, though, because it can run out of room.
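One way to do that is a final cleanup step that deletes everything it finds in /tmp. A sketch, assuming nothing else in the workflow still needs those files:

```javascript
import fs from "fs";
import path from "path";

export default defineComponent({
  async run({ steps, $ }) {
    // Delete every file and directory under /tmp so the workflow's
    // limited scratch space doesn't fill up across invocations
    const entries = fs.readdirSync("/tmp");
    for (const entry of entries) {
      fs.rmSync(path.join("/tmp", entry), { recursive: true, force: true });
    }
    return `Removed ${entries.length} entries from /tmp`;
  },
});
```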

I need a better solution long term. I never realized this was a limitation. I have 7 fairly robust workflows in this folder, and now I am concerned that this is a logging issue.

Any thoughts on this thread for a long-term solution? Also, is there a way for me to clear the logs in this workflow?

Hi, I will create a ticket to improve the existing DALL-E action. It'll be fixed soon.

Thank you!