Why Does My Workflow Time Out in 30 Seconds Despite Adjusting the Timeout to 10 Minutes?

This topic was automatically generated from Slack. You can find the original thread here.

Anyone run into this issue before? I open my workflow, go to Settings, and adjust the timeout to 10 minutes… but then when I run my workflow it times out and says "… input is too large to process in 30 seconds"

The error might display in your workflow inspector, but it could actually be occurring within the Source that powers your workflow's trigger.

Sources are separate serverless functions that poll APIs or receive webhooks, and they have a hard cap of 30 seconds per execution.
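One generic way to stay under a hard execution cap like this (not Pipedream-specific code — `fetchPage` below is a hypothetical stand-in for a real paginated API call) is to track a deadline and stop paginating before the cap is hit, resuming from a saved cursor on the next scheduled run. A minimal sketch:

```javascript
// Sketch: keep a polling function's work under a 30-second hard cap by
// checking a deadline before each page fetch. fetchPage() is a placeholder
// for a real paginated API request.
const HARD_CAP_MS = 30_000;
const SAFETY_MARGIN_MS = 5_000; // leave time to emit events and exit cleanly

async function fetchPage(cursor) {
  // Placeholder: returns one page of items and the next cursor (or null).
  return { items: [cursor], nextCursor: cursor < 2 ? cursor + 1 : null };
}

async function pollWithinBudget() {
  const deadline = Date.now() + HARD_CAP_MS - SAFETY_MARGIN_MS;
  const collected = [];
  let cursor = 0;
  while (cursor !== null && Date.now() < deadline) {
    const { items, nextCursor } = await fetchPage(cursor);
    collected.push(...items);
    cursor = nextCursor; // persist this cursor to resume on the next run
  }
  return collected;
}
```

The key design choice is the safety margin: stopping a few seconds early lets the function finish emitting whatever it already collected instead of being killed mid-write.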

You can check your source's logs under the Sources tab for more details on the error.

There's also a dedicated support request category for integrations when opening a support ticket.

Thank you… do you know how to clear the cache? I have 20 PDFs in Google Drive. I've updated the PDFs — they have the same links as before, just an updated version. Since I used "cache:" previously, it's pulling version 1, not the updated version. Do you know how to clear the cache so it'll pull the new ones? I cannot simply remove "cache", because then it times out.
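A generic workaround when a cache is keyed on the URL and the file keeps the same link is a cache-busting query parameter, so the cached entry no longer matches. This is not a Pipedream or Google Drive API — `addCacheBuster` is a hypothetical helper sketched here to illustrate the idea:

```javascript
// Sketch: append a version query parameter to a URL so any cache keyed
// on the exact URL string treats the updated file as a new resource.
function addCacheBuster(url, version = Date.now()) {
  const u = new URL(url);
  u.searchParams.set("v", String(version));
  return u.toString();
}

// Example: addCacheBuster("https://example.com/file.pdf", 2)
// yields "https://example.com/file.pdf?v=2"
```

Whether this helps depends on whether the caching layer actually keys on the full URL, including the query string.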

You can try recreating your trigger to see if the pulled version is updated.

It did not work. I even created an entirely new connection to the same MySQL DB, then recreated the trigger, and it still used the old cached file. Any other ideas?

Could you please file a support ticket at https://pipedream.com/support so we can take a look at your workflow? Thanks