This topic was automatically generated from Slack. You can find the original thread here.
I keep getting a timeout error:

    TIMEOUT: TIMEOUT
        at Timeout._onTimeout (file:///var/task/lambda_handler.mjs:919:23)
        at listOnTimeout (node:internal/timers:569:17)
        at process.processTimers (node:internal/timers:512:7)
How do I prevent this from happening and increase the timeout limit? I am uploading large files to Google Drive.
I’m sorry to hear you’re experiencing timeout issues. Workflows have a default execution limit, which defines how long a single execution can run before it is timed out. To address this, you can follow these steps:

1. Figure out why your code is running longer than expected. Timeouts are specific to your workflow and are typically caused by processing large amounts of data or waiting on a slow third-party API response.
2. If it’s expected that your code takes a long time to run, you can increase the execution limit in your workflow’s settings. Please note that increasing the execution limit may consume more credits, so check your daily credit limit to avoid any quota exceeded errors.
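Since the stack trace points to Node.js, one complement to raising the workflow limit is to put a shorter timeout on the slow call itself, so a hung upload fails fast with a clear error instead of silently consuming the whole execution window. Below is a minimal sketch of that idea; `withTimeout` is an illustrative helper name, not part of any library, and the Drive call in the usage comment is an assumption about your code:

```javascript
// Wrap any promise so it rejects after `ms` milliseconds instead of
// hanging until the workflow's execution limit is reached.
// Assumes Node.js; `withTimeout` is an illustrative name.
function withTimeout(promise, ms, label = "operation") {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms}ms`)),
      ms,
    );
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Hypothetical usage inside a workflow step, e.g. a Drive upload:
// const file = await withTimeout(drive.files.create(params), 20000, "Drive upload");
```

This way a stalled upload surfaces as `"Drive upload timed out after 20000ms"`, which is much easier to diagnose than the generic `TIMEOUT: TIMEOUT` from the runtime.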
Hi, I’m having the same issue intermittently. However, I’m only fetching a small amount of data or writing a few cells’ worth of data to Google Drive. Also, my workflow times out at ~17,000 ms, which is less than the minimum workflow timeout of 30 seconds. Can anyone help, please?
    TIMEOUT: TIMEOUT
        at Timeout._onTimeout (file:///var/task/lambda_handler.mjs:919:23)
        at listOnTimeout (node:internal/timers:569:17)
        at process.processTimers (node:internal/timers:512:7)