What could be causing timeouts on an endpoint with frequent posts?

This topic was automatically generated from Slack. You can find the original thread here.

Hi, we keep getting timeouts on HTTP triggers. There's a POST to this endpoint every 2-3 minutes or so.

Here are the details:

Hi, that error means that the workflow execution itself is timing out. It looks like you might have the max timeout set to 60 seconds in your workflow's settings. Can you try increasing that timeout and replaying the event to see if that solves the issue?

Hi, the error is not constant. We do have the timeout at 60 seconds, and if I hit Replay, it works fine. While inspecting the execution, I see that no step executed. It looks like the trigger component itself hangs and crashes after the max execution time. The problem started yesterday around 11 AM EST.

Unfortunately, there's a known issue with the details of those timeouts: we don't show the partial step execution, but the timeout is likely happening within a specific step.

Would you mind visiting your workflow's settings and enabling the option to share the workflow with Pipedream support? I can take a look at the workflow to see if I can help investigate further.

Ah, in that case I might know the problem. The last step is the likely cause: it's a call to a Google API that is noticeably slow.

I enabled the option. Here’s a link to one of the timeouts:

Yes, that's exactly what I would investigate. You can try to optimize that request, retry long-running requests, or increase the timeout to give yourself more time to handle the execution.
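
As a minimal sketch of the first two options (assuming a Python code step and a plain HTTPS request; the URL, parameters, and the 15-second budget here are illustrative, not taken from the actual workflow), you could bound the Google API call with a client-side timeout and a couple of retries, so one slow request fails fast instead of consuming the whole 60-second execution:

```python
import requests
from requests.adapters import HTTPAdapter, Retry

# Retry transient failures a couple of times with a short backoff,
# rather than letting a single slow or failed call eat the execution.
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=Retry(
    total=2,                                      # up to 2 retries
    backoff_factor=1,                             # wait 1s, then 2s
    status_forcelist=[429, 500, 502, 503, 504],   # retry on these statuses
)))

def call_google_api(url, params):
    # timeout=15 makes the request fail after ~15s instead of hanging
    # until the workflow's 60-second limit kills the whole execution.
    resp = session.get(url, params=params, timeout=15)
    resp.raise_for_status()
    return resp.json()
```

Failing fast like this also surfaces the problem as a step error with logs, rather than a whole-workflow timeout with no step details.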

Yup, I'll look into it. If you're able to confirm which step timed out, that would be great; if not, I'll assume it's the one I suspect and implement a fix.

I can't see the failed step in these timeout cases (it's a gap in the basic logging we implement for timeouts), but I let the team know about this issue again. I looked at some successful executions and saw the last step take more than 25 seconds, so it does look like that's the biggest culprit.
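
Until that logging gap is closed, one workaround (just a sketch; `timed` is a hypothetical helper, not a Pipedream API, and this assumes step logs are captured up to the point of the timeout) is to log each suspect step's start and duration yourself. A step that logs "started" but never "finished" is the one that hung:

```python
import time

class timed:
    """Context manager that logs when a block starts and how long it takes."""
    def __init__(self, label):
        self.label = label
    def __enter__(self):
        self.start = time.monotonic()
        print(f"{self.label}: started")
        return self
    def __exit__(self, *exc):
        # On a successful run this shows the step's duration; on a timed-out
        # run, the missing "finished" line points at the hanging step.
        print(f"{self.label}: finished in {time.monotonic() - self.start:.1f}s")

# Usage around the suspected slow call:
# with timed("google_api_call"):
#     result = call_google_api(url, params)
```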

Perfect, thank you :slightly_smiling_face: