This topic was automatically generated from Slack. You can find the original thread here.
Hey y’all!
Is there a way to increase the timeout limit beyond the 750s?
I have a workflow (my first one!) that translates HTML files. They are, however, quite long (even without the HTML markup, some run to 4,000 words).
On longer articles, I'm getting a timeout. My timeout is already maxed out at 750s, and I've maxed out the CPU as well.
Is there anything else I could do? I'm new to this, so maybe there's something obvious I might be missing. Would the throttle workflow execution option help? I also saw somewhere that it might be possible to combine different workflows.
Thanks a ton for any help! If I can't get this sorted out somehow, this whole thing won't work…
Hi Alexander, great question. Unfortunately the 750 second timeout is a hard limit and it’s not currently possible to extend it any longer.
However, as you mentioned, you can split your workflow into multiple parts. I use this strategy personally when iterating over large numbers of records.
So a potential way to handle this in your case could be:
Workflow A - retrieves the HTML page, splits it into chunks based on `<section>` or `<article>` elements, and calls Workflow B with each HTML chunk and its index
Workflow B - receives a single chunk at a time, translates it, and stores the translated chunk in a Data Store or database at the appropriate index
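To make the split concrete, here's a minimal sketch of what Workflow A's chunking step could look like. This assumes the article's content lives inside `<section>` blocks; `translate_and_store` is a hypothetical stand-in for whatever actually invokes Workflow B (e.g. an HTTP request to its trigger URL):

```python
import re

def split_html_sections(html: str) -> list[str]:
    # Grab each <section>...</section> block; content outside
    # sections is ignored in this simplified sketch.
    return re.findall(r"<section\b.*?</section>", html,
                      flags=re.DOTALL | re.IGNORECASE)

def dispatch_chunks(html: str, translate_and_store) -> None:
    # Hand each chunk, with its index, to Workflow B so the
    # translated pieces can be reassembled in order later.
    for index, chunk in enumerate(split_html_sections(html)):
        translate_and_store(index, chunk)
```

Because each invocation of Workflow B handles only one chunk, no single execution comes anywhere near the 750s limit, and the index lets you stitch the translated sections back together in the right order.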
I hope this gives you some ideas!