How to Resolve the "Exceeded Default Timeout for This Workflow" Issue on Pipedream?

This topic was automatically generated from Slack. You can find the original thread here.

:wave: Hello, team! I need help, please. I’m getting “you’ve exceeded the default timeout for this workflow, see Troubleshooting Common Issues.”

Hi, it seems 750 seconds is already the maximum limit, so you can’t increase it. I think you might need to split the file into multiple parts of the same size and process them separately.

Error searching for documents: 57014 canceling statement due to statement timeout null

DETAILS
Timestamp: 24/07/2023, 15:01:40
Duration: 747546ms

Will it help if I increase the DB statement timeout?

No, Pipedream has a hard limit of 15 minutes of execution time, so it won’t work.
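For reference, a statement timeout is usually raised on the database connection itself. Here’s a minimal sketch using the `pg` client (the connection string and query are placeholders); even with a higher database-side timeout, the step is still cut off when it hits Pipedream’s execution limit:

```javascript
import pg from "pg";

export default defineComponent({
  async run({ $ }) {
    const client = new pg.Client({
      connectionString: process.env.DATABASE_URL, // placeholder connection string
      // Raise the per-statement timeout for this session to 10 minutes (in ms)
      statement_timeout: 600000,
    });
    await client.connect();

    // A long-running query would still be terminated by Pipedream's own
    // workflow timeout, regardless of this database-side setting
    const { rows } = await client.query("SELECT count(*) FROM documents");
    await client.end();
    return rows;
  },
});
```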

I receive the CSV URL via a trigger; how would I split it up? Thanks

I think your question is a great case for the Pipedream bot to help with. I’ll pass your question to the Pipedream bot now. Please feel free to ask it any questions.

To split the CSV file from the URL, you can follow these steps in your Pipedream workflow:

1. Use an HTTP action to fetch the CSV content from the URL.
2. Parse the CSV content and split it into smaller parts.
3. Process each part separately.

Here’s an example of how to achieve this:

```javascript
import { axios } from "@pipedream/platform";
import { parse } from "papaparse";

export default defineComponent({
  props: {
    csvUrl: {
      type: "string",
      label: "CSV URL",
    },
  },
  async run({ $ }) {
    // Fetch CSV content from the URL
    const response = await axios($, {
      method: "GET",
      url: this.csvUrl,
    });

    // Parse the CSV content
    const parsedCsv = parse(response, {
      header: true,
    });

    // Split the CSV data into smaller parts (e.g., 100 rows per part)
    const chunkSize = 100;
    const chunks = [];
    for (let i = 0; i < parsedCsv.data.length; i += chunkSize) {
      chunks.push(parsedCsv.data.slice(i, i + chunkSize));
    }

    // Process each part separately (e.g., send each part to another action or API)
    // Replace this with your desired processing logic
    for (const chunk of chunks) {
      console.log("Processing chunk:", chunk);
    }

    return chunks;
  },
});
```
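Since the fix described at the end of the thread was to send each chunk as a separate request, here is a rough sketch of that pattern, assuming the step above returns the chunk array and a second workflow exposes an HTTP trigger that processes one chunk per invocation (the step name `split_csv` and the `targetWorkflowUrl` prop are placeholders):

```javascript
import { axios } from "@pipedream/platform";

export default defineComponent({
  props: {
    // Hypothetical: the HTTP trigger URL of a second workflow that handles one chunk per run
    targetWorkflowUrl: {
      type: "string",
      label: "Chunk processor workflow URL",
    },
  },
  async run({ steps, $ }) {
    // Assumes the previous step (here called `split_csv`) returned the chunk array
    const chunks = steps.split_csv.$return_value;

    // POST each chunk to the second workflow so every chunk runs in its own execution,
    // keeping each run well under the workflow timeout
    for (const chunk of chunks) {
      await axios($, {
        method: "POST",
        url: this.targetWorkflowUrl,
        data: { rows: chunk },
      });
    }

    return { forwarded: chunks.length };
  },
});
```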

Are you still having this issue?

All good now, thank you. Fixed by splitting the CSV files and sending separate requests.