This topic was automatically generated from Slack. You can find the original thread here.
Running into a sudden issue with the “S3 - Stream file to S3 from URL” workflow action. It started, suspiciously, on August 1st, after the workflow had been running for over a year without issues…
I have a Zoom URL coming from a previous step (format {{steps.trigger.event.download_url}}?access_token={{steps.trigger.event.download_token}}), and I am now getting an error:
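For context, the template above just appends the download token as a query parameter to the recording's download URL. A minimal sketch of the same construction in a Node code step (function name and sample values are illustrative, not from the workflow):

```javascript
// Build the authenticated Zoom download URL, mirroring the template
// {{steps.trigger.event.download_url}}?access_token={{steps.trigger.event.download_token}}
function buildDownloadUrl(downloadUrl, downloadToken) {
  const url = new URL(downloadUrl);
  url.searchParams.set("access_token", downloadToken);
  return url.toString();
}

console.log(buildDownloadUrl("https://zoom.us/rec/download/abc123", "tok_456"));
// → https://zoom.us/rec/download/abc123?access_token=tok_456
```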
InvalidChunkSizeError
Only the last chunk is allowed to have a size less than 8192 bytes
Important to note: this workflow uploads to two places, S3 and Google Drive, both using the same file URL format. Google Drive is still working correctly, but the S3 upload started failing on August 1st, so I suspect something changed on that side, but I do not know for certain.
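The error message suggests the S3 action splits the incoming stream into fixed-size chunks and expects every chunk except the last to be exactly 8192 bytes. A minimal sketch of that invariant (the 8192 figure comes from the error text; the chunking logic here is illustrative, not Pipedream's actual implementation):

```javascript
const CHUNK_SIZE = 8192; // size named in the InvalidChunkSizeError

// Split a buffer into fixed-size chunks; only the final chunk may be short.
function chunk(buf, size = CHUNK_SIZE) {
  const chunks = [];
  for (let i = 0; i < buf.length; i += size) {
    chunks.push(buf.subarray(i, i + size));
  }
  return chunks;
}

// Check the rule the uploader appears to enforce: every chunk except
// the last must be exactly `size` bytes.
function validateChunks(chunks, size = CHUNK_SIZE) {
  return chunks.every((c, i) =>
    i === chunks.length - 1 ? c.length <= size : c.length === size
  );
}

const parts = chunk(Buffer.alloc(20000));
console.log(parts.map((c) => c.length)); // [ 8192, 8192, 3616 ]
console.log(validateChunks(parts)); // true
```

If the upstream stream started delivering a short chunk mid-stream (for example, after a change in how the Zoom response is read), this check would fail with exactly the error reported.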
So, I wanted to check with the Pipedream team to see if they were receiving any error reports on this, thank you!
So it looks like there is no option to stream a file to S3 anymore. I do have a code option, but I am curious why the stream option is no longer supported? I just want to make sure I am following best practice.
The workflow sends completed video recordings from Zoom to S3 and Google Drive… the trigger is a completed-recording webhook, so I take the download URL and access token, and was previously able to stream that file straight to S3 without having to download the file first and then upload it, which keeps memory usage low on each run.
I notice this issue occurs in Pipedream sometimes, including when duplicating workflows, and that clearing and re-adding the auth resolves it… sometimes when I have duplicated a workflow, I run it and it does not work, and even though the auth shows as connected, it needs to be cleared and re-added to the workflow step.