"How Can I Successfully Upsert 2000 Rows into a Google Sheet Using Pipedream?"

This topic was automatically generated from Slack. You can find the original thread here.

Hey all, I'm having a really tough time getting Pipedream to upsert around 2000 rows into a Google Sheet. There are timeouts, server errors, and more. I've adjusted the workflow settings so that it should effectively send fewer than 60 requests per minute (which by itself will take forever).

Are the timeouts on the Google Sheets side? We have seen lots of issues with Google rate limits, but your <60 per minute should be under their limits.

Yes.

{"error":{"code":429,"message":"Quota exceeded for quota metric 'Write requests' and limit 'Write requests per minute per user' of service '[sheets.googleapis.com](http://sheets.googleapis.com)' for consumer 'project_number:342853642011'.","status":"RESOURCE_EXHAUSTED","details":[{"@type":"[type.googleapis.com/google.rpc.ErrorInfo](http://type.googleapis.com/google.rpc.ErrorInfo)","reason":"RATE_LIMIT_EXCEEDED","domain":"[googleapis.com](http://googleapis.com)","metadata":{"consumer":"projects/342853642011","service":"[sheets.googleapis.com](http://sheets.googleapis.com)","quota_limit_value":"60","quota_limit":"WriteRequestsPerMinutePerUser","quota_metric":"[sheets.googleapis.com/write_requests](http://sheets.googleapis.com/write_requests)","quota_location":"global"}},{"@type":"[type.googleapis.com/google.rpc.Help](http://type.googleapis.com/google.rpc.Help)","links":[{"description":"Request a higher quota limit.","url":"https://cloud.google.com/docs/quota#requesting_higher_quota"}]}]}} 

That was the error

Yes, it sounds like you are exceeding their rate limits.
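If a few requests still slip over the limit even after throttling, Google's general guidance for a 429 `RESOURCE_EXHAUSTED` response is to retry with exponential backoff. A minimal sketch in Node.js (the `withBackoff` helper is illustrative, not a Pipedream or Google built-in):

```typescript
// Retry a Sheets call with exponential backoff on 429s.
// `writeFn` is a placeholder for whatever call is hitting the quota.
async function withBackoff<T>(writeFn: () => Promise<T>, maxRetries = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await writeFn();
    } catch (err: any) {
      const status = err?.code ?? err?.response?.status;
      if (status !== 429 || attempt >= maxRetries) throw err;
      // Wait 1s, 2s, 4s, ... plus jitter before retrying.
      const delayMs = 2 ** attempt * 1000 + Math.random() * 1000;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```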

In the workflow settings I did the following: changed the timeout to 180 seconds and set the Execution Throttle to a limit of 50 every 1 minute.

Google's usage limits page details the request limits; the relevant one here is 60 write requests per minute per user (matching the error above). Provided that you stay within the per-minute quotas, there's no limit to the number of requests you can make per day.

Limit concurrency: 5 workers and a queue size of 999.

I looked at the same page, which is how I came up with 50

So I had a little bit of a safety net.

Sorry, the queue size was 9999.

You want to throttle workflow executions to under their limit if your workflow is hitting the Google API once per execution.

I had done that by setting it to 50, which should be 10 under the limit.

Maybe I will change it to 25 every minute.

And make sure your workflow only hits them once.
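For example, if all the rows are available up front, a single batched append counts as one write request against the quota no matter how many rows it carries. A rough sketch with the googleapis Node client (the spreadsheet ID, sheet name, and shape of `rows` are assumptions for illustration):

```typescript
import { google } from "googleapis";

// One values.append call writes every row but counts as a single
// write request against the 60-per-minute-per-user quota.
async function appendAllRows(auth: any, spreadsheetId: string, rows: string[][]) {
  const sheets = google.sheets({ version: "v4", auth });
  await sheets.spreadsheets.values.append({
    spreadsheetId,
    range: "Sheet1!A1",            // assumed sheet/range for illustration
    valueInputOption: "RAW",
    requestBody: { values: rows }, // e.g. [["key1", "a"], ["key2", "b"]]
  });
}
```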

And change the worker concurrency to 1 so it will go extra slow.

It is using the upsert action, which I think will do a read and then the write.
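If it works like a manual upsert would, that's one read plus one write per row, and the two draw from separate quota pools (the error above was specifically on the write-requests metric), so 30 executions per minute is roughly 30 reads plus 30 writes. A rough sketch of that pattern with the googleapis Node client (the key column and ranges are assumptions, not the action's actual internals):

```typescript
import { google } from "googleapis";

async function upsertRow(auth: any, spreadsheetId: string, row: string[]) {
  const sheets = google.sheets({ version: "v4", auth });

  // One read request: fetch the key column to see if the row exists.
  const res = await sheets.spreadsheets.values.get({
    spreadsheetId,
    range: "Sheet1!A:A", // assume column A holds the unique key
  });
  const keys = (res.data.values ?? []).map((r) => r[0]);
  const idx = keys.indexOf(row[0]);

  // One write request: update the row in place, or append a new one.
  if (idx >= 0) {
    await sheets.spreadsheets.values.update({
      spreadsheetId,
      range: `Sheet1!A${idx + 1}`,
      valueInputOption: "RAW",
      requestBody: { values: [row] },
    });
  } else {
    await sheets.spreadsheets.values.append({
      spreadsheetId,
      range: "Sheet1!A1",
      valueInputOption: "RAW",
      requestBody: { values: [row] },
    });
  }
}
```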

OK, I am going to give it a try at 30 every minute, with a worker concurrency of 1.

Should only take an hour and change to run (2000 rows at 30 per minute is about 67 minutes).