Sending data TO Gdrive: 403: Rate Limit Exceeded

It seems like the limit is 750GB, but uploads already in progress are allowed to finish at a throttled speed.

Google has never acknowledged any limits before, so I am not holding my breath.

I asked Google support and they answered:

“In the case that you see that the limit is not reached inside the Developer Console, then you are hitting a drive backend limit. These limits are calculated with an algorithm to protect our system from abuse. I’d suggest implementing an exponential backoff solution as detailed here https://developers.google.com/drive/v3/web/handle-errors#exponential-backoff and experimenting until you can find a balance that suits your app’s needs and the server’s needs.”

and

"Unfortunately, there is no way to find out what the internal limit is. As explained before, this issues are best addressed by Google Engineers in stackoverflow. Here is the official answer provided by Nivco, Google Developer Advocate https://stackoverflow.com/questions/10311969/what-is-the-limit-on-google-drive-api-usage."

So has anyone tried uploading more than around 800GB using the Google Drive web page?
Curious if this also results in a temp upload ban.

I would try it, but my crap UK upload speeds would probably not be able to upload enough to cause a ban.

If this does result in a ban, we could go back to Google and say: hey, I was using your web app and it won’t let me upload any more. That way at least they could not palm us off with a load of developer-specific questions.

I tried the Google Drive app; it was still uploading while rclone was giving me 403 errors, but now it has stopped too, and it keeps saying “waiting for drive”.

So I’m pretty sure that has a limit too.

After I had been banned, I tried uploading using the Google Drive website and uploads just failed. So the ban was applied to the account rather than to an API key.

I was just looking for a better angle to approach Google: if you say you are using rclone, they point to that as the issue and ask loads of developer-type questions.

But if you can get banned just using the Google Drive website, then that takes rclone out of the equation, so they can’t point to it as causing the issue.

But tbh, I don’t really think Google will ever explain the limits they arbitrarily impose.

No, they won’t. The 10TB/day download limit has been around for a long time, and there is no official information about it, so don’t expect them to come out and explain this new limit either.

But I wish it would at least show you that you can’t upload. When you get banned for downloads, it shows you in the web UI, but for this one, nope.

I just hit this too… Here are my stats:

Transferred:   680.871 GBytes (50.290 MBytes/s)
Errors:                 1
Checks:                66
Transferred:           66
Elapsed time:   3h51m3.8s

Seeing the limits here as well

Transferred:   770.556 GBytes (47.066 MBytes/s)
Errors:                 0
Checks:                43
Transferred:           43
Elapsed time:  4h39m24.6s

I got the following from Google support:

"Thanks for getting back to me. As I dig deeper into this, I was able to find some internal documented information. I’ll be sharing with you what I found.

There is a bandwidth limitation per viewer and per owner, and a limitation on the number of times a document can be viewed. The limits are 10TB/day and 50,000 views/day, with bursts of up to 900/min (15 QPS) per document. I believe this might be the drive backend limit you are reaching. Also, in June 2017 a quota for creating blobs of 100GB/day was established. It’s possible to create files of a bigger size, but after this quota is exceeded all subsequent blob-creation operations will fail.

That is all the information I was able to get, and I hope it is useful. If you have any other questions about your G Suite account, please reply to this message and I will be happy to follow up with you. In the meantime, the case will remain open."

I am monitoring drive usage in the admin console. In three days I have uploaded 2275.55 GB, which is about 758.5 GB per day. It really looks like 750 GB per day, plus active uploads being allowed to finish once the limit is reached.

Great idea of Google to tell us we can upload files of up to 5TB when the daily upload limit is 750GB.

The only answer Google ever gives me is to implement exponential backoff, which rclone already does (I have told them this)…

Yep, same here:

2017/08/16 09:35:20 INFO  :
Transferred:   752.903 GBytes (54.210 MBytes/s)
Errors:                 6
Checks:                 0
Transferred:          238
Elapsed time:   3h57m1.8s
Transferring:
 *                                    A.ISO: 75% done, 2.868 MBytes/s, ETA: 6m11s
 *                                    B.mp4:  0% done, 0 Bytes/s, ETA: -
 *                                    C.mp4:  0% done, 0 Bytes/s, ETA: -
 *                                    C.rar:  0% done, 0 Bytes/s, ETA: -

2017/08/16 09:35:20 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2017/08/16 09:35:20 DEBUG : RCT835.ISO: Sending chunk 3439329280 length 8388608
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 1.850126629s (1 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 6/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 2.128918395s (2 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 6/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 4.80549695s (3 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 5/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 8.027351749s (4 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 7/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:22 DEBUG : pacer: Resetting sleep to minimum 10ms on success

I can pretty much confirm that the hard cap is 750GB/day.

Yes. And the blob-creation quota is 100GB/day. You can upload a single larger file, but subsequent files will be blocked.

This is what I got from a support guy regarding the upload limitation:

"Currently, there is a hard limit for write operations (create, update, delete) that is lower than your allowed QPS; this is something that cannot be lifted, and raising the per-user limit will not improve the error rate. These QPS limits represent an upper bound on aggregate API calls, and short-term bursts over that limit are allowed. With this, I suggest that you consider slowing down per-user operations and compensating by doing more users in parallel to maximize throughput.

Also, aside from using a Service Account with authority delegation as suggested in this documentation, I would suggest that you consider the following strategies to optimize your app:
Batch API requests - this allows your client to put several API calls into a single HTTP request.
Push notifications - if you want to be updated on changes to a file."
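
For what it’s worth, batching in the Drive v3 Python client looks roughly like this. A minimal sketch, assuming the google-api-python-client and google-auth libraries and OAuth credentials saved in token.json; the file IDs are made up. Note that, as far as I know, batching only covers metadata calls, so it would not reduce the per-chunk requests of an actual upload:

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes OAuth credentials previously saved to token.json.
creds = Credentials.from_authorized_user_file("token.json")
service = build("drive", "v3", credentials=creds)

def on_response(request_id, response, exception):
    # Called once per sub-request when the batch completes.
    if exception is not None:
        print(f"request {request_id} failed: {exception}")
    else:
        print(f"request {request_id}: {response.get('name')}")

# Several metadata calls travel in a single HTTP request.
batch = service.new_batch_http_request(callback=on_response)
for file_id in ("fileId1", "fileId2", "fileId3"):  # hypothetical IDs
    batch.add(service.files().get(fileId=file_id, fields="id,name"))
batch.execute()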

OK, that is interesting. Is it possible to tweak rclone to use fewer API calls, or to bundle them as the support agent suggests?

--drive-chunk-size and/or --drive-upload-cutoff could be useful for that.

Thanks, @tdaniels - but what is the correlation between chunk size / multipart upload and the number of API calls? Do I need to use higher or lower values than the defaults?

I am mostly uploading large files (2-40 GB), so I am thinking a large --drive-chunk-size and a low cutoff?
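
Roughly speaking, each chunk of a resumable upload is one HTTP request, so the per-file call count falls as the chunk size grows. A back-of-the-envelope sketch (the +1 for session initiation is my assumption of how the resumable protocol counts):

import math

def upload_requests(file_size_gb: float, chunk_size_mb: int) -> int:
    # One request per chunk, plus one to start the resumable session.
    chunks = math.ceil(file_size_gb * 1024 / chunk_size_mb)
    return chunks + 1

# For a 40 GB file at various --drive-chunk-size values:
for chunk_mb in (8, 64, 256, 512):
    print(f"{chunk_mb:>4} MB chunks -> {upload_requests(40, chunk_mb)} requests")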

If it isn’t a GB/day ban, I’d speculate that a sufficiently high value could reduce the calls enough to allow you to upload more. I do recall rclone needing as much free memory as the value you set, though. In tandem, one could impose a resting period between file uploads to further increase the chances. The latter isn’t something I believe rclone can do natively, so one would have to script something to that effect (see the sketch below).
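
Something like this hypothetical wrapper, for instance: it uploads one file at a time via rclone and sleeps between files. The directory, remote name, and delay are all made up; the flags are the ones discussed above:

import subprocess
import time
from pathlib import Path

SOURCE_DIR = Path("/data/to-upload")  # assumed local staging directory
REMOTE = "gdrive:backup"              # assumed rclone remote:path
REST_SECONDS = 60                     # resting period between files

for path in sorted(SOURCE_DIR.iterdir()):
    if not path.is_file():
        continue
    # A large chunk size means fewer upload requests per file, at the
    # cost of rclone buffering that much memory per transfer.
    subprocess.run(
        ["rclone", "copy", str(path), REMOTE,
         "--drive-chunk-size", "256M", "--transfers", "1"],
        check=False,  # keep going even if one file hits a 403
    )
    time.sleep(REST_SECONDS)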

@ncw Any opinion on this?

Okay, I will be trying “--drive-chunk-size 512M --checkers 1 --transfers 1” tonight when the ban is lifted.

With 1 transfer, do you really think you’ll get to 750GB?

@Larskl I’m pretty certain it has nothing to do with the API quota. I had mine increased to 20,000 per 100 seconds, and I never get anywhere near that. The highest I’ve ever had was 2,000, and that’s once every few days at most.