Sending data TO Gdrive: 403: Rate Limit Exceeded


#85

I tried the Google Drive app. It was uploading while rclone was giving me the 403 error, but now it has stopped too, and it keeps saying it is waiting for Drive.

So I’m pretty sure that has a limit too.


#86

After I had been banned, I tried uploading using the Google Drive website and the uploads just failed. So the ban was applied to the account rather than to an API key.

I was just looking for a better angle from which to approach Google: if you say you are using rclone, they point to it as the issue and ask loads of developer-type questions.

But if you can get banned just using the Google Drive website, that takes rclone out of the equation, so they can’t point to it as causing the issue.

But to be honest, I don’t think Google will ever explain the limits they arbitrarily impose.


#87

No, they won’t. The 10 TB/day download limit has existed for a long time, yet there is not a single piece of official information about it, so don’t expect them to come out and tell you about this new limit.

But I wish it would at least show you that you can’t upload. When you get banned for downloading, the web UI shows it, but for uploads it doesn’t.


#88

I just hit this too… Here at my stats…

Transferred:   680.871 GBytes (50.290 MBytes/s)
Errors:                 1
Checks:                66
Transferred:           66
Elapsed time:   3h51m3.8s


#89

Seeing the limits here as well

Transferred:   770.556 GBytes (47.066 MBytes/s)
Errors:                 0
Checks:                43
Transferred:           43
Elapsed time:  4h39m24.6s


#90

I got the following from Google support:

"Thanks for getting back to me. As I dig deeper into this, I was able to find some internal documented information. I’ll be sharing with you what I found.

There is a bandwidth limitation per viewer and per owner, and a limitation on the number of times a document can be viewed. The limits are 10TB/day and 50,000 views/day with bursts up to 900/min (15 QPS) per document. I believe this might be the drive back end limit you are reaching. Also, in June 2017 a quota for creating blobs of 100Gb/day was established. It’s possible to create files of bigger size, but after this quota is exceeded all subsequent blob creation operations will fail.

That is all the information I was able to get and I hope it is useful. If you have any other question around your G Suite account, please reply to this message and I will be happy to follow up with you. In the meantime, the case will continue to remain open."

I am monitoring Drive usage in the admin console. In three days I have uploaded 2275.55 GB, i.e. about 758.5 GB per day. It really looks like the limit is 750 GB plus whatever uploads are still active when the limit is reached.


#91

Great idea of Google to tell us we can upload files of up to 5 TB when the daily upload limit is 750 GB.

The only answer Google ever gives me is to implement exponential backoff, which rclone already implements (I already told them)…
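For reference, exponential backoff just means sleeping for increasingly long intervals between retries of a failed call, which is exactly what the rclone pacer lines in the debug logs show. A rough Python sketch of the idea (the parameters here are made up for illustration; rclone’s actual pacer is written in Go):

```python
import time


def backoff_delays(base=1.0, factor=2.0, max_delay=16.0, retries=5):
    """Yield the sleep intervals for exponential backoff:
    base, base*factor, base*factor^2, ... capped at max_delay."""
    delay = base
    for _ in range(retries):
        yield min(delay, max_delay)
        delay *= factor


def call_with_backoff(fn, is_retryable, retries=5):
    """Call fn(); on a retryable error (e.g. an HTTP 403 rate-limit
    response), sleep with exponentially growing delays, then retry."""
    last_exc = None
    for delay in backoff_delays(retries=retries):
        try:
            return fn()
        except Exception as exc:
            if not is_retryable(exc):
                raise
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

Note that backoff only smooths over short bursts of 403s; against a hard daily quota like this one, every retry fails until the quota window resets.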


#92

Yep, same here:

2017/08/16 09:35:20 INFO  :
Transferred:   752.903 GBytes (54.210 MBytes/s)
Errors:                 6
Checks:                 0
Transferred:          238
Elapsed time:   3h57m1.8s
Transferring:
 *                                    A.ISO: 75% done, 2.868 MBytes/s, ETA: 6m11s
 *                                    B.mp4:  0% done, 0 Bytes/s, ETA: -
 *                                    C.mp4:  0% done, 0 Bytes/s, ETA: -
 *                                    C.rar:  0% done, 0 Bytes/s, ETA: -

2017/08/16 09:35:20 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2017/08/16 09:35:20 DEBUG : RCT835.ISO: Sending chunk 3439329280 length 8388608
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 1.850126629s (1 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 6/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 2.128918395s (2 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 6/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 4.80549695s (3 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 5/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:21 DEBUG : pacer: Rate limited, sleeping for 8.027351749s (4 consecutive low level retries)
2017/08/16 09:35:21 DEBUG : pacer: low level retry 7/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
2017/08/16 09:35:22 DEBUG : pacer: Resetting sleep to minimum 10ms on success

This pretty much confirms that the hard cap is 750 GB/day.


#93

Yes. And the max blob size is 100 GB. You can upload larger files, but subsequent files will be blocked.


#94

This is what I got from a support guy regarding the upload limitation:


Currently, there is a hard limit for write operations (create, update, delete) that is lower than your allowed QPS which is something that cannot be lifted and raising per-user limit will not improve the error rate. These QPS limits represent an upper bound on aggregate API calls and short term bursts are allowed over that limit. With this, I suggest that you consider slowing down on per-user operations then compensate by doing more users in parallel to maximize throughput.

Also, aside from using a Service Account with authority delegation as suggested in this documentation, I would suggest that you consider the following strategies to optimize your app:
Batch API requests - allows your client to put several API calls into a single HTTP request.
Push notifications - if you want to be updated on the changes of the file


#95

OK, that is interesting. Is it possible to tweak rclone to use fewer API calls, or to bundle them as the support rep suggests?


#96

--drive-chunk-size and/or --drive-upload-cutoff could be useful for that.


#97

Thanks @tdaniels, but what is the correlation between chunk size / multipart upload and the number of API calls? Do I need to use higher or lower values than the defaults?

I am mostly uploading large files (2–40 GB), so I am thinking a large --drive-chunk-size and a low cutoff?


#98

If it isn’t a GB/day ban, I’d speculate that a sufficiently high value could reduce the number of calls enough to let you upload more. I do recall that rclone needs as much free memory as the chunk size you set, though. In tandem, you could impose a resting period between file uploads to further increase your chances. The latter isn’t something I believe rclone can do natively, so you’d have to script something to that effect.
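A resting period between uploads could be scripted with a thin wrapper, something like this Python sketch (hypothetical: `upload_one` stands in for whatever actually performs the transfer, e.g. a subprocess call running rclone on a single file):

```python
import time


def paced_uploads(files, upload_one, rest_seconds):
    """Upload files one at a time, sleeping between transfers.

    files        -- paths to upload, in order
    upload_one   -- callable that uploads a single path
    rest_seconds -- pause inserted between consecutive uploads
    """
    uploaded = []
    for i, path in enumerate(files):
        upload_one(path)
        uploaded.append(path)
        if i < len(files) - 1:  # no need to rest after the last file
            time.sleep(rest_seconds)
    return uploaded
```

Whether resting actually helps depends on whether the limit counts requests or bytes; if it is a pure bytes-per-day cap, pacing only spreads the same total out.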


#99

@ncw Any opinion on this?


#100

Okay, I will be trying “--drive-chunk-size 512M --checkers 1 --transfers 1” tonight when the ban is lifted.


#101

With 1 transfer, do you really think you’ll get to 750 GB?


#102

@Larskl I’m pretty certain it has nothing to do with the API quota. I had mine increased to 20,000 per 100 seconds and I never get anywhere near that. The highest I’ve ever had was 2,000, and that’s once every few days at most.


#103

Yes @calisro, 9.4 MB/s for 24 hours will hit the cap. You need to use --bwlimit.
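The arithmetic behind that, as a quick Python check (assuming the cap is 750 GiB counted over a 24-hour window):

```python
GiB = 1024 ** 3
MiB = 1024 ** 2
SECONDS_PER_DAY = 24 * 60 * 60


def max_sustained_rate(daily_cap_bytes):
    """Highest constant transfer rate (bytes/s) that stays
    within a fixed daily byte cap."""
    return daily_cap_bytes / SECONDS_PER_DAY


rate_mib = max_sustained_rate(750 * GiB) / MiB  # about 8.9 MiB/s
```

So 9.4 MB/s sustained overshoots the budget, while capping rclone somewhere below ~8.9 MiB/s (e.g. --bwlimit 8.5M) should in principle keep a continuous transfer under the daily limit.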


#104

oppiz2’s reply from Google indicates a separate, more hidden limit: