Can't use a GDrive remote: it hits rate limits even though the project's quotas say usage is OK

What is the problem you are having with rclone?

One crypt remote that uses a GDrive backend randomly fails to sync (sometimes it fails, sometimes it doesn't) with User Rate Limit Exceeded, even though I've checked the project quotas and usage is well within the limits; it isn't even close to reaching them. The source doesn't even have a lot of files: only about 120K, and I'm using only 25 checkers and 6 transfers for the job. For comparison, I have another sync job with 350K files that uses 30 checkers and 8 transfers, and it doesn't fail.

Even so, the weirdest thing is that rclone tells me:

Error 403: User Rate Limit Exceeded.

And, as I said, when I check the quotas on the project, usage isn't even close to the limits.
I also tried browsing the crypt remote, and I can do that without problems.

What is your rclone version (output from rclone version)

1.55.1

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 x64

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync DIRECTORY "CRYPT:DIRECTORY" --delete-after --checksum --verbose --no-update-modtime --transfers 6 --checkers 25 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --delete-excluded --exclude "/$RECYCLE.BIN/**" --exclude "/System Volume Information/**" --create-empty-src-dirs --backup-dir "CRYPT:BACKUP\DIRECTORY" --links --track-renames --track-renames-strategy modtime --drive-chunk-size 256M --stats 1s --stats-file-name-length 0 --fast-list

The rclone config contents with secrets removed.

[DRIVE]
type = drive
client_id = REDACTED
client_secret = REDACTED
scope = drive
token = REDACTED

[CRYPT]
type = crypt
remote = DRIVE:DIR
filename_encryption = standard
directory_name_encryption = true
password = REDACTED
password2 = REDACTED

A log from the command with the -vv flag

2021/06/01 23:07:40 INFO  :
Transferred:   	         0 / 0 Bytes, -, 0 Bytes/s, ETA -
Elapsed time:      7m40.4s

2021/06/01 23:07:41 ERROR : : error reading destination directory: couldn't list directory: googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXX, userRateLimitExceeded
2021/06/01 23:07:41 INFO  : Encrypted drive 'CRYPT:DIRECTORY': Making map for --track-renames
2021/06/01 23:07:41 INFO  : Encrypted drive 'CRYPT:DIRECTORY': Finished making map for --track-renames
2021/06/01 23:07:41 ERROR : Encrypted drive 'CRYPT:DIRECTORY': not deleting files as there were IO errors
2021/06/01 23:07:41 ERROR : Encrypted drive 'CRYPT:DIRECTORY': not deleting directories as there were IO errors
2021/06/01 23:07:41 ERROR : Attempt 3/3 failed with 1 errors and: couldn't list directory: googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded
2021/06/01 23:07:41 INFO  :
Transferred:   	         0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 1 (retrying may help)
Elapsed time:      7m41.4s

2021/06/01 23:07:41 Failed to sync: couldn't list directory: googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded

I tried another run with -vv, but I got the rate limit exceeded error right at the beginning:

2021/06/01 23:25:02 DEBUG : Creating backend with remote "E:\\"
2021/06/01 23:25:02 DEBUG : local: detected overridden config - adding "{b6816}" suffix to name
2021/06/01 23:25:02 DEBUG : fs cache: renaming cache item "E:\\" to be canonical "local{b6816}://?/E:/"
2021/06/01 23:25:02 DEBUG : Creating backend with remote "CRYPT:DIRECTORY"
2021/06/01 23:25:02 DEBUG : Creating backend with remote "DRIVE:DIR/REDACTED"
2021/06/01 23:25:02 DEBUG : DRIVE: detected overridden config - adding "{7NrfJ}" suffix to name
2021/06/01 23:25:02 DEBUG : Google drive root 'DIR/REDACTED': root_folder_id = "REDACTED" - save this in the config to speed up startup
2021/06/01 23:25:03 DEBUG : fs cache: renaming cache item "DRIVE:DIR/REDACTED" to be canonical "DRIVE{7NrfJ}:DIR/REDACTED"
2021/06/01 23:25:03 DEBUG : fs cache: switching user supplied name "DRIVE:DIR/REDACTED" for canonical name "DRIVE{7NrfJ}:DIR/REDACTED"
2021/06/01 23:25:03 DEBUG : Creating backend with remote "CRYPT:BACKUP\\DIRECTORY"
2021/06/01 23:25:03 DEBUG : Creating backend with remote "DRIVE:DIR/REDACTED/REDACTED"
2021/06/01 23:25:03 DEBUG : DRIVE: detected overridden config - adding "{7NrfJ}" suffix to name
2021/06/01 23:25:03 DEBUG : Google drive root 'DIR/REDACTED/REDACTED': root_folder_id = "REDACTED" - save this in the config to speed up startup
2021/06/01 23:25:04 DEBUG : fs cache: renaming cache item "DRIVE:DIR/REDACTED/REDACTED" to be canonical "DRIVE{7NrfJ}:DIR/REDACTED/REDACTED"
2021/06/01 23:25:04 DEBUG : fs cache: switching user supplied name "DRIVE:DIR/REDACTED/REDACTED" for canonical name "DRIVE{7NrfJ}:DIR/REDACTED/REDACTED"
2021/06/01 23:25:04 DEBUG : fs cache: renaming cache item "CRYPT:BACKUP\\DIRECTORY" to be canonical "CRYPT:BACKUP/DIRECTORY"
2021/06/01 23:25:04 DEBUG : $RECYCLE.BIN: Excluded
2021/06/01 23:25:04 DEBUG : System Volume Information: Excluded
2021/06/01 23:25:05 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded)
2021/06/01 23:25:05 DEBUG : pacer: Rate limited, increasing sleep to 1.640449085s
2021/06/01 23:25:05 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded)
2021/06/01 23:25:05 DEBUG : pacer: Rate limited, increasing sleep to 2.87372753s
2021/06/01 23:25:05 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded)
2021/06/01 23:25:05 DEBUG : pacer: Rate limited, increasing sleep to 4.349705105s
...
...
...
2021/06/01 23:26:07 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded)
2021/06/01 23:26:07 DEBUG : pacer: Rate limited, increasing sleep to 16.193273319s
2021/06/01 23:26:07 DEBUG : pacer: Reducing sleep to 0s
2021/06/01 23:26:08 DEBUG : pacer: low level retry 9/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXXX, userRateLimitExceeded)
2021/06/01 23:26:08 DEBUG : pacer: Rate limited, increasing sleep to 1.297397804s
2021/06/01 23:26:08 DEBUG : pacer: Reducing sleep to 0s

Picture of the quotas:

[screenshot: project quota page]

You should use the defaults, as you are hitting too many transactions per second: you can only make about 10 per second, and you have 6 transfers and 25 checkers.
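Concretely, "use the defaults" would look something like the sketch below. The flag values are illustrative starting points, not documented Drive limits; `--tpslimit`/`--tpslimit-burst` are real rclone flags that add an explicit request-rate cap on top of the lowered concurrency:

```shell
# Retry the same sync at rclone's default concurrency (4 transfers,
# 8 checkers) and cap the overall request rate as a safety net.
# The cap of 10/s is a conservative guess to start from; raise it
# gradually if the 403s stop.
rclone sync DIRECTORY "CRYPT:DIRECTORY" \
  --transfers 4 --checkers 8 \
  --tpslimit 10 --tpslimit-burst 10 \
  --checksum --fast-list --verbose
```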

Can you tell me where I can read about the per-second limit? All I see is the limit per 100 seconds, and I'm far from that. Also, as I said, I have other sync jobs with more than double the file count and more transfers and checkers, and they don't fail. And yes, I've set up different projects for each job, so they are not using the same credentials.

If you are getting 403s, that's Google telling you to slow down, and rclone will retry.

There isn't an exact per-second item, unfortunately; you only have what's on your quota page to look at.

You are either:

  • not using your own API key (which seems unlikely)
  • using a shared one that's hitting a limit
  • using values that are configured too high
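The first two possibilities can be checked from the command line; a quick sketch, using the remote name `DRIVE` from the config above (`rclone config userinfo` works for backends that support it, which Drive does as far as I know):

```shell
# Show the remote's stored settings; an empty client_id means you are
# on rclone's shared key, which is heavily contended.
rclone config show DRIVE

# Print info about the Google account the stored token belongs to,
# to confirm the key/user combination is the one you expect.
rclone config userinfo DRIVE:
```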

There isn't a single answer, unfortunately. It looks like Google has upped the numbers for things, as my old screenshot showed:

[screenshot: older quota limits]

That is an older screenshot, which is why I said ~10 per second.

My new one looks like yours.

[screenshot: current quota limits]

That seems to be 100 per second. If you are getting throttled, though, the per-second number versus the per-100-seconds number might be something different; since it isn't documented, it would take some trial and error.
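For what it's worth, the relationship between the two figures on the quota page is just an average; with round numbers like the ones in the newer screenshot (hypothetical figures for illustration):

```shell
# The console shows a "Queries per 100 seconds per user" quota; dividing
# by 100 gives the sustained average queries per second it allows.
per_100s=10000
avg_qps=$((per_100s / 100))
echo "$avg_qps"   # sustained average QPS
```

Short bursts above that average may still be allowed, which is why throttling can look inconsistent from run to run.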

And thanks for asking that question, as for mounts we have a 10 TPS default setting which may need to be changed. I auto-corrected your screenshot in my head to 10 TPS and only noticed after you replied that it was 10,000, not 1,000!

I made a new post to see if we want to change the default values.
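For reference, that 10 TPS default corresponds to the drive backend's pacer flags, which can be tuned per job. A sketch with the documented defaults spelled out (these are the defaults, not tuned recommendations):

```shell
# The drive backend sleeps --drive-pacer-min-sleep between API calls
# (default 100ms, i.e. roughly 10 transactions/second sustained) and
# allows bursts of up to --drive-pacer-burst calls (default 100).
rclone sync DIRECTORY "CRYPT:DIRECTORY" \
  --drive-pacer-min-sleep 100ms \
  --drive-pacer-burst 100 \
  --verbose
```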

You are either:

  • not using your own API key (which seems unlikely)
  • using a shared one that's hitting a limit
  • using values that are configured too high

Yes, I'm using my own API key.
Nope, the job has a unique key (all my jobs have unique API keys with different projects); it isn't shared with any other job. I've double-checked the rclone config file.
As I said, I have other remotes with higher values and higher file counts, and they have never failed. That's why I find this weird: why doesn't rclone/Google show these warnings for the other jobs when, as far as I can tell, they make more requests than this one?

I have tried lowering the transfers and checkers, but I still don't get good results: the job sometimes fails and sometimes doesn't. Shouldn't it either always fail or always succeed?
I mean, unless I'm changing a massive number of files between runs, the job's result should be consistent, right? At most, fewer than 100 files change on the drive between runs.
I guess I'll have to create another project and credential to see if that works, but I can't think of a reason why this API key would behave like this.
Couldn't it be that rclone is behaving oddly with this remote or local path? Maybe something is failing and making rclone issue more requests than it should, or something like that. I don't know, I'm just guessing here; sorry if that sounds silly.
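One way to test that guess, just as a generic diagnostic, is to count the HTTP requests rclone actually issues during a run:

```shell
# --dump headers logs every HTTP request at DEBUG level; counting the
# "HTTP REQUEST" markers gives a rough total of API calls for the job.
# Comparing the count for this job against one of the jobs that never
# fails would show whether it really makes disproportionately many calls.
rclone sync DIRECTORY "CRYPT:DIRECTORY" -vv --dump headers 2>&1 \
  | grep -c "HTTP REQUEST"
```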

There's really no magic here: if the API is telling you to back off, rclone respects that.

You can always check with Google Support, I'd imagine, and perhaps they can shed more insight or validate that you are indeed using the right user name/key combination.

My understanding is that the API is limited per user, as noted by the middle number, so if the user goes above that, it does not matter how many different API key combos you've made.

Yeah, you're right, I should ask Google Support about this. I'll see if it happens again; I made a new project and API key, and so far it hasn't failed. Knock on wood.

Thanks for the help.
