What settings should I use when using cryptcheck on Google Drive?

I have a small script that uploads files to GDrive every so often. These files are big (usually in the 500MB-1GB range), and I never upload more than 100 files at a time. When I’m done uploading these files, I like to run cryptcheck to make sure everything’s OK.

The uploading script:

rclone -vv --transfers 2 --exclude "*.DS_Store" copy __adf AibanezKautschMe_ADF_Crypt:

The cryptcheck script:

rclone -vv --exclude "*.DS_Store" cryptcheck __adf AibanezKautschMe_ADF_Crypt: --one-way

This cryptcheck, however, returns lots of userRateLimitExceeded errors.

Sample of my log:

2019/02/13 23:03:02 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:02 DEBUG : pacer: Rate limited, sleeping for 1.2115473s (1 consecutive low level retries)
2019/02/13 23:03:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:02 DEBUG : pacer: Rate limited, sleeping for 2.14432588s (2 consecutive low level retries)
2019/02/13 23:03:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:02 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:02 DEBUG : pacer: Rate limited, sleeping for 1.064390113s (1 consecutive low level retries)
2019/02/13 23:03:02 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:04 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:05 DEBUG : pacer: Rate limited, sleeping for 1.509624647s (1 consecutive low level retries)
2019/02/13 23:03:05 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:05 DEBUG : pacer: Rate limited, sleeping for 2.029577887s (2 consecutive low level retries)
2019/02/13 23:03:05 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:05 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:06 DEBUG : pacer: Rate limited, sleeping for 1.559517925s (1 consecutive low level retries)
2019/02/13 23:03:06 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:06 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:08 DEBUG : pacer: Rate limited, sleeping for 1.679226397s (1 consecutive low level retries)
2019/02/13 23:03:08 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:08 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:10 DEBUG : pacer: Rate limited, sleeping for 1.81853171s (1 consecutive low level retries)
2019/02/13 23:03:10 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:10 DEBUG : pacer: Rate limited, sleeping for 2.32005606s (2 consecutive low level retries)
2019/02/13 23:03:10 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:10 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:10 DEBUG : pacer: Rate limited, sleeping for 1.734302976s (1 consecutive low level retries)
2019/02/13 23:03:10 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:10 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:12 DEBUG : pacer: Rate limited, sleeping for 1.605684977s (1 consecutive low level retries)
2019/02/13 23:03:12 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:12 DEBUG : pacer: Resetting sleep to minimum 10ms on success
2019/02/13 23:03:15 DEBUG : pacer: Rate limited, sleeping for 1.480390279s (1 consecutive low level retries)
2019/02/13 23:03:15 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=154502612287, userRateLimitExceeded)
2019/02/13 23:03:15 DEBUG : pacer: Resetting sleep to minimum 10ms on success

And I think this is taking too long to verify the files. It should only be verifying fewer than 100 files right now (87, to be exact), and I’d hoped that would be relatively quick since they are big files and what not, but it’s been running for 33 minutes now. The userRateLimitExceeded error prints a lot, and it has been sitting at 87/87 files verified for a very long time now. So yeah, for whatever reason the last file is causing some kind of bottleneck.

I searched around for this problem and I can only find GitHub issues about it, which suggest the problem should have been solved in rclone itself, but it’s still happening a lot to me.

I am using my own API key for this, so I know this is not a problem with the key used by default.

So what should I limit? Is there a recommended number of checkers for GDrive, or anything similar to that?

EDIT: I tried setting it to four checkers. The problem persists. I have no clue what settings I could use to make it check 87 big files in a reasonable amount of time. And the worst part is that this is just the beginning: later I want to check much bigger directories that include both big and small files.

EDIT: Lowered it to two checkers and I no longer see the rate exceeded messages. It may be a little too slow now, but I guess time will tell.
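For reference, this is roughly what my cryptcheck command looks like now with the lower checker count (same remote as above):

rclone -vv --checkers 2 --exclude "*.DS_Store" cryptcheck __adf AibanezKautschMe_ADF_Crypt: --one-way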

You can’t get more than 10 transactions per second, so you have to find the sweet spot for what you are doing.

That counts for all your traffic.
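If you want to cap it explicitly rather than rely on the pacer backing off, rclone also has a --tpslimit flag that limits HTTP transactions per second for the whole run. Something along these lines might work (the exact values are just a starting point, not a recommendation):

rclone -vv --tpslimit 10 --checkers 4 --exclude "*.DS_Store" cryptcheck __adf AibanezKautschMe_ADF_Crypt: --one-way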

So I have to modify transfers even when I am not explicitly downloading or uploading anything?

You’d probably want to turn down checkers with cryptcheck, I would think.

Yeah, I lowered the checkers to two in my last test and I no longer see the userRateLimitExceeded errors, but it’s taking more or less the same amount of time.

What I have noticed is that it seems to be related to the number of folders and files it’s checking. I have many folders (probably about 60 right now) and most of them have only 2 files each. A few exceptional ones have 4, and those are verified more quickly.

You’d want to move the numbers around to find the best bang for the buck. A retry here or there wouldn’t be bad, but a lot of retries will make it slower.

Rclone 1.46 may work better as it uses an internal rate limiter that tries to match Google’s - can you try that?
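If you still see 403s on 1.46, that release also added pacer tuning flags to the drive backend (if I remember right) that you could experiment with, e.g. making the minimum sleep between API calls a bit longer than the default:

rclone -vv --checkers 2 --drive-pacer-min-sleep 200ms --exclude "*.DS_Store" cryptcheck __adf AibanezKautschMe_ADF_Crypt: --one-way

Treat the 200ms as a guess to try, not a tested value.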

I’ll upgrade rclone as soon as my current task finishes and do a cryptcheck with it.