The debug log for rclone sync would contain a ton of my personal files. Since I sync the drives each day, there is usually only a 1- or 2-file difference between them. Even when there is no difference at all, the sync takes around 2 hours to finish. I was wondering if there are certain flags that would help reduce the time it takes to check between the two shared drives.
Whenever I use the command rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs -vv, every ~5 seconds I see this error:
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 1.639426336s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 1.443809885s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 2.846278608s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 2.144931248s
2020/12/22 10:08:48 DEBUG : pacer: Reducing sleep to 0s
The two almost-empty days are because I canceled the command on those days, and the two recent days that are higher than usual are because I ran some tests.
Is ~2 hours a normal amount of time for rclone sync to check two identical drives?
Are you seeing the credentials being used in the API console, though?
felix@gemini:~$ rclone size GD: -vv --fast-list
2020/12/22 10:23:31 DEBUG : rclone: Version "v1.53.3" starting with parameters ["rclone" "size" "GD:" "-vv" "--fast-list"]
2020/12/22 10:23:31 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2020/12/22 10:23:31 DEBUG : Creating backend with remote "GD:"
2020/12/22 10:24:20 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 1.834962098s
2020/12/22 10:24:20 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 2.999944428s
2020/12/22 10:24:20 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 4.439521088s
2020/12/22 10:24:21 DEBUG : pacer: Reducing sleep to 0s
Total objects: 50053
Total size: 139.133 TBytes (152978614955515 Bytes)
2020/12/22 10:24:24 DEBUG : 18 go routines active
You really don't have that many items, and a few API backoffs are normal; that's just rclone doing a good job of using the API, and it has to back off sometimes.
If they are two different drives and you are syncing, you could be hitting the daily upload limits, and that would show in the debug log. If you can share the log, we can see the issue rather than just guessing.
What should I look for in the debug log to see if I'm hitting the daily upload limits? The sync still takes ~2 hours even when nothing is actually uploaded.
I also tried using --fast-list. Is that supposed to save time? The check time is exactly the same for me with and without the flag.
Please include a full debug log, as that's what we need to analyze what's going on.
No, the quota you are looking at is just for your API usage. Upload/download quotas are not accessible, nor is much published on what they specifically are (apart from the 750 GB daily upload limit).
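As a rough way to answer "what should I look for", you can grep a saved debug log for the rate-limit markers. This is only a sketch: the log path is an example, and the two sample lines below are stand-ins for a real log (modeled on the output pasted above), not guaranteed to match Google's exact wording for every limit.

```shell
# Example log path (an assumption, not from the thread):
LOG=/tmp/rclone-debug.log

# Sample lines standing in for a real debug log, for illustration only:
cat > "$LOG" <<'EOF'
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. ... userRateLimitExceeded)
2020/12/22 11:00:00 ERROR : file.bin: Failed to copy: googleapi: Error 403: ... userRateLimitExceeded
EOF

# Pacer 403s are harmless backoff; ERROR lines are actual failures:
grep -c 'userRateLimitExceeded' "$LOG"   # counts all 403 rate-limit mentions
grep -c 'ERROR' "$LOG"                   # counts hard failures, not just retries
```

The useful distinction is DEBUG-level pacer retries (rclone slowing itself down, which is normal) versus ERROR-level failed transfers, which would suggest a real quota problem.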
I ran the sync with --fast-list again, and this time it only took 17 minutes, so I guess it works now. I'm not sure why; maybe I was testing too much yesterday. Thanks for all your help!
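For anyone finding this thread later, a sketch of the full invocation under discussion, using the remote names from above (note the corrected spelling of the server-side flag, and the --log-file flag added here so a debug log can be shared without dumping it to the terminal):

```shell
# Sketch only; "Archive:" and "Archive Backup:" are the remotes from this thread.
rclone sync "Archive:" "Archive Backup:" \
  --fast-list \
  --drive-server-side-across-configs \
  -vv --log-file=/tmp/rclone-sync.log
```

--fast-list trades memory for fewer listing calls, which is what cut the check time from ~2 hours to ~17 minutes here.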