Best flags for rclone server-side copy on Google Drive?

What is the problem you are having with rclone?

I set up a script that syncs my drives every day. My drive is fairly big, and it takes a long time for rclone to check all the files.

What is your rclone version (output from rclone version)

rclone v1.53.3

  • os/arch: windows/amd64
  • go version: go1.15.5

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs -v

The rclone config contents with secrets removed.

Default config for both Shared Drives

That really doesn't mean much, as "big" can mean anything to anyone.

Same thing here, as that doesn't tell us much either.

Can you paste what you've got, with the secrets and such removed?

What does the debug log say when it runs? Can you share that? It really is the most useful thing to post, as it will say why.

Sorry for the lack of details! I ran rclone size "Archive:" and got the totals shown in the attached screenshot.

Here's the config file for both remotes:

[Archive]
type = drive
client_id = OMITTED.apps.googleusercontent.com
client_secret = OMITTED
token = {"access_token":"OMITTED","token_type":"Bearer","refresh_token":"OMITTED","expiry":"2020-12-21T18:05:25.3191196-05:00"}
team_drive = OMITTED
root_folder_id = 

[Archive Backup]
type = drive
client_id = OMITTED.apps.googleusercontent.com
client_secret = OMITTED
token = {"access_token":"OMITTED","token_type":"Bearer","refresh_token":"OMITTED","expiry":"2020-12-21T18:38:19.4043651-05:00"}
team_drive = OMITTED
root_folder_id = 

The debug log for rclone sync would contain a ton of my personal file names. Since I sync the drives each day, there is usually only a one- or two-file difference between them. Even when there is no difference at all, the sync takes around 2 hours to finish. I was wondering if there are certain flags that would help reduce the time it takes to check between the two Shared Drives.

hi,

have you tried --fast-list? https://rclone.org/drive/#fast-list
that might prevent this userRateLimitExceeded error
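for example, added to your original command it might look something like this (just a sketch, untested on your setup):

rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs --fast-list -v

--fast-list batches the directory listings into fewer, larger API calls, trading some memory for a much lower request rate.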

Do you see your API key being used in the console as well? Are there hits?

Whenever I run the command rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs -vv, I see this error every ~5 seconds:

2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 1.639426336s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 1.443809885s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 2.846278608s
2020/12/22 10:08:48 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=OMITTED, userRateLimitExceeded)
2020/12/22 10:08:48 DEBUG : pacer: Rate limited, increasing sleep to 2.144931248s
2020/12/22 10:08:48 DEBUG : pacer: Reducing sleep to 0s

Here's a screenshot of my console: https://imgur.com/a/YeSJS0Z

The two almost empty days are the days I canceled the command, and the two recent days where usage is higher than usual are from some tests I ran.

Would ~2 hours be a normal amount of time for rclone sync to check two identical drives?

I tried the --fast-list flag and I don't think it made a difference. It still took around 2 hours.

Are you seeing the credentials being used in the API console, though?

felix@gemini:~$ rclone size GD: -vv --fast-list
2020/12/22 10:23:31 DEBUG : rclone: Version "v1.53.3" starting with parameters ["rclone" "size" "GD:" "-vv" "--fast-list"]
2020/12/22 10:23:31 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2020/12/22 10:23:31 DEBUG : Creating backend with remote "GD:"
2020/12/22 10:24:20 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 1.834962098s
2020/12/22 10:24:20 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 2.999944428s
2020/12/22 10:24:20 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=286927808882, userRateLimitExceeded)
2020/12/22 10:24:20 DEBUG : pacer: Rate limited, increasing sleep to 4.439521088s
2020/12/22 10:24:21 DEBUG : pacer: Reducing sleep to 0s
Total objects: 50053
Total size: 139.133 TBytes (152978614955515 Bytes)
2020/12/22 10:24:24 DEBUG : 18 go routines active

You really don't have that many items, and a few API backoffs are normal; that's just rclone doing a good job of using the API, and it has to back off sometimes.

If they are two different drives and you are syncing between them, you could be hitting the daily upload limits, and that would show in the debug log. If you can share the log, we can see the issue rather than just guessing.
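Something like this would capture a full debug log to a file you can redact and share (a sketch based on your original command; the file name is just an example):

rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs -vv --log-file=sync-debug.log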

What should I look for in the debug log to see if I'm hitting the daily upload limits? The sync still takes 2 hours even when nothing is actually uploaded.

Also, I tried using --fast-list. Is that supposed to save time? The check time is exactly the same for me with and without the flag.

Please include a full debug log, as that's what we need to analyze what's going on.

No, the quota you are looking at is just for your API usage. Upload/download quotas are not accessible, nor is much published on what they specifically are (other than the 750 GB daily upload limit).

if you run the script once a day, then you could add a flag like --max-age=1d
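for example, combined with the original command it might look like this (a sketch; be aware that if a daily run is ever skipped, changes older than a day would never get picked up):

rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs --fast-list --max-age=1d -v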

@asdffdsa & @Animosity022

I ran the sync with --fast-list again, and this time it only took 17 minutes, so I guess it works now. I'm not sure why; maybe I was testing too much yesterday. Thanks for all your help!

maybe try increasing --checkers
https://rclone.org/drive/#fast-list
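for example (the default is --checkers=8, so 16 here is just an illustration):

rclone sync "Archive:" "Archive Backup:" --drive-server-side-across-configs --fast-list --checkers=16 -v

keep in mind that more checkers means more concurrent API calls, which can bring the rate-limit errors back, so it may need tuning.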

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.