Can't copy files from a shared Google Drive to local storage when using the --fast-list option. I'm getting excessive userRateLimitExceeded errors. I've attached the logs for comparison. We are using our own service account and we are definitely not hitting any quotas.
Run the command 'rclone version' and share the full output of the command.
rclone v1.58.0
os/version: debian 10.12 (64 bit)
os/kernel: 5.4.0-0.bpo.2-cloud-amd64 (x86_64)
os/type: linux
os/arch: amd64
go/version: go1.17.8
go/linking: static
go/tags: none
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
Tip: --fast-list isn't always the fastest; it depends on the characteristics of your data. You may see better speed and fewer pacer issues by replacing --fast-list with something like this: --checkers=16 --drive-pacer-min-sleep=10ms
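As a sketch, assuming a remote named gdrive: and example paths (both placeholders, not from the thread), the suggested flags would be used like this:

```shell
# Drop --fast-list and tune checkers/pacer instead.
# "gdrive:shared-folder" and "/local/dest" are hypothetical placeholders.
rclone copy gdrive:shared-folder /local/dest \
  --checkers=16 \
  --drive-pacer-min-sleep=10ms \
  --progress
```

The idea is that --checkers controls how many file checks run in parallel, while --drive-pacer-min-sleep sets the minimum delay between Drive API calls, so together they let you tune API request rate against your quota.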
The problem is that when I use the --fast-list option there are only errors about quota; nothing is being copied from that share even if I leave it for hours. It happens with every shared drive I try. It worked earlier without a problem.
Here are 403s after I've started the rclone with --fast-list :
Thank you. I have tested this and it looks promising. I will explore that!
Comparison:
vanilla rclone: ~6h
--checkers=16 --drive-pacer-min-sleep=10ms: ~1.5h
Depends on how you define the problem, as you are hitting a quota issue: rclone is trying to get more API hits in than your quota allows.
@Ole's suggestion of tuning your API hits per second is the best way to address that, as you are having a quota issue because rclone is sometimes too fast.