Pacer adaptively increases sleep upon 403, then immediately decreases it to 0

What is the problem you are having with rclone?

I'm getting a 403 error (possibly related to pacer-max-sleep?), which the client seems to be correcting automatically, but maybe not as expected, e.g.:

2020-11-09 21:31:15 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2020-11-09 21:31:15 DEBUG : pacer: Rate limited, increasing sleep to 1.091143772s
2020-11-09 21:31:16 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2020-11-09 21:31:16 DEBUG : pacer: Rate limited, increasing sleep to 2.458812398s
2020-11-09 21:31:17 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2020-11-09 21:31:17 DEBUG : pacer: Rate limited, increasing sleep to 4.827000165s
2020-11-09 21:31:20 DEBUG : pacer: Reducing sleep to 0s

What is your rclone version (output from rclone version)

rclone v1.47.0
- os/arch: linux/amd64
- go version: go1.12.4

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Linux amd64

Which cloud storage system are you using?

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy -P -vv --size-only data/ready foo:bar --log-file logs/rclone.log

The rclone config contents with secrets removed.

[foo]
type = drive
client_id = <client id>
client_secret = <client secret>
scope = drive
token = {"access_token":<access token>,"token_type":"Bearer","refresh_token":<refresh token>,"expiry":"2020-11-09 <time>"}

A log from the command with the -vv flag

2020/11/09 21:08:57 DEBUG : rclone: Version "v1.47.0" starting with parameters ["rclone" "copy" "-P" "-vv" "--log-file" "logs/rclone.log" "--size-only" "data/ready" "foo:bar"]
2020/11/09 21:08:57 DEBUG : Using config file from "/home/me/.config/rclone/rclone.conf"
2020/11/09 21:08:58 DEBUG : .sync/ID: Sizes identical
2020/11/09 21:08:58 DEBUG : .sync/ID: Unchanged skipping
2020/11/09 21:08:58 DEBUG : .sync/IgnoreList: Sizes identical
2020/11/09 21:08:58 DEBUG : .sync/IgnoreList: Unchanged skipping
2020/11/09 21:08:58 DEBUG : .sync/StreamsList: Sizes identical
2020/11/09 21:08:58 DEBUG : .sync/StreamsList: Unchanged skipping
2020/11/09 21:08:58 DEBUG : Media/foo/bar.mp4: Sizes identical
2020/11/09 21:08:58 DEBUG : Media/foo/bar.mp4: Unchanged skipping
2020/11/09 21:08:59 DEBUG : Media/.sync/ID: Sizes identical
2020/11/09 21:08:59 DEBUG : Media/.sync/StreamsList: Sizes identical
2020/11/09 21:08:59 DEBUG : Media/.sync/StreamsList: Unchanged skipping
2020/11/09 21:08:59 DEBUG : Media/.sync/ID: Unchanged skipping
2020/11/09 21:08:59 DEBUG : Media/.sync/IgnoreList: Sizes identical
2020/11/09 21:08:59 DEBUG : Media/.sync/IgnoreList: Unchanged skipping
2020/11/09 21:08:59 DEBUG : Media/tutorial/foo.avi: Sizes identical
2020/11/09 21:08:59 DEBUG : Media/tutorial/foo.avi: Unchanged skipping

< about 500 lines of stuff unchanged>

2020/11/09 21:09:01 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2020/11/09 21:09:01 DEBUG : pacer: Rate limited, increasing sleep to 1.219406848s
2020/11/09 21:09:01 DEBUG : pacer: Reducing sleep to 0s

hello and welcome to the forum.

that version of rclone is 18 months old.
you should update and test again
https://rclone.org/downloads/#script-download-and-install

Thanks @asdffdsa, I downloaded and ran the v1.53 binary from ~/ on my VPS and still got the same behavior. Although I haven't had time to fully read the documentation, the following very conservative settings worked and seem to avoid the 403s:

~/sbin/rclone copy -P -vv --size-only --tpslimit=3 --transfers=1 --checkers=1 ~/stuff foo:bar

I'd appreciate any additional comments on how these and other settings are typically optimized when uploading to gdrive. Note that I've never had this issue when downloading.

ok. always good to update first, then test.

as for the best settings, we have many gdrive experts who should be able to comment.

This is a normal error and expected behavior. You can try adding the --tpslimit 10 argument to your command and see if that helps, but basically this is standard behavior. Search this forum for userRateLimitExceeded and you will find a bunch of threads on it.
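For context, the pattern in the log (sleep roughly doubling with some jitter after each 403, then dropping back once calls succeed) is standard adaptive backoff. Here is a minimal Python sketch of that idea; this is an illustration only, not rclone's actual pacer implementation, and the constants (seed of 1s, jitter range, cap) are assumptions:

```python
import random


class Pacer:
    """Toy adaptive pacer, loosely modeled on the log above:
    the sleep interval roughly doubles (with jitter) after each
    rate-limit error and decays back toward the minimum after
    successful calls."""

    def __init__(self, min_sleep=0.0, max_sleep=16.0):
        self.min_sleep = min_sleep
        self.max_sleep = max_sleep
        self.sleep = min_sleep  # current delay before the next call, in seconds

    def on_rate_limited(self):
        # Seed from 1s if we are not currently sleeping, then scale by
        # 2x with +/-25% jitter, capped at max_sleep.
        base = self.sleep if self.sleep > 0 else 1.0
        self.sleep = min(base * 2 * (0.75 + random.random() / 2), self.max_sleep)

    def on_success(self):
        # Halve the delay on each success, never going below the minimum,
        # so the pacer recovers quickly once the rate limit clears.
        self.sleep = max(self.sleep / 2, self.min_sleep)
```

With a min_sleep of 0, a few successful calls bring the delay back to (near) zero, which matches the "Reducing sleep to 0s" lines in the log.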

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.