SA seems not to work

Hey,

I need a little help here. Why do I get userRateLimitExceeded when I should have active service accounts?

What is the problem you are having with rclone?

Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

Run the command 'rclone version' and share the full output of the command.

rclone v1.58.1

  • os/version: ubuntu 21.10 (64 bit)
  • os/kernel: 5.13.0-46-generic (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.17.9
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Gdrive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone move "$FROM" "$Drive01" --min-age 180m --delete-after --log-file=$LOGFILE --drive-service-account-file=/opt/sa/all/$COUNTER.json --max-transfer 700G --config="$HOME"/.config/rclone/rclone.conf --exclude-from="$HOME"/.config/rclone/excludes --drive-chunk-size 64M --tpslimit 5 --drive-acknowledge-abuse=true -v --delete-empty-src-dirs --fast-list --use-mmap --transfers=5 --checkers=4 --drive-pacer-min-sleep=100ms

The rclone config contents with secrets removed.

[tdrive]
client_id = XXXX.apps.googleusercontent.com
client_secret = XXXX
type = drive
token = {"access_token":"XXXX","token_type":"Bearer","refresh_token":"XXXX","expiry":"2022-06-10T20:38:35.026961521+02:00"}
team_drive = XXXX
service_account_file = /opt/sa/all/100.json

[tcrypt]
type = crypt
remote = tdrive:/encrypt
filename_encryption = standard
directory_name_encryption = true
password = XXXX
password2 = XXXX

A log from the command with the -vv flag

2022/06/10 20:37:08 INFO  : Starting transaction limiter: max 5 transactions/s with burst 1
2022/06/10 20:37:08 DEBUG : --min-age 3h0m0s to 2022-06-10 17:37:08.368170922 +0200 CEST m=-10799.974555159
2022/06/10 20:37:08 DEBUG : rclone: Version "v1.58.1" starting with parameters ["rclone" "move" "/home/xxxx/data/local/" "tcrypt:/" "--min-age" "180m" "--delete-after" "--log-file=/home/xxxx/logs/upload/testing.log" "--drive-service-account-file=/opt/sa/all/1.json" "--max-transfer" "700G" "--config=/home/xxxx/.config/rclone/rclone.conf" "--exclude-from=/home/xxxx/.config/rclone/excludes" "--drive-chunk-size" "64M" "--tpslimit" "5" "--drive-acknowledge-abuse=true" "-v" "--delete-empty-src-dirs" "--fast-list" "--use-mmap" "--transfers=5" "--checkers=4" "--drive-pacer-min-sleep=100ms" "-vv"]
2022/06/10 20:37:08 DEBUG : Creating backend with remote "/home/xxxx/data/local/"
2022/06/10 20:37:08 DEBUG : Using config file from "/home/xxxx/.config/rclone/rclone.conf"
2022/06/10 20:37:08 DEBUG : Creating backend with remote "tcrypt:/"
2022/06/10 20:37:08 DEBUG : Creating backend with remote "tdrive:/encrypt"
2022/06/10 20:37:08 DEBUG : tdrive: detected overridden config - adding "{dCLyL}" suffix to name
2022/06/10 20:37:08 DEBUG : fs cache: renaming cache item "tdrive:/encrypt" to be canonical "tdrive{dCLyL}:encrypt"
2022/06/10 20:37:08 DEBUG : fs cache: switching user supplied name "tdrive:/encrypt" for canonical name "tdrive{dCLyL}:encrypt"
2022/06/10 20:37:08 DEBUG : backup: Excluded
2022/06/10 20:37:08 DEBUG : downloads: Excluded
2022/06/10 20:37:10 DEBUG : config.docx: Skipping undecryptable file name: illegal base32 data at input byte 6
2022/06/10 20:37:10 DEBUG : config: Skipping undecryptable file name: illegal base32 data at input byte 6
2022/06/10 20:37:10 DEBUG : fossils: Skipping undecryptable dir name: not a multiple of blocksize
2022/06/10 20:37:10 DEBUG : snapshots: Skipping undecryptable dir name: illegal base32 data at input byte 9
2022/06/10 20:37:10 DEBUG : chunks: Skipping undecryptable dir name: illegal base32 data at input byte 6
2022/06/10 20:37:10 DEBUG : downloads: Excluded from sync (and deletion)
2022/06/10 20:37:10 DEBUG : backup: Excluded from sync (and deletion)
2022/06/10 20:37:10 DEBUG : downloads/UPLOAD: Excluded from sync (and deletion)
2022/06/10 20:37:11 DEBUG : downloads/UPLOAD/SB: Excluded from sync (and deletion)
2022/06/10 20:37:11 DEBUG : downloads/UPLOAD/TL: Excluded from sync (and deletion)
2022/06/10 20:37:11 DEBUG : backup/BACKUP_20_05_2022: Excluded from sync (and deletion)
2022/06/10 20:37:11 DEBUG : backup/eggdrop: Excluded from sync (and deletion)

There's no error in your log files other than you seem to be mixing crypt and non crypt data together.

How do you mean?

I shared in the log above.

Those lines indicate you have non encrypted content mixed with crypted content.

But that doesn't have anything to do with the issue; it doesn't cycle the SA users.

What issue? Your log file doesn't have any errors other than the one I shared above.

You'd have to show me what you are seeing with a log file as I can't see your computer :slight_smile:

Hey!

This is the only thing i have that says error

2022/06/10 21:19:30 ERROR : XXXX: Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2022/06/10 21:19:30 ERROR : XXXX: Not deleting source as copy failed: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

Can you share the full command / log with that?

There's nothing that stops rate limiting from Google regardless if you have a user account setup or a service account. If you are seeing any rate limit errors, Google is telling to slow down.

From what I can see, it starts working again. I don't know if Gdrive had a hiccup.

2022/06/10 21:25:23 INFO  : XXXX: Copied (new)
2022/06/10 21:25:23 INFO  : XXXX: Deleted

Unsure as without a full log, I really can't offer much as the snippets just show the issue without any context around it.

It seems that you are testing at the moment, so maybe you always start with the same service account and reach its limit. The limit is reset at 00:00 UTC, I think.

Stop starting with $COUNTER=1; switch to 10 or so.

Furthermore, check the return value of rclone; it should be 8 when the --max-transfer limit is reached.

example

rclone move ...          # your move command from above
rcloneExitCode=$?        # capture rclone's exit status
echo $rcloneExitCode     # 8 means the --max-transfer limit was reached
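The rotation the OP seems to be after could build on that exit code. A minimal sketch, pieced together from this thread's setup (the SA path and count of 100 come from the OP's config; `should_rotate` and `rotate_upload` are hypothetical names, and the flag list is trimmed for brevity):

```shell
#!/usr/bin/env bash

# Exit code 8 means --max-transfer was reached, so the next SA should be tried.
should_rotate() {
    [ "$1" -eq 8 ]
}

# Try each service account in turn until a run ends for any other reason.
rotate_upload() {
    local FROM="$1" DEST="$2" SA_DIR="/opt/sa/all"
    for COUNTER in $(seq 1 100); do
        rclone move "$FROM" "$DEST" \
            --max-transfer 700G \
            --drive-service-account-file="$SA_DIR/$COUNTER.json"
        should_rotate $? || break   # any other exit code: stop the loop
    done
}
```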

Hey,

thanks for the reply. I don't quite follow; what do you mean by "it should be 8 for the --max-transfer flag"?

Source: Documentation

Rclone will stop transferring when it has reached the size specified. Defaults to off.

When the limit is reached all transfers will stop immediately.

Rclone will exit with exit code 8 if the transfer limit is reached.

You are confusing rate limiting with the daily quotas as that flag does nothing for rate limiting.

A full log would shed more details…

Are you sure?

Furthermore @Rustvogn you should use the flag mentioned in the link

--drive-stop-on-upload-limit

this detects the upload limit itself, not just the amount of data you uploaded in this run.
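For illustration, the two stop conditions surface as different exit codes, per the rclone docs: 8 for --max-transfer reached, and 7 (fatal error) when --drive-stop-on-upload-limit detects the daily quota. A sketch of handling both (`classify_exit` is a hypothetical helper; check your rclone version's docs for the exact codes):

```shell
# Map an rclone exit code to an action for the wrapper script.
classify_exit() {
    case "$1" in
        0)   echo "done" ;;        # transfer finished normally
        7|8) echo "rotate" ;;      # daily quota or --max-transfer hit: next SA
        *)   echo "investigate" ;; # anything else needs a look at the log
    esac
}

# Usage sketch (variables as in the OP's command):
#   rclone move "$FROM" "$Drive01" --max-transfer 700G --drive-stop-on-upload-limit ...
#   action=$(classify_exit $?)
```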

Am I sure that you cannot tell without a full log whether the OP is mixing up errors? Yes, I'm 100% sure; that's why we ask for a full debug log, so we don't play guessing games. Generally that error is about rate limits, which is why I want to see the actual command the OP is running along with the log it generates.

The OP is already mixing crypt and unencrypted data together, so I'd imagine there are more errors in the setup rather than the actual daily limit being hit over and over. That's why we always ask for a full log, so we can answer based on data rather than assumptions.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.