Copy file to same Google drive gives userRateLimitExceeded error

What is the problem you are having with rclone?

I've been trying since yesterday to copy 24k files from one location on my encrypted Google Drive to another folder on the same drive.
I should mention that I'm copying file by file because I'm running a script that creates a folder and then copies a file into it (the script I use is below).
At some point, I get the error shown in the logs below.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.0

  • os/version: Microsoft Windows Server 2019 Standard 1809 (64 bit)
  • os/kernel: 10.0.17763.4010 (x86_64)
  • os/type: windows
  • os/arch: amd64
  • go/version: go1.21.4
  • go/linking: static
  • go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Encrypted Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

This is a .bat file that I run on my Windows machine:

@echo off
setlocal enabledelayedexpansion

set "configFile=mapping.txt"

for /f "tokens=1,* delims=|" %%a in ('type "%configFile%"') do (
    set "FolderName=%%a"
    set "FileName=%%b"

    echo !FolderName!

    C:\rclone\rclone copy --log-level DEBUG --crypt-server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!" --create-empty-src-dirs

)

endlocal
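
For reference, mapping.txt contains one FolderName|FileName pair per line, separated by |, for example:

Journal 28|texte.txt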

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[gcrypt]
type = crypt
remote = gdrive:/crypt
password = XXX
password2 = XXX

[gdrive]
type = drive
client_id = XXX
client_secret = XXX
scope = drive
root_folder_id =
token = XXX
team_drive = XXX

A log from the command that you were trying to run with the -vv flag

2024/01/11 21:21:29 DEBUG : pacer: Reducing sleep to 0s
2024/01/11 21:21:46 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2024/01/11 21:21:46 DEBUG : pacer: Rate limited, increasing sleep to 1.623335742s
2024/01/11 21:21:47 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2024/01/11 21:21:47 DEBUG : pacer: Rate limited, increasing sleep to 2.570895605s
2024/01/11 21:21:48 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2024/01/11 21:21:48 DEBUG : pacer: Rate limited, increasing sleep to 4.868950252s
2024/01/11 21:21:51 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2024/01/11 21:21:51 DEBUG : pacer: Rate limited, increasing sleep to 8.380285142s
2024/01/11 21:21:56 DEBUG : pacer: low level retry 5/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2024/01/11 21:21:56 DEBUG : pacer: Rate limited, increasing sleep to 16.895368184s

You are hitting gdrive rate limits...

I would suggest adding --tpslimit 10 --tpslimit-burst 0 to your copy command to throttle your operations. You might have to experiment with the --tpslimit value - hard to tell what is right. Some people go as low as 2. Google Drive is really slow for many small files, unfortunately.
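
Adapted to the copy line in your batch file, that would look something like this (just a sketch; you will need to tune the --tpslimit value):

C:\rclone\rclone copy --log-level DEBUG --tpslimit 10 --tpslimit-burst 0 --crypt-server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!" --create-empty-src-dirs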

I've tried even 2, which gives me the rate-limit error even faster.
But I don't think it makes much difference in this case, because the loop runs a separate rclone copy command each time, so the tpslimit only applies within a single invocation.
I also tried a 5-second timeout in the bat for loop, but it was the same thing.

Does this error indicate a daily upload limit or a Google API usage limit?

I see. Bummer.

Have you tried rclone copyto instead of rclone copy? It might be related to copy listing the whole destination every time (maybe --fast-list could help here).

I do not know. Maybe some other Google users can shed some light here.

I see that copyto does not take --create-empty-src-dirs into account,
so I'm afraid it will download and then upload?

Using --fast-list with rclone copy doesn't change anything in the end.

Not sure what you mean... One has nothing to do with the other. Besides, you are copying single files - why worry about empty dirs?

Ah yes, I got confused with --crypt-server-side-across-configs.

But then I think copyto is for copying a directory?

2024/01/12 09:29:14 ERROR : Encrypted drive 'gcrypt{Db_Y9}:Documents/texte.txt': error reading source root directory: directory not found
2024/01/12 09:29:14 DEBUG : Encrypted drive 'gcrypt{Db_Y9}:NewDocuments/Journal 28': Waiting for checks to finish
2024/01/12 09:29:14 DEBUG : Encrypted drive 'gcrypt{Db_Y9}:NewDocuments/Journal 28': Waiting for transfers to finish
2024/01/12 09:29:14 ERROR : Attempt 3/3 failed with 1 errors and: directory not found

--crypt-server-side-across-configs is deprecated.

Always use:

--server-side-across-configs

No idea, as you did not post what you are doing.

You should use:

rclone copyto src:path/to/file.txt dst:path/to/file.txt
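
Adapted to your batch loop, that would be something along these lines (a sketch; it assumes !FileName! is the full file name, since copyto needs the destination file path rather than just a folder):

C:\rclone\rclone copyto --log-level DEBUG --server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!\!FileName!"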

OK, but it doesn't change anything; I've just tested it.

Here's the current script:

@echo off
setlocal enabledelayedexpansion

set "configFile=mapping.txt"

for /f "tokens=1,* delims=|" %%a in ('type "%configFile%"') do (
    set "FolderName=%%a"
    set "FileName=%%b"

    C:\rclone\rclone copy --log-level DEBUG --fast-list --server-side-across-configs gcrypt:"Film\!FileName!" gcrypt:"NewFilm\!FolderName!" --create-empty-src-dirs

    timeout /nobreak /t 5 >nul
)

endlocal

I forgot to mention that there are at least 170 TB of files to copy.

I found that too. I don't think it's an API limit, since the maximum is 12,000 queries per minute:
https://developers.google.com/drive/api/guides/limits

I think it's the 750 GB per day upload limit, but that shouldn't apply, because the copy should be done server-side thanks to the --server-side-across-configs flag.

Maybe some other gdrive users can give you some recommendations here.

Constantly battling rate limits was one of the issues I faced in the past too - so I gave up on gdrive for my use.

In any case, I can confirm that it's not an API quota, since I don't even exceed 500 requests per minute.

What I don't understand is why the copy would use the upload quota. When you do it from the Google Drive website, I don't think it uses the upload quota?

If not, there's nothing I can do to get around the quota. Is it possible to switch to another Google account once the maximum quota is used up, alternating between at most two accounts?

You'd want to post a bigger log as that would show if it's recopying new files or doing it server side.

It has nothing to do with actual API usage. Both upload/download quotas are lumped in with the API messages. You never really have issues with API quota if you are using your own client ID/secret.

Yes, I can do that. What logs do you advise me to post? Because here it's several separate rclone copy commands.

Just pick one file if that causes the error and post the full log with -vv.
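
Something like this, redirecting the output to a file so you can post it in full (the paths here are just placeholders - substitute one of the files that fails):

C:\rclone\rclone copy -vv --server-side-across-configs gcrypt:"Film\YourFile.mkv" gcrypt:"NewFilm\Your Folder" > rclone-log.txt 2>&1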

2024/01/12 15:33:30 DEBUG : rclone: Version "v1.65.0" starting with parameters ["C:\\rclone\\rclone" "copy" "-vv" "--low-level-retries" "3" "--ignore-existing" "--drive-stop-on-upload-limit" "--server-side-across-configs" "gcrypt:Film/XOXO 2016 {tmdb-352492}.mkv" "gcrypt:NewFilm/XOXO (2016) {tmdb-352492}" "--create-empty-src-dirs"]
2024/01/12 15:33:30 DEBUG : Creating backend with remote "gcrypt:Film/XOXO 2016 {tmdb-352492}.mkv"
2024/01/12 15:33:30 DEBUG : Using config file from "C:\\Users\\Administrateur\\AppData\\Roaming\\rclone\\rclone.conf"
2024/01/12 15:33:30 DEBUG : Creating backend with remote "gdrive:/crypt/ee80tqt6602po59hul4q7et4mc/6h9htvnk03ot888bj2quuavo2e0nv25kc7irgtc8l59243gr123g"
2024/01/12 15:33:30 DEBUG : gdrive: detected overridden config - adding "{-LSY5}" suffix to name
2024/01/12 15:33:32 DEBUG : fs cache: adding new entry for parent of "gdrive:/crypt/ee80tqt6602po59hul4q7et4mc/6h9htvnk03ot888bj2quuavo2e0nv25kc7irgtc8l59243gr123g", "gdrive{-LSY5}:crypt/ee80tqt6602po59hul4q7et4mc"
2024/01/12 15:33:32 DEBUG : Creating backend with remote "gcrypt:NewFilm/XOXO (2016) {tmdb-352492}"
2024/01/12 15:33:32 DEBUG : Creating backend with remote "gdrive:/crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g"
2024/01/12 15:33:32 DEBUG : gdrive: detected overridden config - adding "{-LSY5}" suffix to name
2024/01/12 15:33:33 DEBUG : fs cache: renaming cache item "gdrive:/crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g" to be canonical "gdrive{-LSY5}:crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g"
2024/01/12 15:33:33 DEBUG : fs cache: switching user supplied name "gdrive:/crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g" for canonical name "gdrive{-LSY5}:crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g"
2024/01/12 15:33:33 DEBUG : XOXO 2016 {tmdb-352492}.mkv: Need to transfer - File not found at Destination
2024/01/12 15:33:35 ERROR : Google drive root 'crypt/9ace086nc7ofvct19tp30ctgbs/p5012l0i1jm8t4s51f0c5n0d2o5au98abefbid505tnqjvoekf3g': Received upload limit error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2024/01/12 15:33:35 ERROR : XOXO 2016 {tmdb-352492}.mkv: Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
2024/01/12 15:33:35 ERROR : Fatal error received - not attempting retries
2024/01/12 15:33:35 INFO  :
Transferred:              0 B / 0 B, -, 0 B/s, ETA -
Errors:                 1 (fatal error encountered)
Elapsed time:         4.8s

2024/01/12 15:33:35 DEBUG : 7 go routines active
2024/01/12 15:33:35 Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

A little early for me, so I missed the obvious.

If you are copying and have no upload quota left, it won't work; server side or not, you are duplicating the data, so the copy still counts against the upload quota.

texter@macmini Downloads % rclone copy /etc/hosts GD:
texter@macmini Downloads % rclone copyto GD:hosts GD:blah -vv
2024/01/12 09:38:43 DEBUG : rclone: Version "v1.65.1" starting with parameters ["rclone" "copyto" "GD:hosts" "GD:blah" "-vv"]
2024/01/12 09:38:43 DEBUG : Creating backend with remote "GD:hosts"
2024/01/12 09:38:43 DEBUG : Using config file from "/Users/texter/.config/rclone/rclone.conf"
2024/01/12 09:38:43 DEBUG : Google drive root 'hosts': 'root_folder_id = 0AGoj85v3xeadUk9PVA' - save this in the config to speed up startup
2024/01/12 09:38:43 DEBUG : fs cache: adding new entry for parent of "GD:hosts", "GD:"
2024/01/12 09:38:44 DEBUG : hosts: Need to transfer - File not found at Destination
2024/01/12 09:38:45 DEBUG : hosts: md5 = a3f51a033f988bc3c16d343ac53bb25f OK
2024/01/12 09:38:45 INFO  : hosts: Copied (server-side copy) to: blah
2024/01/12 09:38:45 INFO  :
Transferred:   	        213 B / 213 B, 100%, 0 B/s, ETA -
Transferred:            1 / 1, 100%
Server Side Copies:     1 @ 213 B
Elapsed time:         1.5s

2024/01/12 09:38:45 DEBUG : 8 go routines active

Whereas move would server side rename it.

texter@macmini Downloads % rclone moveto GD:blah GD:testmove -vv
2024/01/12 09:40:03 DEBUG : rclone: Version "v1.65.1" starting with parameters ["rclone" "moveto" "GD:blah" "GD:testmove" "-vv"]
2024/01/12 09:40:03 DEBUG : Creating backend with remote "GD:blah"
2024/01/12 09:40:03 DEBUG : Using config file from "/Users/texter/.config/rclone/rclone.conf"
2024/01/12 09:40:03 DEBUG : Google drive root 'blah': 'root_folder_id = 0AGoj85v3xeadUk9PVA' - save this in the config to speed up startup
2024/01/12 09:40:03 DEBUG : fs cache: adding new entry for parent of "GD:blah", "GD:"
2024/01/12 09:40:03 DEBUG : blah: Need to transfer - File not found at Destination
2024/01/12 09:40:04 INFO  : blah: Moved (server-side) to: testmove
2024/01/12 09:40:04 INFO  :
Transferred:   	        213 B / 213 B, 100%, 0 B/s, ETA -
Renamed:                1
Transferred:            1 / 1, 100%
Server Side Moves:      1 @ 213 B
Elapsed time:         1.1s

2024/01/12 09:40:04 DEBUG : 8 go routines active