I've been trying since yesterday to copy 24k files from one location on my encrypted Google Drive to another folder on the same drive.
To be clear, I'm copying file by file: I run a script that creates a folder and then copies one file into it (the script is below).
At some point, I get the error shown in the logs below.
Run the command 'rclone version' and share the full output of the command.
rclone v1.65.0
os/version: Microsoft Windows Server 2019 Standard 1809 (64 bit)
os/kernel: 10.0.17763.4010 (x86_64)
os/type: windows
os/arch: amd64
go/version: go1.21.4
go/linking: static
go/tags: cmount
Which cloud storage system are you using? (eg Google Drive)
Encrypted Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
This is the .bat file that I run on my Windows machine:
@echo off
setlocal enabledelayedexpansion
REM mapping.txt lists one "FolderName|FileName" pair per line
set "configFile=mapping.txt"
for /f "tokens=1,* delims=|" %%a in ('type "%configFile%"') do (
    REM %%a = destination folder name, %%b = source file name
    set "FolderName=%%a"
    set "FileName=%%b"
    echo !FolderName!
    REM Server-side copy of one file into its own destination folder
    C:\rclone\rclone copy --log-level DEBUG --crypt-server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!" --create-empty-src-dirs
)
endlocal
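For reference, the loop parses mapping.txt as pipe-separated pairs, so the file would presumably look something like this (the names here are made up for illustration):

Invoices2023|invoice_2023.pdf
Photos-January|IMG_0001.jpg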
Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.
I would suggest adding --tpslimit 10 --tpslimit-burst 0 to your copy command to throttle your operations. You might have to experiment with the --tpslimit value; it's hard to tell what is right. Some people go as low as 2. Google Drive is really slow for many small files, unfortunately.
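Applied to the command in your script, that would look something like this (same remotes and paths as above):

C:\rclone\rclone copy --log-level DEBUG --tpslimit 10 --tpslimit-burst 0 --crypt-server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!" --create-empty-src-dirs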
I've even tried 2, and it gives me the rate-limit error even faster.
But I don't think it makes any difference in this case: you run a new rclone copy process for each file, so the rate limiter starts fresh on every invocation and --tpslimit is never applied across the whole run.
I tried adding a 5-second delay in the .bat for loop, but it was the same thing.
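(A delay like that in a .bat loop is typically the built-in timeout command, e.g. timeout /t 5 /nobreak >nul between iterations.)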
Does this error indicate a daily upload limit or a Google API usage limit?
Have you tried rclone copyto instead of rclone copy? The problem might be that copy lists the whole destination every time (maybe --fast-list could help here).
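With copyto the destination is the exact target path including the file name, so the loop line would become something like this (a sketch using the same remotes):

C:\rclone\rclone copyto --log-level DEBUG --crypt-server-side-across-configs gcrypt:"Documents\!FileName!" gcrypt:"NewDocuments\!FolderName!\!FileName!"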
I do not know. Maybe some other Google users can shed some light here.
I think it's the 750 GB per day upload limit, but that's not normal, because the copy should happen server-side thanks to the --server-side-across-configs flag.
What I don't understand is why the copy would use the upload quota. When you copy a file from the Google Drive website, I don't think it uses the upload quota?
Otherwise, there's nothing I can do to get around the quota. Would it be possible to switch to another Google account once the maximum quota is used, alternating between at most two accounts?
You'd want to post a bigger log, as that would show whether it's recopying the files or doing it server-side.
It has nothing to do with actual API usage. The upload/download quotas are lumped in with the API error messages. You never really have issues with the API quota if you are using your own client ID/secret.
If you are copying and have no upload quota left, it won't work. Server-side or not, a copy doubles the data, so it still counts against the quota.
texter@macmini Downloads % rclone copy /etc/hosts GD:
texter@macmini Downloads % rclone copyto GD:hosts GD:blah -vv
2024/01/12 09:38:43 DEBUG : rclone: Version "v1.65.1" starting with parameters ["rclone" "copyto" "GD:hosts" "GD:blah" "-vv"]
2024/01/12 09:38:43 DEBUG : Creating backend with remote "GD:hosts"
2024/01/12 09:38:43 DEBUG : Using config file from "/Users/texter/.config/rclone/rclone.conf"
2024/01/12 09:38:43 DEBUG : Google drive root 'hosts': 'root_folder_id = 0AGoj85v3xeadUk9PVA' - save this in the config to speed up startup
2024/01/12 09:38:43 DEBUG : fs cache: adding new entry for parent of "GD:hosts", "GD:"
2024/01/12 09:38:44 DEBUG : hosts: Need to transfer - File not found at Destination
2024/01/12 09:38:45 DEBUG : hosts: md5 = a3f51a033f988bc3c16d343ac53bb25f OK
2024/01/12 09:38:45 INFO : hosts: Copied (server-side copy) to: blah
2024/01/12 09:38:45 INFO :
Transferred: 213 B / 213 B, 100%, 0 B/s, ETA -
Transferred: 1 / 1, 100%
Server Side Copies: 1 @ 213 B
Elapsed time: 1.5s
2024/01/12 09:38:45 DEBUG : 8 go routines active
Whereas a move would server-side rename it:
texter@macmini Downloads % rclone moveto GD:blah GD:testmove -vv
2024/01/12 09:40:03 DEBUG : rclone: Version "v1.65.1" starting with parameters ["rclone" "moveto" "GD:blah" "GD:testmove" "-vv"]
2024/01/12 09:40:03 DEBUG : Creating backend with remote "GD:blah"
2024/01/12 09:40:03 DEBUG : Using config file from "/Users/texter/.config/rclone/rclone.conf"
2024/01/12 09:40:03 DEBUG : Google drive root 'blah': 'root_folder_id = 0AGoj85v3xeadUk9PVA' - save this in the config to speed up startup
2024/01/12 09:40:03 DEBUG : fs cache: adding new entry for parent of "GD:blah", "GD:"
2024/01/12 09:40:03 DEBUG : blah: Need to transfer - File not found at Destination
2024/01/12 09:40:04 INFO : blah: Moved (server-side) to: testmove
2024/01/12 09:40:04 INFO :
Transferred: 213 B / 213 B, 100%, 0 B/s, ETA -
Renamed: 1
Transferred: 1 / 1, 100%
Server Side Moves: 1 @ 213 B
Elapsed time: 1.1s
2024/01/12 09:40:04 DEBUG : 8 go routines active