What is the problem you are having with rclone?
Rclone skips empty folders that were created by a previously interrupted transfer and continues with the next folders instead.
Run the command 'rclone version' and share the full output of the command.
1.64.2
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone copy GDrive:"Movies/Dubbed Movies" "D:\Downloads\GDrive NAS backup\rclone-v1.64.2-windows-amd64\Media\Dubbed Movies" --progress --fast-list --drive-acknowledge-abuse --retries=5 --transfers=5 --checkers=10 --tpslimit=10 --order-by ascending -vv
Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.
[GDrive]
type = drive
client_id = XXX
client_secret = XXX
root_folder_id = XXX
token = XXX
A log from the command that you were trying to run with the -vv flag
2023/11/19 19:22:59 INFO : Starting transaction limiter: max 10 transactions/s with burst 1
2023/11/19 19:22:59 DEBUG : rclone: Version "v1.64.2" starting with parameters ["rclone" "copy" "GDrive:Movies/Dubbed Movies" "D:\\Downloads\\GDrive NAS backup\\rclone-v1.64.2-windows-amd64\\Media\\Dubbed Movies" "--progress" "--fast-list" "--drive-acknowledge-abuse" "--retries=5" "--transfers=5" "--checkers=10" "--tpslimit=10" "--order-by" "ascending" "-vv"]
2023/11/19 19:22:59 DEBUG : Creating backend with remote "GDrive:Movies/Dubbed Movies"
2023/11/19 19:22:59 DEBUG : Using config file from "C:\\Users\\NightMean\\.config\\rclone\\rclone.conf"
2023/11/19 19:22:59 DEBUG : GDrive: detected overridden config - adding "{P2dbo}" suffix to name
2023/11/19 19:23:00 DEBUG : fs cache: renaming cache item "GDrive:Movies/Dubbed Movies" to be canonical "GDrive{P2dbo}:Movies/Dubbed Movies"
2023/11/19 19:23:00 DEBUG : Creating backend with remote "D:\\Downloads\\GDrive NAS backup\\rclone-v1.64.2-windows-amd64\\Media\\Dubbed Movies"
2023/11/19 19:23:00 DEBUG : fs cache: renaming cache item "D:\\Downloads\\GDrive NAS backup\\rclone-v1.64.2-windows-amd64\\Media\\Dubbed Movies" to be canonical "//?/D:/Downloads/GDrive NAS backup/rclone-v1.64.2-windows-amd64/Media/Dubbed Movies"
2023/11/19 19:23:00 ERROR : Fatal error received - not attempting retries
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Errors: 1 (fatal error encountered)
Elapsed time: 0.7s
2023/11/19 19:23:00 INFO :
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Errors: 1 (fatal error encountered)
Elapsed time: 0.7s
2023/11/19 19:23:00 DEBUG : 4 go routines active
2023/11/19 19:23:00 Failed to copy: unknown --order-by comparison "ascending"
I'm making a backup of my GDrive content to multiple external HDDs.
As it's only a backup, I'm using rclone copy. I had to stop the transfer multiple times due to small adjustments. (I know --dry-run exists.)
I noticed that rclone created the folders I wanted, but when I cancelled the transfer and ran the same command again, rclone didn't start from the beginning (checking what is already there and then continuing). Instead, it skipped over the empty folders and kept going.
I assume those skipped folders would be checked after everything has been copied, but since I have to split my copying across multiple HDDs, I haven't been able to verify this yet.
What I need is a flag that tells rclone to check the existing folders first (whether or not the files inside them are actually there) and to process everything in ascending order.
I found --order-by, but I can't make it work for some reason. The --check-first flag does what it is supposed to, but rclone still skips the existing empty folders that were created during the first runs.
The reason I need this is that my transfers have to be split across multiple HDDs, so the transfer will be interrupted multiple times, and it would be hard to cherry-pick which folders are fully copied and which are not.
Does anyone know if this is possible?
Edit: I had read the --order-by documentation incorrectly. The correct format for what I want is --order-by name,ascending. In the end, I used --order-by name,ascending --check-first, which solves my question.
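For anyone who hits the same fatal error in the log above: --order-by expects a key,direction pair (e.g. name, size, or modtime plus ascending/descending), so a bare `ascending` is rejected with `unknown --order-by comparison "ascending"`. A sketch of the corrected invocation, keeping my other flags as-is (the remote name and destination path are specific to my setup):

```
rclone copy GDrive:"Movies/Dubbed Movies" "D:\Downloads\GDrive NAS backup\rclone-v1.64.2-windows-amd64\Media\Dubbed Movies" --order-by name,ascending --check-first --progress --fast-list --drive-acknowledge-abuse --retries=5 --transfers=5 --checkers=10 --tpslimit=10 -vv
```

With --check-first, rclone enumerates and checks all files before starting any transfers, so combined with --order-by name,ascending the copy proceeds in a predictable alphabetical order across runs.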