I wasn't sure what to tag this question with; it's not strictly a help and support question per se.
I was just wondering: is there a better way of copying/moving a large number of individual files from multiple locations to multiple locations, one that doesn't involve a separate rclone copy or rclone move command for each file? That works, but it's obviously slow. I don't particularly mind the speed; I was just wondering if there was a faster way.
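To illustrate what I mean by one command per file, here is roughly the current approach, scripted as a loop over a tab-separated list of source/destination pairs (a sketch: the remote paths are made-up examples, and rclone moveto moves a single file to an explicit destination path):

```shell
#!/bin/sh
# Sketch, not a built-in rclone feature: drive per-file moves from a
# two-column mapping file (tab-separated: source<TAB>destination).
# The remote names and paths below are invented examples.

printf 'gdrive:folderA/file1.mkv\tgdrive:dest1/file1.mkv\n'  > moves.tsv
printf 'gdrive:folderB/file2.mkv\tgdrive:dest2/file2.mkv\n' >> moves.tsv

TAB="$(printf '\t')"
DRY_RUN="${DRY_RUN:-1}"   # default to previewing; set DRY_RUN=0 to really move

while IFS="$TAB" read -r src dst; do
    [ -n "$src" ] || continue              # skip blank lines
    if [ "$DRY_RUN" = "1" ]; then
        echo rclone moveto "$src" "$dst"   # preview only
    else
        rclone moveto "$src" "$dst"        # one move command per pair
    fi
done < moves.tsv
```

It still issues one rclone process per file, which is exactly the slowness I'm asking about, but splitting on tabs rather than spaces means paths containing spaces pair up correctly.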
Also, I didn't want to use rclone rcd. When I did a test run moving one file that already happened to exist in the destination (though I wasn't sure if the hashes matched), instead of creating a duplicate as an rclone move command does, it just deleted the file. I would have preferred the duplicate, so I could do an interactive dedupe later.
@ncw I wish the Remote Control / API for operations/movefile worked identically to rclone move, in that it creates duplicates if the file already exists in the destination rather than just deleting it.
Yes, Google Drive. Ah, OK, I will just leave it as multiple copy commands then.
The problem with that is there is no --dry-run for operations/movefile, so every time I do a test and a file with the same name exists, it deletes the file on the remote that was to be moved. If there were a --dry-run option it would make it easier to provide an example.
Sadly I'm not able to do that, as the files are in Google Drive and I want to move several files from several locations (but not all the files in each location) to multiple different destinations.
I have done that in the past, but unfortunately it wouldn't work this time, as all the files in the --files-from list would be moved to the same folder, which isn't what I want.
On a side note, it might be a great idea if there were both a --files-from and a --files-to flag: each file would have the same number of lines, rclone could check the counts match, and then do the copies/moves pairwise.
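In the meantime, that pairing (including the line-count check) can be approximated in shell with paste. This is a sketch; sources.txt and dests.txt are hypothetical file names, and the echo is left in so nothing actually moves:

```shell
#!/bin/sh
# Sketch of the proposed --files-from/--files-to pairing, done in shell:
# line N of sources.txt is paired with line N of dests.txt.
# File names and remote paths are hypothetical examples.

printf 'gdrive:a/one.txt\ngdrive:b/two.txt\n' > sources.txt
printf 'gdrive:x/one.txt\ngdrive:y/two.txt\n' > dests.txt

# The check rclone would need to do: both lists must be the same length.
[ "$(wc -l < sources.txt)" -eq "$(wc -l < dests.txt)" ] \
    || { echo "error: line counts differ" >&2; exit 1; }

TAB="$(printf '\t')"
paste sources.txt dests.txt | while IFS="$TAB" read -r src dst; do
    echo rclone moveto "$src" "$dst"   # drop the echo to actually move
done
```

paste joins the two lists line by line with a tab, so each iteration sees one source and its matching destination, which is exactly the pairwise behaviour the flags would give.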