GDrive and Duplicate Files

I’ve been using rclone to move files from my server to GDrive, and it’s been working great. Because I have several versions of the same file, I thought of using the --ignore-existing flag so I could manually rename the files after a move to keep the various versions (i.e. file.txt to file(1).txt), then run the move command again. But move ends up deleting the source file, and I do not want to replace the existing file at the destination.

Is it possible, or am I just not seeing the option: when using move or copy, if a file with the same name already exists at the destination, can rclone rename the new file with a suffix such as (1)?

P.S. Does --size-only mean that, when transferring, it keeps the larger of the source and destination files?

rclone doesn’t work very well with duplicate files - it will warn about them in the log (if yours doesn’t, please upgrade to a newer rclone).

So I recommend you remove the duplicates using rclone dedupe
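For example (hypothetical remote and path; the mode shown here is just one of dedupe’s options, and --dry-run previews the result without changing anything):

```shell
# Preview deduplication of remote:A, keeping the newest copy of each
# duplicated file. Drop --dry-run to actually apply it.
rclone dedupe --dedupe-mode newest remote:A --dry-run
```
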

You can use the --backup-dir flag to save old versions elsewhere, and the --suffix flag to rename them with a different suffix. You can’t save them in the same directory, though, as that will confuse the sync.
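A sketch of combining the two (hypothetical paths and remote names; the backup directory must be outside the directory being synced):

```shell
# Files that would be overwritten or deleted in remote:A are moved to
# remote:A-old instead, with "-v1" appended before the file extension.
rclone move /path/to/source remote:A \
  --backup-dir remote:A-old \
  --suffix -v1 --suffix-keep-extension
```
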

No, it transfers the file if the sizes are different - it doesn’t check whether the file is larger or smaller, and it doesn’t check the checksum or modification time.
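A rough plain-shell illustration of that decision (not rclone itself, just the comparison it makes with --size-only: transfer when byte counts differ, regardless of which side is larger):

```shell
# Two files of different sizes standing in for source and destination.
printf 'aaaa' > /tmp/src.txt       # 4 bytes
printf 'aaaaaaaa' > /tmp/dst.txt   # 8 bytes

src_size=$(wc -c < /tmp/src.txt)
dst_size=$(wc -c < /tmp/dst.txt)

# --size-only only asks "equal or not?" - never "which is bigger?"
if [ "$src_size" -ne "$dst_size" ]; then
  echo "would transfer"
else
  echo "would skip"
fi
```

Here the 4-byte source would overwrite the 8-byte destination, which is exactly why --size-only cannot be used to keep the larger file.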

Is there a way, when using move, to keep the larger of the two files? For example, if the source file is 100 MB and the file already at the destination is 200 MB, can it keep the file at the destination and delete the smaller one?

Otherwise, can I use --backup-dir like this?
rclone move /Path/to/source/ remote:A --backup-dir remote:A/Dupes

Or does it have to be like this:
rclone move /Path/to/source/ remote:A --backup-dir remote:Dupes

If I can do it like the first example could I use dedupe like this:
rclone dedupe --dedupe-mode largest remote:A

Will it compare the files in remote:A and remote:A/Dupes against each other?

P.S. I thought Google Drive can have duplicate file names, but that doesn’t seem to be the case.

Not currently, no.

Can you rely on the timestamps of the files? If so, then you could use:

-u, --update                              Skip files that are newer on the destination.
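For instance (hypothetical paths and remote; --dry-run shows what would happen without transferring anything):

```shell
# With --update, a source file whose modification time is older than the
# destination copy is skipped rather than overwriting it.
rclone move /path/to/source remote:A --update --dry-run
```

Note this keeps the *newer* file, not necessarily the larger one, so it only solves the problem if newer files are the ones you want to keep.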

I think both of those will work.

No it is only looking for files in the same directory to compare.

It can, but rclone tries not to make them!

Is it possible to force/tell rclone to make the duplicate files so I can then run --dedupe on the directory?

P.S. Thanks for all the help.

That would need some code changes - rclone does its best not to make duplicates at the moment.