I have a URL with 8 directories. Directory #1 has 80 files and Directory #2 has 90 files; 80 of the files in Directory #2 are exactly the same as the 80 files in Directory #1. The same files also show up in Directories #4, #6, and #8, and there are more duplicates scattered across the other directories as well. I only want one copy of each uniquely named file: once a file has been downloaded the first time from any directory, no other file with the same name should be downloaded again.
With that said, will this command work?
rclone sync source: dest:
To be clear, the sync would copy data from the URL to an empty HDD on my computer.
A sync command makes the destination look like the source, so it would copy every directory as-is, duplicates and all; that would not do what you want. There isn't a native way in rclone to identify a file as unique across Dir #1 and Dir #2.
I can't think of an easy way to do that either, as you'd have to build a list of all the file names, do some scripting to keep only one path per name, and then feed that list to rclone with --files-from before copying.
I can make a list of all the names. Regarding the scripting, do you mean PowerShell, or something else? And what type of script did you have in mind? I don't know how to script, mind you, but I can ask around.
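Any scripting language would do; PowerShell is fine if that's what you have handy. Here's a rough sketch in Python, just because it reads clearly even if you don't script. It assumes rclone is on your PATH and that your remote is configured as source:, matching the command above; files-from.txt is just an illustrative name. Treat it as a starting point, not a tested solution.

```python
#!/usr/bin/env python3
"""Build a deduplicated list of paths for rclone's --files-from flag."""
import subprocess

REMOTE = "source:"  # assumption: the rclone remote holding the 8 directories

# Recursively list every file on the remote, one relative path per line.
listing = subprocess.run(
    ["rclone", "lsf", "-R", "--files-only", REMOTE],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

seen = set()  # file names already claimed by an earlier directory
keep = []     # full relative paths we will actually copy

for path in listing:
    name = path.rsplit("/", 1)[-1]  # bare file name, directory stripped
    if name not in seen:            # first occurrence wins; later duplicates are skipped
        seen.add(name)
        keep.append(path)

# Write the list that rclone will read via --files-from.
with open("files-from.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(keep) + "\n")

print(f"kept {len(keep)} of {len(listing)} files")
```

Then copy only the files on that list (copy rather than sync, since you're filling an empty drive):

rclone copy source: dest: --files-from files-from.txt

Note that --files-from preserves the source paths, so each kept file still lands in its original directory on the HDD; it just skips any later path whose file name was already claimed.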