I was syncing directories on two drives by name, like rclone sync /mnt/data/dir1 remote:dir1, using a script. Now I want to clean this up and simplify it by syncing whole drives with excluded directories. I have tried something like rclone sync /mnt/data remote: --exclude dir2, but it wants to delete everything. I know the directories are named the same on the remote as they are locally, so why wouldn't it just recognize that they are the same? Is there any way to make it see them as the same?
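For reference, here is roughly what I was doing and what I tried (directory names are just examples):

```shell
# old approach: one sync per directory, run from a script
rclone sync /mnt/data/dir1 remote:dir1
rclone sync /mnt/data/dir2 remote:dir2

# new approach: sync the whole drive, excluding one directory
# (as I understand it, excluding a directory and its contents
# needs a pattern like dir2/**, not just dir2)
rclone sync /mnt/data remote: --exclude "dir2/**" --dry-run
```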
I really think this is more a case of me using rclone wrong, so I do not think the config or logs will be helpful. I will post them if needed.
Run the command 'rclone version' and share the full output.
- os/arch: linux/amd64
- go version: go1.13.6
Which cloud storage system are you using? (eg Google Drive)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
I'm sorry, I meant that I want it to list the files it wants to create on the remote, so I can compare them to what is already there, although I suspect the names are the same and something else is going on.
You already have that in your command, by using --dry-run.
Check the log output.
Here rclone is showing that it would delete that file, but does not, since --dry-run is set:
Skipped delete as --dry-run is set
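If you want a full record of everything the sync would copy or delete, a sketch (the log file name is just an example):

```shell
# dry-run with verbose output written to a log file;
# nothing is actually transferred or deleted
rclone sync /mnt/data remote: --dry-run -v --log-file=dryrun.log
```

Then you can search the log for the "Skipped" lines to see what it would have done.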
rclone does not work well with duplicates, so you might need to run rclone dedupe.
from the documentation
"Duplicated files cause problems with the syncing and you will see messages in the log about duplicates.
Use rclone dedupe to fix duplicated files"
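A minimal sketch, using the remote name from your command above:

```shell
# list duplicate files and directories without changing anything
rclone dedupe --dedupe-mode list remote:

# interactive mode (the default) asks what to do with each duplicate set
rclone dedupe remote:
```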
I read that part in the logs, but it sounded like it was talking about files with the same name on the DESTINATION, having nothing to do with the source.
Also, what makes the difference between a file that has already been synced (and needs to be updated or marked as already transferred) and a "duplicate" file? Is there some record-keeping that I might be able to fiddle with?
OK, I tried dedupe in interactive mode and it is saying that there really are duplicate objects for some files. That would screw things up for sure. I will let it run without interactive mode and then run the sync with --dry-run again.
I will find a way to delete the duplicates without going insane or getting carpal tunnel.
The duplicates got there because of the underlying problem I am trying to solve, though. The first time I ran the sync with the root folder instead of the individual folders, it treated the files as new instead of "checking" them and seeing that they are the same as the ones already there. I need to know how to force it to see those files as the same as the source files (since they have the same names, after all).
OK all, I think I know what is happening. I have two drives that I am syncing to this remote. Before, I was syncing each directory individually with no problems, but now when I sync the first drive it tries to delete all the folders from the second drive, as they don't exist on the first.
Is there a way to sync multiple source directories to remote in the same command?
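To make the problem concrete, this is roughly what happens (drive and directory names are made up):

```shell
# drive1 contains dir1, drive2 contains dir2; both sync to remote: root.
# syncing drive1 alone wants to delete remote:dir2, because dir2
# does not exist on drive1.
rclone sync /mnt/drive1 remote: --dry-run

# one workaround I can think of: exclude the other drive's
# directories on each sync, so rclone leaves them alone
rclone sync /mnt/drive1 remote: --exclude "/dir2/**" --dry-run
rclone sync /mnt/drive2 remote: --exclude "/dir1/**" --dry-run
```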