I get errors when running a command like rclone --bwlimit 8.5M --tpslimit 1 --low-level-retries 20 sync remote1_crypt: remote2_crypt: on Google Drive.
I get a few of these errors: Duplicate object found in source - ignoring
These errors seem to prevent the sync from running the delete sequence. The duplicates in my data are there on purpose. Is there a flag to tell rclone to go ahead and sync the duplicates?
What that means is that you have two files with an identical name and path. This isn't possible on a normal file system, but it is on Google Drive.
rclone has a tool for finding and resolving these problems - rclone dedupe
When running the sync command, it shows me similarly named directories and an encrypted string as duplicates. I suppose this means that each one really is duped, not that rclone is mistaking them for dupes?
When I run rclone dedupe remote1_crypt:
I get the error: Failed to dedupe: Encrypted drive remote1_crypt: can't merge directories
The dupe detection in sync is just a check that the names are identical.
That is a bug - it means the crypt remote isn't passing the merge request on to the underlying remote properly.
I made an issue about that https://github.com/ncw/rclone/issues/2233
However, you can dedupe the underlying remote, and that should get rid of the duplicated directory for you. Don't rename any of the duplicates, though, as they will become invisible in the crypted remote.
Old thread, but I'm seeing this when I rclone copy to a WebDAV remote. I did the first copy as a dry run, and it listed the files which had changed. All looked good, so I removed --dry-run and ran the copy. About 4GB and several hundred files were copied; all good.
I then re-ran the copy, just to sanity-check - I assumed I'd get no changes. However, instead, I got hundreds of files listed, all with this error:
Duplicate object found in destination - ignoring
When I go onto the remote host and ls the folder, there are no dupes. If I do rclone ls on the remote folder, I see no dupes. So what is going on?!
The WebDAV remote is a Synology NAS running WebDAV server, and I'm running rclone 1.43 for Mac, if that makes a difference.
Weird! Sounds like it is a glitch with the WebDAV interface…
Is this repeatable? If so, running with -vv --dump responses may shed some light on the matter. I think it might show duplicated files in the listing.
I don't know if it's reproducible - it doesn't seem to happen in the same way on my home network (where I am today). The errors I was getting yesterday were when I ran it over my VPN from a remote connection.
I'll try it again tomorrow when I'm in the office. I sent you a log anyway, just in case it shows anything up.
However, rclone performs Unicode normalisation on its strings, so after normalisation those strings are identical (which is what the user would expect, since they look visually identical).
You might ask how these strings came into being. My guess is that these files have been on macOS at some point, which stores denormalised strings in its file system by default, and some tools (including older versions of rclone) don't normalise the strings again.
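The normalisation difference is easy to demonstrate with Python's standard unicodedata module (this is just an illustration of the general Unicode behaviour; rclone's own normalisation is done in its Go code, not shown here):

```python
import unicodedata

# Two spellings of "café": precomposed é (NFC) vs e + combining acute (NFD).
# macOS file systems historically stored names in the decomposed (NFD) form.
nfc = "caf\u00e9"        # 4 code points
nfd = "cafe\u0301"       # 5 code points, looks identical on screen

print(nfc == nfd)                                  # False: raw strings differ
print(unicodedata.normalize("NFC", nfd) == nfc)    # True after normalisation
```

So two file names can look identical on screen and still be different byte strings, which is why a tool that normalises (like rclone) reports them as duplicates while a plain ls does not.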
I think the solution to this is to merge the contents of the directories.
Actually, I knew about that one, and it's less of a surprise, since Unicode always screws things up. What was confusing me was a bunch of new folders which claimed to have dupe files in them - despite clearly not having any (and no Unicode in sight).
But sadly (or, I suppose, happily) I can't repro it now, so it must have been a one-off. Thanks for looking though!