Duplicate object found in source - ignoring

I get these errors when running a command like rclone --bwlimit 8.5M --tpslimit 1 --low-level-retries 20 sync remote1_crypt: remote2_crypt: on Google Drive.

I get a few of these errors: Duplicate object found in source - ignoring

These errors seem to be preventing the sync from running the delete sequence. The duplicates in my data are there on purpose. Is there a flag to tell rclone to go ahead and sync the duplicates?

thanks

What that means is that you have two files with identical names and paths. That isn't possible on a normal file system, but it is on Google Drive.

rclone has a tool for finding and resolving these problems - rclone dedupe
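By default dedupe is interactive - it shows you each group of duplicates and asks what to do with them - but it can also apply a policy non-interactively with --dedupe-mode. A sketch using the remote name from the post above (the choice of newest is just an illustration):

  rclone dedupe remote1_crypt:
  rclone dedupe --dedupe-mode newest remote1_crypt: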

Trying to get to the bottom of this.

When running the sync command, it shows me similarly named directories and an encrypted string as duplicated. I suppose this means that each one really is duplicated, not that rclone is mistaking them for dupes?

When I run rclone dedupe remote1_crypt:

I get the error: Failed to dedupe: Encrypted drive remote1_crypt: can't merge directories

The dupe detection in sync just checks whether the names are identical.

That is a bug - it means the crypt remote isn't passing on the merge request to the underlying remote properly.
I made an issue about that https://github.com/ncw/rclone/issues/2233

However you can dedupe the underlying remote and it should get rid of that duplicated directory for you. Don't rename any of the duplicates, though, as they will become invisible in the crypted remote.
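For example, if remote1_crypt: is layered on top of a drive remote called remote1: (that name is an assumption - the real one is in the crypt section of your rclone.conf), the dedupe would run against it directly:

  rclone dedupe remote1:

The duplicates will appear there under their encrypted names, which is expected - merge them, but leave the names alone.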

Old thread, but I'm seeing this when I rclone copy to a WebDAV remote. I did the first copy as a dry run, and it listed the files that had changed. All looked good, so I removed --dry-run and ran the copy. About 4GB and several hundred files were copied; all good.

I then re-ran the copy just to sanity-check - I assumed I'd get no changes. Instead, I got hundreds of files listed, all with this error:

Duplicate object found in destination - ignoring

When I go onto the remote host and ls the folder, there are no dupes. If I do rclone ls on the remote folder, I see no dupes. So what is going on?!

The WebDAV remote is a Synology NAS running WebDAV server, and I'm running rclone 1.43 for Mac, if that makes a difference.

Weird! Sounds like it is a glitch with the WebDAV interface…

Is this repeatable? If so, running with -vv --dump responses may shed some light on the matter - I think it might show duplicated files in the listing.
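That would look something like this, with the debug output captured to a file (source and destination are placeholders for the ones in your copy command):

  rclone copy -vv --dump responses --log-file rclone.log /local/path webdav:path

--dump responses writes the HTTP response bodies from the server into the log, so any duplicated entries in the WebDAV listing should show up there.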

Hi Nick,

I don't know if it's reproducible - it doesn't seem to happen in the same way on my home network (where I am today). The errors I was getting yesterday were when I ran it over my VPN from a remote connection.

I'll try it again tomorrow when I'm in the office. I sent you a log anyway, just in case it turns anything up. 🙂

Thanks for the log…

It looks like you do have a duplicate directory (kind of).

Here are the duplicates from the XML listing.

  <D:href>/photo/New%20Orchids%20Gro%c3%9fra%cc%88schener%201-Feb-2017/</D:href>
  <D:href>/photo/New%20Orchids%20Gro%c3%9fr%c3%a4schener%201-Feb-2017/</D:href>

If I decode those strings, they look like this:

>>> import urllib
>>> print urllib.unquote("photo/New%20Orchids%20Gro%c3%9fra%cc%88schener%201-Feb-2017/").decode("utf-8")
photo/New Orchids Großräschener 1-Feb-2017/
>>> print urllib.unquote("/photo/New%20Orchids%20Gro%c3%9fr%c3%a4schener%201-Feb-2017/").decode("utf-8")
/photo/New Orchids Großräschener 1-Feb-2017/

So they are visually identical.

However, they are different unicode strings:

>>> urllib.unquote("photo/New%20Orchids%20Gro%c3%9fra%cc%88schener%201-Feb-2017/").decode("utf-8")
u'photo/New Orchids Gro\xdfra\u0308schener 1-Feb-2017/'
>>> urllib.unquote("/photo/New%20Orchids%20Gro%c3%9fr%c3%a4schener%201-Feb-2017/").decode("utf-8")
u'/photo/New Orchids Gro\xdfr\xe4schener 1-Feb-2017/'

However, rclone applies something called unicode normalisation to its strings, so after normalisation those strings are identical (which is what the user would expect, since they look identical).
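Here is a minimal sketch of that comparison in Python (the strings are just the differing word from the listing above):

>>> import unicodedata
>>> a = u'Gro\xdfra\u0308schener'  # decomposed: 'a' + combining diaeresis U+0308
>>> b = u'Gro\xdfr\xe4schener'     # precomposed: 'ä' U+00E4
>>> a == b
False
>>> unicodedata.normalize('NFC', a) == b
True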

You might ask how these strings came into being. My guess is that these files have been on macOS at some point, which stores decomposed (NFD) strings in its file system by default, and some tools (including older versions of rclone) don't normalise the strings again.

I think the solution to this is to merge the contents of the directories.
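A hypothetical sketch of that merge, run directly on the NAS filesystem rather than through rclone (the share path is an assumption; the idea is to fold every decomposed name into its NFC form):

import os
import shutil
import unicodedata

root = '/volume1/photo'  # assumed Synology share path

for name in os.listdir(root):
    fixed = unicodedata.normalize('NFC', name)
    if fixed == name:
        continue  # already normalised
    src, dst = os.path.join(root, name), os.path.join(root, fixed)
    if os.path.isdir(dst):
        # both spellings exist: move the contents across, then drop the empty duplicate
        for entry in os.listdir(src):
            shutil.move(os.path.join(src, entry), os.path.join(dst, entry))
        os.rmdir(src)
    else:
        os.rename(src, dst)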

Thanks Nick.

Actually, I knew about that one, and it's less of a surprise, since unicode always screws things up. 😉 What was confusing me was a bunch of new folders which claimed to have dupe files in them - despite clearly not having any (and no unicode in sight).

But sadly (or, I suppose, happily) I can't repro it now, so it must have been a one-off. Thanks for looking though!

Ha! One of these days I'm going to make that "I hate unicode" t-shirt I keep threatening my co-workers with 😉

Let me know if it happens again!

Maybe have "I hate timezones" on the back too. 😉
