Duplicate objects

Hi there,

I get many of the following warnings when copying a 1 TB backup database/folder from Google Drive to a local Debian ext4 partition:

NOTICE: some/folder/somefile: Duplicate object found in source - ignoring

A few questions;

  1. What's a "duplicate object" in this case? Is it the exact same path+name, or can it also be a duplicate file (same hash) in a different folder?

  2. What does the warning "ignoring" mean? Does it mean it copies the file anyway or does it skip copying the 'second/duplicate' file?

  3. Assuming that the answer to #1 is "same path+name", I could run rclone dedupe --dedupe-mode newest. However, the dedupe docs say:
    "dedupe considers files to be identical if they have the same hash."
    So that would mean running dedupe could result in deleting duplicate files that do NOT have the same path+name, but DO have the same file-hash?

My goal is to make as few modifications to the database structure as possible. So only in the case of a duplicate path+filename, I'd like to keep the newest date/version of the file.
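For clarity, here is a small sketch (not rclone's actual code; file names and dates are made up) of the "newest wins" selection that --dedupe-mode newest applies when several remote objects share the exact same path+name:

```python
# Illustrative sketch of "newest wins" dedupe semantics: among objects that
# share one path+name, keep the most recently modified one.
from datetime import datetime


def pick_newest(duplicates):
    """Return the object with the latest modification time."""
    return max(duplicates, key=lambda obj: obj["modtime"])


# Two objects at the same path (possible on Google Drive, unlike ext4):
objects = [
    {"path": "some/folder/somefile", "modtime": datetime(2020, 1, 1)},
    {"path": "some/folder/somefile", "modtime": datetime(2020, 9, 15)},
]

kept = pick_newest(objects)
# The 2020-09-15 copy is kept; older copies would be removed by dedupe.
```

Files at *different* paths are never grouped together here, which matches the answer below: only exact path+name duplicates are touched.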

Thanks in advance :slight_smile:

rclone v1.53.1

  • os/arch: linux/arm64
  • go version: go1.15

It is the exact same path+name.

It picks one and copies that. Which one it picks depends on the order Google lists them in.


No, dedupe will only delete files with the same path+name.

dedupe newest sounds like it is what you want then :slight_smile: Try it with the -i flag first to make sure it is doing what you want.
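Concretely, the suggested run might look like this (gdrive:backup is a placeholder for your own remote and path):

```shell
# Interactive first pass: rclone asks before making each change
rclone dedupe --dedupe-mode newest -i gdrive:backup

# Once you're happy with what it would do, run it for real
rclone dedupe --dedupe-mode newest gdrive:backup
```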

Okay great, thank you for clarifying. Rclone is an amazing tool :ok_hand:t3:

PS: may I suggest changing this line in the docs:

dedupe considers files to be identical if they have the same hash.


dedupe considers files to be identical if they have the same file path and the same hash.


You can propose a change yourself very easily... Go to this page, click the pencil icon in the top right corner and edit away :slight_smile:

