Force copy duplicates in Google Drive

What is the problem you are having with rclone?

I got a "Duplicate directory found in source - ignoring" error in Google Drive. How can I force rclone to copy the duplicates?
The source is read only.

Run the command 'rclone version' and share the full output of the command.

rclone v1.58.1

  • os/version: ubuntu 18.04 (64 bit)
  • os/kernel: 5.4.188+ (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.17.9
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy work:Megalink colab:abc/Megalink -v --drive-copy-shortcut-content --drive-skip-dangling-shortcuts --drive-skip-gdocs --disable copy

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

2022/06/24 07:11:55 NOTICE: P: Duplicate directory found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20201216_215605.jpg: Duplicate object found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20201219_215604.jpg: Duplicate object found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20210112_130241.jpg: Duplicate object found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20210112_130242.jpg: Duplicate object found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20210304_115252.jpg: Duplicate object found in source - ignoring
2022/06/24 07:12:16 NOTICE: Hihi/20210304_115252.jpg: Duplicate object found in source - ignoring

When you have duplicates, which copy gets picked up is a bit random, based on what the API returns for what is being copied. There isn't a way I'm aware of to force a copy. Your best bet is to clean up the source, or ask the owner to clean it up.

If you can find the ID of the duplicate directory, you can use it as the root folder ID with --drive-root-folder-id to copy it directly.
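As a sketch of what that would look like (the folder ID below is a placeholder, and the remote names are taken from your command above):

```shell
# The folder ID is the long string at the end of the URL when the
# duplicate directory is open in the Drive web UI, e.g.
#   https://drive.google.com/drive/folders/<FOLDER_ID>
#
# With --drive-root-folder-id set, "work:" resolves to that directory
# itself, so no path is needed after the colon:
rclone copy work: colab:abc/Megalink/P -v --drive-root-folder-id 1AbCdEfGhIjKlMnOpQrStUv
```

Run this once per duplicate directory, with each directory's own ID, to pull in the copies that were otherwise skipped.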

Otherwise, running rclone dedupe on the source is the recommended fix. Note that dedupe modifies the remote, so with a read-only source the owner would have to run it.
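For reference, that would look something like this (using the source remote from your command; whoever runs it needs write access):

```shell
# Interactively resolve duplicate files and merge duplicate directories:
rclone dedupe work:Megalink

# Or use a non-interactive strategy, e.g. keep only the newest copy:
rclone dedupe --dedupe-mode newest work:Megalink
```

Other --dedupe-mode values include first, oldest, largest, smallest, and rename; rename keeps every copy by giving each a unique name, which is the safest option if you are not sure the duplicates are identical.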