Rclone sync with Duplicate object

Rclone version: rclone v1.46

I’m trying to sync Google Drive to WebDAV (a WebDAV flash account; there is no data on it, and this is my first time syncing).

I’m using this command:

--tpslimit 3 --transfers 3 --retries 80 --delete-before --checksum --drive-chunk-size 32M -vvvv sync
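For reference, the full form of that command would look something like this. The remote names `gdrive:` and `webdav:` are placeholders for whatever remotes are configured in `rclone config`:

```shell
# Sketch of the complete invocation with the flags above.
# -vv gives debug-level logging (rclone's documented maximum useful level).
rclone sync gdrive: webdav: \
  --tpslimit 3 --transfers 3 --retries 80 \
  --delete-before --checksum --drive-chunk-size 32M -vv
```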

But I have found that there is an error: “Duplicate object found in source - ignoring”.

I have also double-checked the destination; there is no data in that directory. Any idea how to fix it?

Running rclone dedupe on the Google Drive to fix the duplicate file names is probably the best course of action.
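For example, with `gdrive:` as a placeholder for your Google Drive remote (preview first, since dedupe modifies the source):

```shell
# Preview what dedupe would do without changing anything.
rclone dedupe --dry-run gdrive:

# Then fix duplicates non-interactively, e.g. by renaming them
# so every object gets a unique name.
rclone dedupe rename gdrive:
```

Other dedupe modes (e.g. `newest`, `oldest`, `first`) keep one copy and delete the rest, so `rename` is the safest non-interactive choice if you aren't sure the duplicates are identical.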

Rclone has a bug where it treats a file and a folder with the same name as duplicates. Any ideas on how to stop this from happening? For example:
/fileshare/00001 (folder)/ other_file_within_folder
/fileshare/00001 (file)

Uhm, how do you get into that scenario? It shouldn't be possible to have a directory and a file with the same name.

I'm not talking about a traditional file system. I'm talking about a cloud object store. The key is what makes an object unique in an object store, so this is allowed on AWS, Google, RAX, etc.


/fileshare/db/001 (001 is the object within the db folder)
/fileshare/db (object)
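A quick sketch of why this clashes: an object store uses a flat key namespace, so the keys `fileshare/db` and `fileshare/db/001` can coexist, but mapping them onto a local filesystem forces `db` to be both a file and a directory at once. Recreating the two example keys locally shows the collision:

```shell
# Key "fileshare/db/001" forces "db" to exist as a directory.
mkdir -p demo/fileshare/db
echo data > demo/fileshare/db/001

# Now try to store the key "fileshare/db" as a plain file:
echo data 2>/dev/null > demo/fileshare/db \
  || echo "collision: fileshare/db is already a directory"
```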

Can you start a new thread? That isn't related to this one, as the OP was talking about syncing Google Drive to WebDAV.

Please include your rclone version, logs with -vv on, and an example command.
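Something along these lines, with your real remote names substituted and whatever flags you're actually using:

```shell
# Version info to paste into the new thread.
rclone version

# Run the failing command with debug logging captured to a file.
rclone sync source: dest: -vv --log-file rclone.log
```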