Skip uploading a file if the same path/file is currently being uploaded from another machine

What is the problem you are having with rclone?

rclone duplicates a file if it is uploaded from two different sources/machines at the same time. If a file with the same path/name is already being uploaded, I would like rclone to skip that file. Is that possible in any way?

Run the command 'rclone version' and share the full output of the command.

Both machines are running the exact same version:

rclone v1.58.1
- os/version: ubuntu 18.04 (64 bit)
- os/kernel: 4.15.0-180-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.17.9
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Encrypted Google team drive (crypt remote on top of a shared drive)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

Machine 1

rclone copy /home/user/test.mkv mydrive_secret: --drive-stop-on-upload-limit -vv -P

Machine 2 (2 minutes after machine 1 is started)

rclone copy /home/user/test.mkv mydrive_secret: --drive-stop-on-upload-limit -vv -P

And when both are done, I run:

rclone ls mydrive_secret:

And the output is:

8852367382 test.mkv
8852367382 test.mkv

The rclone config contents with secrets removed.

[mydrive]
type = drive
client_id = REMOVED
client_secret = REMOVED
scope = drive
token = {"access_token":"REMOVED","token_type":"Bearer","refresh_token":"REMOVED","expiry":"2022-06-10T16:03:59.683621041Z"}
team_drive = REMOVED
root_folder_id =

[mydrive_secret]
type = crypt
remote = mydrive:secret
password = REMOVED
password2 = REMOVED

A log from the command with the -vv flag

Both machines produce a similar log file:

2022/06/10 15:48:25 DEBUG : rclone: Version "v1.58.1" starting with parameters ["rclone" "copy" "/home/user/test.mkv" "mydrive_secret:" "--drive-stop-on-upload-limit" "-vv" "-P" "--log-file=rclone.log"]
2022/06/10 15:48:25 DEBUG : Creating backend with remote "/home/user/test.mkv"
2022/06/10 15:48:25 DEBUG : Using config file from "/home/user/.config/rclone/rclone.conf"
2022/06/10 15:48:25 DEBUG : fs cache: adding new entry for parent of "/home/user/test.mkv", "/home/user"
2022/06/10 15:48:25 DEBUG : Creating backend with remote "mydrive_secret:"
2022/06/10 15:48:25 DEBUG : Creating backend with remote "mydrive:secret"
2022/06/10 15:48:25 DEBUG : mydrive: detected overridden config - adding "{-LSY5}" suffix to name
2022/06/10 15:48:25 DEBUG : fs cache: renaming cache item "mydrive:secret" to be canonical "mydrive{-LSY5}:secret"
2022/06/10 15:48:26 DEBUG : test.mkv: Need to transfer - File not found at Destination
2022/06/10 15:48:26 DEBUG : REMOVED-HASH: Sending chunk 0 length 8388608
2022/06/10 15:48:27 DEBUG : REMOVED-HASH: Sending chunk 8388608 length 8388608
[... intermediate "Sending chunk" lines omitted ...]
2022/06/10 16:06:56 DEBUG : REMOVED-HASH: Sending chunk 8849981440 length 4547206
2022/06/10 16:06:58 DEBUG : test.mkv: md5 = REMOVED-HASH OK
2022/06/10 16:06:58 INFO  : test.mkv: Copied (new)
2022/06/10 16:06:58 INFO  : 
Transferred:   	    8.246 GiB / 8.246 GiB, 100%, 6.984 MiB/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:     18m33.1s

2022/06/10 16:06:58 DEBUG : 5 go routines active

Technically, I would not say rclone did that; you did 🙂 Google Drive allows duplicate file names, and if you upload two files with the same name at the same time, this is what you get.

Other than "don't do that", I'm not sure what else you can do, unless you keep track across your servers of what is being uploaded and exclude those files on the other machine.
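To sketch that "keep track" idea: rclone has no built-in cross-machine locking, but you could use a marker file on the remote itself as an advisory lock before uploading. This is a minimal sketch, not a tested solution — the `locks/` folder, the function name, and the lock naming are all made up for illustration, and it only uses standard rclone subcommands (`lsf`, `touch`, `copy`, `deletefile`):

```shell
#!/usr/bin/env bash
# Advisory cross-machine "lock" via a marker file on the remote.
# Hypothetical sketch: the locks/ folder and function name are invented.
set -eu

upload_with_lock() {
  local src="$1" remote="$2" status=0
  local name lock
  name=$(basename "$src")
  lock="$remote/locks/$name.lock"

  # If the marker exists, another machine is (probably) uploading this file.
  if rclone lsf "$lock" >/dev/null 2>&1; then
    echo "SKIP: $name is being uploaded elsewhere"
    return 0
  fi

  rclone touch "$lock"                    # take the lock
  rclone copy "$src" "$remote" || status=$?
  rclone deletefile "$lock"               # release it even if the copy failed
  return "$status"
}
```

You would call it as `upload_with_lock /home/user/test.mkv mydrive_secret:` on both machines. Be aware the check-then-create is not atomic: if both machines start within the window between the `lsf` check and the `touch`, you can still end up with duplicates, so a periodic dedupe remains a useful safety net.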

Depending on how often you hit the issue, you could script a dedupe to clean it up.

Damn, okay thank you for your reply

If I were to dedupe the files, how do I delete only one and leave the other? If I just run "rclone delete mydrive_secret:test.mkv", it deletes both files.

rclone dedupe

Would be the command. There are a few options to it, but I'd do a bit of testing first to make sure it's doing what you want.

Important: Since this can cause data loss, test first with the --dry-run or the --interactive/-i flag.
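For reference, a few hedged examples of what that testing could look like, using the remote name from above and the documented `rclone dedupe` modes:

```shell
# Preview what dedupe would do, without changing anything:
rclone dedupe --dedupe-mode newest mydrive_secret: --dry-run -v

# Decide per duplicate set interactively (dedupe's default mode):
rclone dedupe mydrive_secret: --interactive

# Or keep one copy and rename the rest instead of deleting:
rclone dedupe rename mydrive_secret:
```

`--dedupe-mode newest` keeps the most recently modified copy; since both uploads here are byte-identical, any mode that keeps one copy would do.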

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.