How do I change to root folder sync from folder by folder sync?

What is the problem you are having with rclone?

I was syncing directories on two drives one at a time, by name, with commands like rclone sync /mnt/data/dir1 remote:dir1, driven by a script. Now I want to clean this up and simplify it by syncing both drives whole, with a few directories excluded. I have tried something like rclone sync /mnt/data remote: --exclude dir2, but it wants to delete everything. The directories are named the same on the remote as they are locally, so why wouldn't rclone recognize they are the same? Is there any way to make it see them as the same?

I really think this is more a case of me using rclone wrong, so I do not think the config or logs will be helpful. I will post them if needed.

Run the command 'rclone version' and share the full output of the command.

rclone v1.50.2
- os/arch: linux/amd64
- go version: go1.13.6

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

stated above

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

hello and welcome to the forum,

that version of rclone is several years old,
please update to the latest stable, v1.58.0

help us to help you and provide answers to all the questions.

I upgraded and same problem. I am using a crypt remote if that makes any difference.

I really do not want to upload a log with all of my files listed. What other options are there?

I don't know.

You have not listed what command you are running in detail.
No rclone.conf
No log file.

How do you suggest we help with no details?

Let me see if I understand...

You are doing this

rclone sync /mnt/data/dir1 remote:dir1
rclone sync /mnt/data/dir2 remote:dir2

But you'd like to do this with a single rclone command?

rclone sync /mnt/data/ remote: + magic flags

Something like this is the equivalent of the two rclone commands

rclone sync /mnt/data/ --include "/dir1/**" --include "/dir2/**" remote:

Test first with --dry-run
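Put together, the combined command with a dry run first might look like this (the paths and remote name follow the examples above; the filter patterns are the part to adapt):

```shell
# Preview the combined sync without changing anything.
# The leading "/" anchors each pattern at the root of the sync, so
# only dir1 and dir2 are considered; everything else is ignored and
# nothing outside those directories can be deleted on the remote.
rclone sync /mnt/data/ remote: \
  --include "/dir1/**" \
  --include "/dir2/**" \
  --dry-run -v
```

Once the dry-run output looks right, drop --dry-run to run it for real.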

Is there a way to just list the files it is going to copy with paths and not all the deletions?

Added some info to the original post. I'm still leery of posting logs, though.

I can post the top of the log here as it seems to have disabled editing on my original post.

2022/03/29 05:35:56 DEBUG : rclone: Version "v1.58.0" starting with parameters ["rclone" "sync" "--dry-run" "--delete-during" "-vv" "--drive-chunk-size=256M" "--retries" "2" "--retries-sleep" "10m" "-v" "--log-file" "/tmp/rclone-cron.log" "/mnt/media/" "phire-crypt:" "--exclude" "/hbdata/" "--exclude" "/Audio/" "--exclude" "/TMBackup/" "--exclude" "/nextcloud/" "--exclude" "feature length/" "--exclude" ".Trash-1000/" "--exclude" ".Trash-999/" "--exclude" "/whonix/"]
2022/03/29 05:35:56 DEBUG : Creating backend with remote "/mnt/media/"
2022/03/29 05:35:56 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2022/03/29 05:35:56 DEBUG : Creating backend with remote "phire-crypt:"
2022/03/29 05:35:56 DEBUG : Creating backend with remote "ntpu-phire:secure"
2022/03/29 05:35:56 DEBUG : ntpu-phire: detected overridden config - adding "{A6J6b}" suffix to name
2022/03/29 05:35:56 DEBUG : fs cache: renaming cache item "ntpu-phire:secure" to be canonical "ntpu-phire{A6J6b}:secure"
2022/03/29 05:35:56 DEBUG : Waiting for deletions to finish
2022/03/29 05:35:56 DEBUG : .Trash-999: Excluded
2022/03/29 05:35:56 DEBUG : Audio: Excluded
2022/03/29 05:35:56 DEBUG : TMBackup: Excluded
2022/03/29 05:35:56 DEBUG : hbdata: Excluded
2022/03/29 05:35:56 DEBUG : nextcloud: Excluded
2022/03/29 05:35:56 DEBUG : whonix: Excluded
2022/03/29 05:35:56 DEBUG : .Trash-1000: Excluded
2022/03/29 05:35:56 DEBUG : whonix: Excluded

Also, I noticed two interesting lines

2022/03/29 05:35:57 NOTICE: Documents/MMS instructions.docx: Duplicate object found in destination - ignoring

and then a little later it tries to delete it anyway?

2022/03/29 05:35:57 NOTICE: Documents/MMS instructions.docx: Skipped delete as --dry-run is set (size 8.902Ki)

rclone ls will just list files, without copying or deleting.

I'm sorry. I meant that I want it to list the files it wants to create on the remote, so I can compare them to what is already there. Although I suspect the names are the same and something else is going on.

you already have that in your command, by using --dry-run
check the log output,
here rclone is showing that it would delete that file, but does not, since --dry-run is set:
Skipped delete as --dry-run is set
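If the goal is a list of only the files that would be copied, with no deletion noise, one option (my suggestion, not something from the thread) is rclone check with --missing-on-dst:

```shell
# Write to missing.txt the paths that exist locally but are missing
# on the remote, i.e. exactly the files a sync would upload.
# --one-way skips the reverse comparison (extra files on the remote).
rclone check /mnt/data remote: --missing-on-dst missing.txt --one-way
```

The same filter flags (--include/--exclude) can be appended so the check covers the same file set as the sync.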

rclone does not work well with duplicates, might need to run rclone dedupe
from the documentation
"Duplicated files cause problems with the syncing and you will see messages in the log about duplicates.
Use rclone dedupe to fix duplicated files"

I read that part in the logs, but it sounded like it was talking about files with the same name on the DESTINATION having nothing to do with the source.

Also, what makes the difference between a file that has already been synced (and needs to be updated or marked as already transferred) and a "duplicate" file? Is there some record keeping that I might be able to fiddle with?

OK, I tried dedupe interactive and it is saying that there really are duplicate objects of some files. That would screw things up for sure. I will let it run without interactive and then run the sync with dry-run again.

rclone sync will delete files in the dest, if the corresponding file does not exist in the source.

gdrive allows two or more files with the same filename, whereas local file systems do not.
this is a problem for rclone, so you should clean up the duplicate files in gdrive using rclone dedupe
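A sketch of the cleanup (remote: stands in for the Google Drive remote; the choice of newest as the keep-rule is an assumption, pick whichever mode suits):

```shell
# Non-interactive cleanup: for each set of duplicates, keep the
# newest copy and delete the rest. Preview with --dry-run first.
rclone dedupe --dedupe-mode newest remote: --dry-run

# Then run for real once the preview looks right.
rclone dedupe --dedupe-mode newest remote:
```

Without --dedupe-mode, rclone dedupe runs interactively and asks about each duplicate set, which is what made it tedious here.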

I will find a way to delete the duplicates without going insane or getting carpal tunnel.

The duplicates got there from the underlying problem I am trying to solve though. The first time I ran the sync with the root folder instead of the individual folders, it treated them as new files instead of "checking" them and seeing they are the same as the ones already there. I need to know how to force it to see those files as the same as the source files (since they have the same names after all).

rclone dedupe

Oh sorry, I meant because it was interactive and I thought I had a lot of dupes. I found that it has an automatic setting though.

Thanks for your help.

dedupe did not fix the issue.

fix which issue?

OK all. I think I know what is happening. I have two drives that I am syncing to this remote. Before, I was syncing each directory individually with no problems, but now when I sync the first drive it tries to delete all the folders from the second drive, since they don't exist on the first.

Is there a way to sync multiple source directories to remote in the same command?

--include or --include-from

I was able to get it working by moving the folders into folders named after each drive and syncing each drive to its folder.
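For anyone landing here later, the per-drive layout described above might look like this (the drive1/drive2 folder names are illustrative; /mnt/data and /mnt/media are the two source drives from the thread):

```shell
# Each drive syncs into its own top-level folder on the remote, so
# one drive's sync never sees (or deletes) the other drive's files.
rclone sync /mnt/data  remote:drive1 --dry-run -v
rclone sync /mnt/media remote:drive2 --dry-run -v
```

This sidesteps the filter bookkeeping entirely: each sync's scope is bounded by its destination folder rather than by include/exclude rules.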

Thanks for all the help.


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.