Sync with --backup-dir output strangeness

What is the problem you are having with rclone?

I do not understand the output behavior while using sync with the --backup-dir flag.

  1. Made a directory RemoveAfterSync
  2. Placed a file named test in it so that it would not be empty and would be included in the sync
  3. Ran rclone sync /Volumes/Data /Volumes/zeit/twh-data --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y) --ignore-errors --ignore-checksum -P
  4. First full sync of 15G took about 40 minutes
  5. Removed the RemoveAfterSync directory from the source /Volumes/Data
  6. Ran the sync command again.
  7. This time it took only 4 minutes
  8. The Archive directory and the dated sub-directory were created.
  9. The RemoveAfterSync directory is now in the Archive/04-25-20/ directory
  10. BUT so are several others:
    GoogleDrive
    iTerm2-Color-Schemes
    Nextcloud
    OneDrive
    RemoveAfterSync
    

No changes or deletions were made in those other directories. (A minimal repro of the steps above is sketched below.)
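
For reference, here is a minimal local-to-local repro of the steps above, using throwaway /tmp paths in place of my real volumes:

  # throwaway source and destination
  mkdir -p /tmp/src/RemoveAfterSync /tmp/dst
  echo hello > /tmp/src/RemoveAfterSync/test

  # first sync copies RemoveAfterSync to the destination
  rclone sync /tmp/src /tmp/dst

  # remove the directory from the source, then sync again with --backup-dir;
  # rclone moves it out of the destination into the dated archive
  rm -r /tmp/src/RemoveAfterSync
  rclone sync /tmp/src /tmp/dst --backup-dir=/tmp/archive/$(date +%m-%d-%Y) -v
  ls /tmp/archive/$(date +%m-%d-%Y)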

What is your rclone version (output from rclone version)

rclone v1.51.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

macOS 10.15.4

Which cloud storage system are you using? (eg Google Drive)

local to local, no remote

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync /Volumes/Data /Volumes/zeit/twh-data --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y) --ignore-errors --ignore-checksum -P

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

that is expected behaviour using sync and --backup-dir.

if you remove RemoveAfterSync from /Volumes/Data,
then re-run the sync
rclone will move RemoveAfterSync from /Volumes/zeit/twh-data to /Volumes/zeit/Archive/04-25-20/
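
note that --backup-dir receives files that were overwritten as well as files that were deleted. a quick way to see that with scratch directories (the paths here are just examples):

  mkdir -p /tmp/src /tmp/dst
  echo v1 > /tmp/src/file.txt
  rclone sync /tmp/src /tmp/dst

  # change the file and sync again; the old copy is moved
  # into the backup dir instead of being overwritten in place
  echo v2 > /tmp/src/file.txt
  rclone sync /tmp/src /tmp/dst --backup-dir=/tmp/archive/04-25-20 -v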

I was expecting, from my understanding, that RemoveAfterSync would be there, but not the others listed below it.

Thanks

in those other folders in /Volumes/Data, are you sure that no files changed?
what, exactly, is in those folders under Archive/04-25-20/?

when testing sync, which can lead to the loss of files, you should use --dry-run, keep a log file with debug info, and remove --ignore-errors
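
for example, something like this (the log file path is just a suggestion):

  rclone sync /Volumes/Data /Volumes/zeit/twh-data --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y) --dry-run -vv --log-file=/tmp/rclone-test.log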

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp )

Yes, there must have been some changes in those files previously, because now I can get it to run as expected.

Is there a specific order for running sync with so many flags and arguments? I notice that if a stray whitespace gets in, it complains about too many arguments.
I added in a filter file and now the command has grown:

rclone sync /Volumes/Data/ /Volumes/zeit/twh-data --filter-from rc-data-filters --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y)
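
For anyone curious, a filter file takes one rule per line, and the first match wins. The rules below are illustrative examples, not my actual rc-data-filters:

  # rc-data-filters - skip junk, include everything else
  - .DS_Store
  - .Trashes/**
  + **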

I went down this rabbit hole after reading your suggestion in another post to use this as a "forever forward incremental backup". I was looking for a good way to use rclone for backups.
This seems like a good way to make a backup and an archive of diffs.
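
As I understand the pattern, the whole thing reduces to a small daily wrapper, something like this rough sketch (the log file path is my own guess):

  #!/bin/sh
  # twh-data holds the current mirror; each day's changed or
  # deleted files land in a dated archive directory
  rclone sync /Volumes/Data /Volumes/zeit/twh-data --filter-from rc-data-filters --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y) --log-file=/tmp/rclone-backup.log -v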

glad you found my other post about forever forward incremental backup

"it will complain about too many arguments."
you need to post the exact command that is causing the problem and the exact error message.

Thanks for the quick reply. I'm going to write off the too many arguments issue to copy and paste sloppiness and whitespace. I see now in VS Code that, with word wrap on, it creates a whitespace at the end of the line.
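
In case it helps anyone else, splitting the command across lines with trailing backslashes sidesteps the word-wrap whitespace problem:

  rclone sync /Volumes/Data/ /Volumes/zeit/twh-data \
    --filter-from rc-data-filters \
    --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y)
  # note: nothing may follow a trailing backslash, not even a space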

What are the pros/cons of using copy vs sync in this scenario?

Thanks

you seem to want to use rclone for local to local, something i would not do.

copy and sync are very different and it depends on what you want from rclone.
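
roughly speaking (a simplified example, check the docs for the details):

  # copy: adds and updates files in the destination, never deletes anything there
  rclone copy /Volumes/Data /Volumes/zeit/twh-data

  # sync: makes the destination identical to the source, deleting extras;
  # with --backup-dir those extras are moved aside instead of lost
  rclone sync /Volumes/Data /Volumes/zeit/twh-data --backup-dir=/Volumes/zeit/Archive/04-25-20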

What would be your preferred way to back up locally? I'm mostly using it now to learn the ins and outs of rclone. Would plain rsync be a better option for local backups? I'm on both macOS and Linux platforms.
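
For what it's worth, my guess at the rough rsync equivalent of the rclone command above would be (untested on my end):

  # rsync has its own --backup-dir, so the pattern carries over
  rsync -a --delete --backup --backup-dir=/Volumes/zeit/Archive/$(date +%m-%d-%Y) /Volumes/Data/ /Volumes/zeit/twh-data/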

cheers,

well, this is what i do, but i am a windows user.

one of the many issues with local backups using rclone is how it deals with open files and file permissions.

i only use rclone to get other backup files to the cloud.

i have 350+ lines of python code to control all my various backups.

on my local computer, i use veeam agent for a bare metal type of backup; it only backs up the parts of files that change between backups. and if my entire computer died, i could recover it using those veeam backups.
veeam produces one file per day and that file ends up on my local home server.
so veeam is like rsync on steroids.

then for certain folders on my local computer, that script creates a password-protected 7zip file, which also ends up on my home server.

that script also runs on the server and copies those veeam backup files and 7zip files to the cloud using rclone.
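
the rclone leg of it is nothing fancy, roughly this (the archive name, password variable, and the cloud-remote: name are made up for the example):

  # password-protected 7zip of a folder, then push the backups to the cloud
  7z a -p"$BACKUP_PASS" /backups/docs-$(date +%Y%m%d).7z /data/docs
  rclone copy /backups cloud-remote:backups -v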

i also use that script at work, just as i use it at home.

Gotcha. So my takeaway from what you're saying here: rclone is best suited to the "second phase" of backup, moving files to the cloud for the offsite portion of one's backup, and NOT to the first line of snapshot backups.

Thanks

yes, for me, that is correct.

for you, for others, maybe that is not the way to go.
