- rclone copy never deletes any files from the destination
- rclone sync deletes all files on destination that do not exist on the source
Here’s a problem.
- If I use copy, my client's 40GB virtual machine (living inside a folder) is backed up every single day, creating a huge online backup very quickly, i.e. 20 VMs and counting…
- If I use sync, it leaves them vulnerable to encryption viruses. For instance, if they go home at 5:00PM and an encryption virus encrypts files for 5 hours, and then our offsite backup 'syncs' at 2:00AM, their offsite backup is toast!!
So my question is this: is there a way to use the sync command with a specific target directory, for instance Folder_1?
That way I can sync to Folder_2 on the second day and back to Folder_1 on the third. This will ensure we always have a 24-hour buffer against encryption viruses yet still get every day backed up.
I am using Backblaze if this matters in any way but I suspect not.
What sayeth the group?
Use something like --backup-dir sameremote:/someplace/
That is, if your remote supports moving files.
Every changed or removed file is moved to the backup dir, so you basically have an incremental backup: scalable, very easy to organize, and easy to prune if needed.
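A minimal sketch of how that might look in a daily script (the paths and the `remote:` name are assumptions, and the command is echoed rather than executed so the sketch is safe to run as-is):

```shell
#!/bin/sh
# Sketch of the --backup-dir idea: anything sync would overwrite or
# delete on remote:current gets moved into a dated archive directory
# on the same remote instead of being lost.
STAMP=$(date +%Y-%m-%d)

# Echo instead of executing; remove the echo once paths are real:
echo "rclone sync /path/to/data remote:current --backup-dir remote:archive/$STAMP"
```

Each day's run lands displaced files under a fresh `remote:archive/YYYY-MM-DD` directory, which is what makes old days easy to prune.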
Backblaze doesn't support moving files, so I wouldn't use --backup-dir there, but it is a good suggestion elsewhere.
The only way I can think to do it is, as part of your bash script, to have a different target per day of the week or something, rotating through by appending the day of the week to the end of the target dir.
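A rough sketch of that rotation (the bucket name and source path are hypothetical, and the command is echoed rather than executed so the sketch is harmless to run):

```shell
#!/bin/sh
# Day-of-week rotation: date +%u gives 1 (Mon) through 7 (Sun),
# so seven targets rotate and any given folder is only overwritten
# once a week, leaving a buffer of older snapshots.
DOW=$(date +%u)
TARGET="b2:mybucket/Folder_$DOW"

# Echo instead of executing; remove the echo once paths are real:
echo "rclone sync /path/to/data $TARGET"
```

For the two-folder version in the original question, `$(( $(date +%j) % 2 ))` (day-of-year parity) would alternate between two targets instead of seven.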
That being said, Backblaze supports versioning if you want to let Backblaze handle it implicitly. Within the GUI you can specify a retention policy for all files backed up, and then you don't need to change anything. If you ever had to recover, rclone can see the older versions if you tell it to look for them:
$ rclone -q ls b2:cleanup-test
$ rclone -q --b2-versions ls b2:cleanup-test