rclone copy…new files added to books since the last rclone run will be copied to Google Drive.
rclone sync --min-age 14d …rclone will then sync the same folder. Because all current source files have already been transferred to the destination, any files on the destination that are older than 14 days and no longer present on the source will be deleted from the destination.
Is this correct? If not, is there a way to get this behaviour? I’m trying to find a way to back up all current files, but give myself 14 days or so to retrieve files accidentally deleted or lost, e.g. to a drive failure.
What I would do is use rclone sync with --backup-dir in step 1, then in step 2 use rclone delete --min-age 14d on the backup dir to delete any files older than 14 days.
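The two-step approach above can be sketched like this. The remote name `gdrive:` and the paths are placeholders I've made up, not anything from your setup:

```shell
#!/bin/sh
# Sketch of the suggested two-step approach; remote name and paths are
# assumptions -- adjust to your own config.
command -v rclone >/dev/null 2>&1 || { echo "rclone not found; skipping"; exit 0; }

SRC="$HOME/books"
DEST="gdrive:books"
ARCHIVE="gdrive:books-archive"

# Step 1: sync, diverting deleted/overwritten files into the archive
# instead of discarding them.
rclone sync "$SRC" "$DEST" --backup-dir "$ARCHIVE"

# Step 2: expire archived files older than 14 days, then clean up any
# directories left empty.
rclone delete "$ARCHIVE" --min-age 14d
rclone rmdirs "$ARCHIVE" --leave-root
```

One caveat: --min-age filters on the file's modification time (which rclone preserves), not on when the file landed in the archive, so a long-unmodified file can be expired from the archive on the very next run. A per-run dated backup-dir, as discussed further down the thread, avoids that surprise.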
This means you’ll get backups of deleted and overwritten files too, and keep them for 14 days.
I tried Duplicati in a Docker container and it was throwing up errors at the end of backup jobs that I couldn’t get to the bottom of, so it made me twitchy.
I’ve come across Arq before - that’ll be my go-to if I can’t get the free option to work. To be honest, rclone is working fine so far - just curious about versioning. I think I’ll create a test myself and see.
Duplicati is a beta product, so I would not put production-like data there. I use it as an additional backup in case my house catches fire; if I did lose it, it would just be annoying.
I’m using the following script to manage this. (Based on something I found online.) It creates a new backup-dir each time the script is run. I still need to add a delete call based on min-age.
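The script itself isn't reproduced here, but a per-run dated --backup-dir can be sketched as below; the remote name, paths, and timestamp format are my own assumptions:

```shell
#!/bin/sh
# Sketch: create a fresh dated backup-dir on every run. Remote name,
# paths, and the timestamp format are assumptions.
command -v rclone >/dev/null 2>&1 || { echo "rclone not found; skipping"; exit 0; }

STAMP=$(date +%Y-%m-%d_%H%M%S)
SRC="$HOME/books"
DEST="gdrive:books"
ARCHIVE_ROOT="gdrive:books-archive"

# Each run diverts deleted/overwritten files into its own dated folder,
# so multiple revisions of the same file can coexist across runs.
rclone sync "$SRC" "$DEST" --backup-dir "$ARCHIVE_ROOT/$STAMP"

# Still to add, as noted above: an expiry pass such as
#   rclone delete "$ARCHIVE_ROOT" --min-age 14d
```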
I've been using this successfully for a while now, but I'm hitting a problem where I've exceeded the Team Drive file limit. Is there any way to automatically delete files older than xx days/months from --backup-dir?
One useful thing to note about --backup-dir is that it is pretty easy to set it up with a script so that you can organize your "deleted" archive by date.
That both lets you easily find files from a specific time, and also lets you keep multiple revisions of the same file (when the backup-dir is always the same folder, only one revision is kept).
I wrote a little script for another user (on Windows/batch) that shows how this is done. Let me know if that is something you'd be interested in.
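That script was Windows batch, but the pruning side of the idea can be sketched in shell. This assumes the dated archive folders are named YYYY-MM-DD (so names sort chronologically) and uses GNU date; the remote name is made up:

```shell
#!/bin/sh
# Sketch: purge dated archive folders older than a cutoff. Assumes
# folders named YYYY-MM-DD and GNU date (on macOS use: date -v-14d).
command -v rclone >/dev/null 2>&1 || { echo "rclone not found; skipping"; exit 0; }

ARCHIVE_ROOT="gdrive:books-archive"
CUTOFF=$(date -d "14 days ago" +%Y-%m-%d)

# List top-level folders and purge any whose name sorts before the
# cutoff date (valid because YYYY-MM-DD sorts chronologically).
rclone lsf --dirs-only "$ARCHIVE_ROOT" | while read -r dir; do
    name=${dir%/}
    if expr "$name" \< "$CUTOFF" >/dev/null; then
        rclone purge "$ARCHIVE_ROOT/$name"
    fi
done
```

Purging whole dated folders also sidesteps the --min-age caveat, since the folder name records when files entered the archive rather than when they were last modified.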