Can I use min-age to retain backups for xx days?

I’m setting up my first backup job. If I run the following two commands in this order:

rclone copy /mnt/user/books gdrivebackupupload:books
rclone sync --min-age 14d /mnt/user/books gdrivebackupupload:books

will this be the result?

  1. rclone copy … new files added to books since the last rclone run will be copied to Google Drive
  2. rclone sync --min-age 14d … rclone will then sync the same folder. Because all present source files have already been transferred to the destination, any files on the destination older than 14 days that are no longer on the source will be deleted from the destination

Is this correct? If not, is there a way to get this behaviour? I’m trying to find a way to back up all current files, but give myself 14 days or so to retrieve files accidentally deleted or lost, e.g. to a drive failure.

Thanks in advance

What I would do is use rclone sync with --backup-dir in step 1, then in step 2 use rclone delete --min-age 14d on the backup dir to delete any files older than 14 days.

This will mean that you’ll get backups of deleted files and overwritten files too and keep them for 14 days.
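In command form that works out to something like this (using the remote name from your post; the old folder name is just an example):

# step 1: sync, moving anything deleted or overwritten into the backup dir instead of losing it
rclone sync /mnt/user/books gdrivebackupupload:books --backup-dir gdrivebackupupload:old
# step 2: prune the backup dir of anything older than 14 days
rclone delete --min-age 14d gdrivebackupupload:old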


Thanks, that works much better, and I’ve just seen in the documentation that I can use the folder name or a suffix to do versioning.

Thanks again
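From the docs, the suffix approach would look something like this (untested on my end; the date-based suffix is just one way to name versions):

# moved-aside copies keep their extension but gain a date tag, e.g. file.txt becomes file-2024-01-31.txt
rclone sync /mnt/user/books gdrivebackupupload:books --backup-dir gdrivebackupupload:old --suffix=-$(date +%Y-%m-%d) --suffix-keep-extension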

does this look right?

rclone sync /mnt/user/backup gdrivebackupupload:backup --backup-dir gdrivebackupupload:old --bwlimit 100k
rclone sync /mnt/user/books gdrivebackupupload:books --backup-dir gdrivebackupupload:old --bwlimit 30k
rclone delete --min-age 28d gdrivebackupupload:old --bwlimit 30k

Yes that looks good to me! Test first with --dry-run and take a look at the logs.
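For example (the log path is just a placeholder):

# --dry-run reports what would be copied, moved to old or deleted, without touching anything
rclone sync /mnt/user/backup gdrivebackupupload:backup --backup-dir gdrivebackupupload:old --dry-run -v --log-file=/tmp/rclone-dryrun.log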

Thanks will do this to make sure

Does this create versions? E.g. if file.txt already exists in gdrivebackupupload:old, does it create file(1).txt or something similar?

Thanks

I personally use Duplicati on my Linux box and Arq on my Mac; both back up to my GD.

https://www.duplicati.com/ is free but a bit touchy at times. Arq works better but can only be used on 1 GD.

I tried Duplicati in a Docker container and it was throwing up errors at the end of backup jobs that I couldn’t get to the bottom of, so it made me twitchy.

I’ve come across Arq before - that’ll be my go-to if I can’t get the free option to work. To be honest, rclone is working fine so far - just curious about versioning. I think I’ll create a test myself and see.

Duplicati is a beta product, so I would not put production-like data there. I use it as an additional backup in case my house catches fire 🙂 and if I did lose it, it would be just annoying.

I’m using the following script to manage this. (Based on something I found online.) It creates a new backup-dir each time the script is run. I still need to add a delete call based on min-age.

#!/bin/sh

# timestamp used for both the log name and the dated backup-dir
DATE=$(date +%Y_%m_%d_%H_%M_%S)
LOG_NAME=rclone_$DATE.log

# skip this run if a previous invocation of this script is still going
if pidof -o %PPID -x "$0" >/dev/null; then

echo "rclone running; aborting crontab backup…" >> /home/reid/rclone_logs/$LOG_NAME

else

# low-I/O-priority sync; deleted/changed files go to a dated old_* folder on the remote
ionice -c 3 rclone -v -v --transfers=4 --checkers=8 --exclude-from /home/reid/bin/backup-rclone-excludes.txt --backup-dir=pcloud:rclone_root/old_$DATE --log-file=/home/reid/rclone_logs/$LOG_NAME sync /home/reid pcloud:rclone_root/current_backup

fi
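For that still-missing min-age delete, something along these lines could go just before the fi (untested; the 28-day cut-off and the include pattern are only examples):

# prune files older than 28 days from the dated old_* folders, leaving current_backup alone
ionice -c 3 rclone delete pcloud:rclone_root --include "/old_*/**" --min-age 28d --log-file=/home/reid/rclone_logs/$LOG_NAME
# the emptied date folders can be tidied up afterwards with rclone rmdirs if wanted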


Hi

I've been using this successfully for a while now, but I'm hitting a problem: I've exceeded the Team Drive file limit. Is there any way to delete files older than xx days/months automatically from --backup-dir?

Thanks

Didn't he already answer that above, with rclone delete --min-age on the backup dir?

That should do exactly what you are asking for.

Duh, so Nick did, and I had it in my script, hidden at the bottom. I've reduced the number of days for a more aggressive delete policy.

One useful thing to note about --backup-dir is that it is pretty easy to set it up with a script so that you can organize your "deleted" archive by date.

That both lets you easily find files from a specific time, and also lets you keep multiple revisions of the same file (when the backup-dir is all in the same folder it will only keep one).
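In shell terms the core of it is just a timestamp in the --backup-dir path, something like this (remote and paths are placeholders):

# each run gets its own archive folder, so older revisions never overwrite each other
STAMP=$(date +%Y-%m-%d_%H%M)
rclone sync /path/to/source remote:current --backup-dir remote:archive/$STAMP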

I wrote a little script for another user (on Windows/batch) that shows how this is done. Let me know if that is something you'd be interested in.