"backup" command status?

I’ve been using Rclone for quite some time now, with multiple jobs running automatically and smoothly. I am very satisfied with the result! Thanks to Nick and the other 114 contributors!

Humbly, I think it still needs an option to clean up old backup versions.

Something similar to the Duplicacy prune command:

$ duplicacy prune -keep 1:7       # Keep 1 snapshot per day for snapshots older than 7 days
$ duplicacy prune -keep 7:30      # Keep 1 snapshot every 7 days for snapshots older than 30 days
$ duplicacy prune -keep 30:180    # Keep 1 snapshot every 30 days for snapshots older than 180 days
$ duplicacy prune -keep 0:360     # Keep no snapshots older than 360 days


I’ve seen the backup command discussed here in the forum a few times. Is there an ETA for it?

Or even something simpler, similar to the prune command above?

I’m going to see if I can do the backup command for the next release…

The backup command will need some policy… What I’m planning is that, once you set it up, it backs up to a directory structure like this:

--- current
--- 2018-01-01 00:00:00
--- 2018-01-02 00:00:00

It will use --backup-dir to fill in the dated directories.

It will then have some pruning facility similar to duplicacy.
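Until that lands, the dated layout could be approximated with today's rclone. This is only a sketch of the idea: `remote:backup` and `/data` are made-up names, not anything rclone defines.

```shell
#!/bin/sh
# Sketch only: "remote:backup" and /data are assumed names.
# Each run syncs into "current"; files changed or deleted since the last
# run are moved by --backup-dir into a directory named after the run time.
STAMP=$(date -u +"%Y-%m-%d %H:%M:%S")
CMD="rclone sync /data remote:backup/current --backup-dir \"remote:backup/$STAMP\""
echo "$CMD"
```

Run from cron, each invocation would add one more dated snapshot directory next to `current`, which is exactly the structure above.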

I’d also like to do a combined mode where you can look in the snapshots and see all the files, not just the ones that were overwritten on that day.

Is that anything like what you were thinking of?


This would be a good approach!

I already use the --backup-dir in a similar way:

rclone sync local_folder remote:remote_folder --backup-dir remote:remote_folder_versions/%backup_id%-%date%-%hour%

So I have a folder (named backup_id-date-hour) for the changed files in each backup job.
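Those placeholders look like Windows batch variables; on a POSIX shell the same scheme could be built with `date`. This is a hedged sketch, and `docs` is just an example job name:

```shell
#!/bin/sh
# POSIX-shell equivalents of the %backup_id%, %date% and %hour%
# placeholders from the command above; "docs" is an example job name.
BACKUP_ID="docs"
DATE=$(date -u +%Y-%m-%d)
HOUR=$(date -u +%H)
VERSION_DIR="remote:remote_folder_versions/${BACKUP_ID}-${DATE}-${HOUR}"
echo "rclone sync local_folder remote:remote_folder --backup-dir $VERSION_DIR"
```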

Yes! That’s the point.

I’m not sure that would be good logic. When I have a problematic file, I ask myself, “Did this file change yesterday, or the day before?” So I go into the folder for that date and see only the changed files. I don’t know if it would be nice to see all the files in that folder for the respective date. Wouldn’t that be confusing?

On the other hand, this approach would give you the option of “Let’s restore this whole folder as it was five days ago”, which is very interesting for interrelated files (related databases, project files, etc.).

Other situations:
“I lost everything!” - Ok, just download the latest snapshot (the “--- current” folder above);
“This file is corrupted!” - Ok, let’s look for an uncorrupted version from the last few days.

(There is a “suboption” for the second: “I mistakenly deleted a file and only saw it now!” :smile:)
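Mapped onto the dated layout sketched earlier, those situations could look like the following. To be clear, `remote:backup` and the paths are hypothetical names, not an existing rclone feature:

```shell
#!/bin/sh
# Hypothetical commands against the dated layout; names are assumptions.
# "I lost everything!" -> pull the complete latest state back down:
RESTORE_ALL="rclone copy remote:backup/current /restored"
echo "$RESTORE_ALL"
# "This file is corrupted!" (or was deleted by mistake) ->
# list a dated snapshot dir to look for older copies:
CHECK_OLD="rclone lsl \"remote:backup/2018-01-01 00:00:00/\""
echo "$CHECK_OLD"
```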

Duplicacy has an interesting approach that I think covers all the situations above.

Let’s say you have a backed-up folder at its 20th version (which Duplicacy calls a “revision”).

If you want to return the entire folder to how it was 5 days ago (the 15th revision), do:

duplicacy restore -r15 .......

If you want to restore only one file (or folder) to the 15th revision, use an include pattern:

duplicacy restore -r15 myproject.mpx ........

So, if we are going to have a “backup” command, it will have to ship together with its sibling command, “restore”.
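Until such a pair exists, a single-file restore against the dated --backup-dir layout could be approximated by copying in the reverse direction. The snapshot name, remote, and file below are all assumptions, the rough analogue of `duplicacy restore -r15 myproject.mpx`:

```shell
#!/bin/sh
# Hypothetical single-file restore from a dated snapshot directory;
# snapshot name, remote and file name are all assumptions.
SNAPSHOT="2018-01-01 00:00:00"
CMD="rclone copy \"remote:backup/$SNAPSHOT/myproject.mpx\" ./restored/"
echo "$CMD"
```

The catch is that a dated directory only holds the files that changed on that run, so a real restore command would need to walk back through the snapshots to find the right version, which is where the “combined mode” idea above comes in.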

It might be confusing… It would be reasonably tricky to implement too. I enjoyed using a similar feature in restic though, which provides a read-only FUSE mount containing all the files in each snapshot.

Thanks for your thoughts on restore - I hadn’t really thought too much about that! Though I think the FUSE mount might be the most convenient way of restoring stuff.
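For reference, the restic feature mentioned above works like this (the repository and mount point paths are assumptions):

```shell
#!/bin/sh
# restic's read-only FUSE mount: every snapshot appears as a complete
# directory tree under the mount point. Paths here are assumptions.
MOUNT_CMD="restic -r /srv/restic-repo mount /mnt/restic"
echo "$MOUNT_CMD"
# Once mounted, snapshots can be browsed like ordinary directories:
#   /mnt/restic/snapshots/latest/...
#   /mnt/restic/snapshots/<timestamp>/...
```

Restoring then becomes an ordinary `cp` out of the mount, which sidesteps the need for a dedicated restore command.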

This would be a very good and necessary feature if you want to restore all files from a previous date. Especially handy for restoring backups after e.g. ransomware attacks.
Microsoft (and probably the others too) also believe in this feature: https://fossbytes.com/onedrive-files-restore-recover-deleted-files/