Backup Versioning

Hello,

#### What is the problem you are having with rclone?

How can I version my backups?

For example, an initial upload and then incremental backups for the changed data. With versioning I could access older states of the data, e.g. daily, weekly, monthly, ...

#### Run the command 'rclone version' and share the full output of the command.

Are you on the latest version of rclone? You can validate by checking the version listed here: Rclone downloads
```
/data # rclone version
rclone v1.62.2
- os/version: alpine 3.17.2 (64 bit)
- os/kernel: 5.19.17-Unraid (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.20.2
- go/linking: static
- go/tags: none
/data #
```

#### Which cloud storage system are you using? (eg Google Drive)

A Nextcloud instance with SFTP access.

Many Thanks!

Greetings

Revan335

rclone is not a backup program, but you can stitch together a bit of a script:

```
rclone sync ~/source remote:files/current --backup-dir=remote:files/archive/`date +%Y%m%d.%I%M%S`
```
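
As a rough sketch (assuming a remote named `remote:` and a source directory of `~/source`, both placeholders), that one-liner could be wrapped in a small script so each run files its changes under a dated archive directory:

```bash
#!/bin/sh
# Sketch of a versioned backup wrapper around rclone sync.
# "remote:" and the paths below are placeholders - adjust to your setup.

SRC="$HOME/source"                  # data to back up
CURRENT="remote:files/current"      # mirror of the current state
STAMP="$(date +%Y%m%d.%H%M%S)"      # one archive folder per run (24h clock)

# Changed and deleted files are moved into the dated archive folder
# instead of being overwritten or removed, giving point-in-time history.
rclone sync "$SRC" "$CURRENT" \
  --backup-dir "remote:files/archive/$STAMP" \
  --log-level INFO
```

Run it from cron daily or weekly and each archive folder becomes one restore point, which covers the daily/weekly/monthly access you asked about.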

Is that a Nextcloud feature? rclone does not have native support for versioning, though you can use --min-age and --max-age.
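
For example (a sketch assuming the archive layout from the sync command above; the remote name and paths are placeholders), the age filters can restrict a listing or a restore to a time window:

```bash
# List archived copies modified within the last 7 days
rclone ls remote:files/archive --max-age 7d

# Restore only files last modified between 30 and 7 days ago
rclone copy remote:files/archive /tmp/restore --min-age 7d --max-age 30d
```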

I proposed a wrapper remote that does this but I can't develop it in Golang.

But, I wrote a backup tool that does exactly this: dfb (dated file backup).

It stores files like `<filename>.YYYYMMDDhhssmm<optional R or D>.ext` and records deletes as simple records.
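
Taking that naming pattern at face value (the file names below are invented for illustration, and I am assuming the `D` suffix marks a delete record), a backup directory might look like:

```bash
$ ls backups/
report.20230401110000.txt     # version of report.txt captured on 2023-04-01
report.20230415090000.txt     # a later version of the same file
notes.20230415090000D.txt     # delete record: notes.txt was removed at this point
```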

I keep meaning to write an rclone backup tool which does smething like your tool.

BTW rclone will be doing atomic transfers to local, ftp and sftp in v1.63. That is, upload to a temporary name and rename to the actual name when complete.
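
As an illustration of the general pattern (a sketch of the technique, not rclone's actual internals; the file names are placeholders), the point is that readers never see a half-written destination file:

```bash
# Write the full content under a temporary name first...
cp source/myfile dest/.myfile.partial
# ...then rename into place; rename is atomic on the same filesystem,
# so dest/myfile is always either the old complete file or the new complete file.
mv dest/.myfile.partial dest/myfile
```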

Nice. My only concern, and it is the same one that led me to write rirb, is slow listing of remotes. But honestly, I think I'd take first-party support over my wrapper if given the chance (unlike prior efforts, of which I have many, I used the rclone API for this one, which brings it closer to speed parity, but it is still, ultimately, a wrapper).

Great! That is two features I can pull out once 1.63 is out (the other being symlinks with copyto that we discussed and fixed earlier in the cycle)

Atomic local transfers will also nullify #6206 (though not really fix it).

That, plus the serve webdav modtime support is making me very excited for the next release. Lots of quality of life improvements for me!

Yes, I'd have a local database to speed things up, with the option to rebuild it from the local and the remote via the slow listings.

I want to be able to serialize objects to disk, which would make this much easier, but that's something I keep not getting round to.

I just tested this and it doesn't work with crypt wrapping local (and presumably (S)FTP). That is unfortunate but I guess I'll leave that in dfb.

Great!

:frowning:

This was an oversight but it is easily fixed.

Please give this a go

v1.63.0-beta.7069.7d23cbc32.fix-partial-uploads-wrap on branch fix-partial-uploads-wrap (uploaded in 15-30 mins)


That works! Thanks (as always!!!)

Thanks for testing.

I've merged this to master now, which means it will be in the latest beta in 15-30 minutes and released in v1.63.
