I'm running Rclone to back up (sync) all my files from a "source of truth" to a few different encrypted remotes (better safe than sorry, right?).
But now I've run into an issue: I want to back up the VM backups that run daily on my server. The problem is that it's almost 1 TB of data, and some .vdi files are bigger than 400 GB. That would quickly eat up the quota I have on some of those remotes, leaving too little for the rest of my important files.
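For context, the current setup is roughly the following (remote names and paths here are placeholders, not my actual config):

```shell
# sync the local "source of truth" to an rclone crypt remote;
# "gdrive-crypt:" stands in for any configured encrypted remote
rclone sync /srv/source-of-truth gdrive-crypt:backup \
    --fast-list --transfers 4 --progress
```
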
That's why I started looking into other options (incremental backup), and after a couple of hours of reading I came to the conclusion that Duplicacy is the best tool out there for this kind of job. Googling a bit more, I found an old post suggesting a possible collaboration between Duplicacy and Rclone, and I thought that would be great (an Rclone serve backend for Duplicacy, like the restic API).
So my questions are: is that still being considered? Is it on Rclone's roadmap? Are there any impediments? Can I do something to make it happen? I've already donated quite a bit of money, so maybe I can contribute my brain power now (the little I have).
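For reference, the existing restic integration this refers to works like this: `rclone serve restic` exposes any configured remote over restic's REST protocol, so restic can use every backend rclone supports. The suggestion is an equivalent server mode for Duplicacy. Remote names below are examples:

```shell
# expose an rclone remote over restic's REST API on localhost
rclone serve restic remote:backup --addr localhost:8000

# then point restic at the served endpoint
restic -r rest:http://localhost:8000/ init
```
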
to back up virtual machines, i use veeam agent and veeam backup and replication to dedicated local backup servers, and then rclone copy --immutable to cloud.
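a minimal sketch of that copy step (paths and remote name are examples, not my real setup):

```shell
# copy local veeam backup files to cloud; --immutable refuses to
# modify existing remote files and fails the transfer instead,
# so an already-uploaded backup can never be silently overwritten
rclone copy /backups/veeam cloud:veeam-backups --immutable
```
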
one major feature is instant recovery: if the server hosting the vm dies, i can run the vm directly from the backup files on the local backup server.
and if the entire building containing the vm server and backup server blows up, then from another location i can rclone mount the backup files from the cloud and run the vm that way.
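a sketch of that recovery mount, assuming example paths and remote names:

```shell
# from a recovery site: mount the cloud backups read-only and boot
# the vm from the mounted files; full vfs caching gives the random
# read/write access pattern vm software expects from a disk image
rclone mount cloud:veeam-backups /mnt/recovery \
    --read-only --vfs-cache-mode full --vfs-read-chunk-size 64M
```
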
Yeah, I'm aware. According to the author, it's a very stable project, so it barely needs any updates. They're working on a big refactor now to make it less CPU-intensive.