How to back up an offsite server?

I’m new to Rclone, so please be gentle. I don’t have the programming background that many here seem to possess.

I’m using Rclone, Rclone Browser and Backblaze B2, primarily to back up/sync my Ubuntu desktop (about 2TB so far). I do have some problems with all that, but I’ll save those for another day. Backblaze support is not as helpful as I had hoped.

My immediate issue is the need to back up our offsite server (running Debian Jessie), which we use to run a number of web sites. I have some familiarity with Rsync, as we previously used it to back up the server to drives on my desktop. But that has had intermittent hiccups, and it’s time to join the cloud world.

I gather that with an offsite server I’ll need to run Rclone from the CLI, not with the browser. That’s OK, as I presume I’ll need to set up cron jobs to have it run in the background. Previously I would write up crontabs that sequentially invoked Rsync for various directories on the system (and some needed tarring beforehand). It seemed presumptuous to hope to find one command that would back up an entire drive or an entire system.

So I looked around here and so far I haven’t seen any discussion or tutorial on how to run Rclone on a remote server. I would appreciate any links to any such discussions if they exist. Absent that, I’d appreciate any suggested rclone commands (and especially examples) to get this underway.

Yes that is correct

You can use one rclone command to back lots of things up. I’d use the --include-from flag (or maybe the --filter-from flag) to set up a list of things you want in and out of the backup, then run rclone from the root of the file system.

So

rclone sync --include-from backup-dirs / remote:backup

For backups, also check out the --backup-dir flag.
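For instance, something along these lines would move files that get deleted or changed into a dated archive area instead of removing them outright (the bucket and directory names here are only placeholders):

 rclone sync --include-from backup-dirs --backup-dir remote:backup-archive/$(date -I) / remote:backup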

Probably the most complicated part is the initial setup of the config file. The easiest way is to set up the config file on your local computer then find it with rclone config file and copy it to the remote server.
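For example (the server address is a placeholder, and the paths assume rclone’s default config location):

 rclone config file
 ssh root@yourserver mkdir -p /root/.config/rclone
 scp ~/.config/rclone/rclone.conf root@yourserver:/root/.config/rclone/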

NCW, thank you for your reply.

Since I wrote my original post, I’ve been experimenting. As I said, I have some familiarity with Rsync. In the past I would rsync sections of my desktop to an otherwise unused section of our server and, in return, rsync sections of the server back to the local desktop. This seemed to work, but required some forethought and effort.

I would tar/gzip up the /var/lib/mysql section of the server and the various logs. I would also tar/gzip up /etc. For some reason, all the files under /home would transfer OK, but under /etc, /root, and /boot things often went wrong. I don’t know if it had to do with permissions or weird file names (symlinks didn’t transfer, but they didn’t stop the process). So that’s why I tar/gzipped things when needed. In most cases I set the tarring up as a cron job that ran before the Rsync commands in their own cron jobs. But of course I was looking for a solution that could be done in one step…
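For reference, the pre-sync step looked roughly like this (the staging directory, times and destination are from memory rather than the exact crontab):

 0 1 * * * tar -czf /backup-staging/mysql.tar.gz /var/lib/mysql
 15 1 * * * tar -czf /backup-staging/etc.tar.gz /etc
 0 2 * * * rsync -a /backup-staging/ desktop:/path/to/server-backups/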

So that’s sort of the approach I’ve been looking at. So far I’m not tarring.

My first attempt failed:

 rclone sync /etc remoteServer:MyB2Bucket

With this I wouldn’t get an /etc directory in the bucket, just all the files from /etc dumped together at the top level.

So I tried (also creating the “topdirectory”):

 rclone sync --verbose --dump filters /etc remoteServer:MyB2Bucket/topdirectory/etc

That seemed to work.

But I also borrowed from the command used by Rclone Browser:

 rclone sync --delete-during --update --checksum --verbose --transfers 14 --checkers 6 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --dump filters --stats 1s /etc remoteServer:MyB2Bucket/topdirectory/etc

With this last command I’ve been able to upload all the contents of /etc, /home, /opt and /var. However, it seems /etc still has errors that stop the process.

Also, it doesn’t seem to matter whether I run it from the file system root or not; it works either way.

As for the config file, that seemed simple. I created it on the server using the same process I used locally: I SSH’d over to the server and invoked “rclone config.” That also seemed to work. Just now I searched for the config (on the server) and it returned /root/.config/rclone/rclone.conf as the location. The contents show:

[remoteServer]
type = b2
account = (redacted)
key = (redacted)
endpoint =

So now, on the B2 cloud, I have the following structure:

  • MyB2Bucket
    - topdirectory
      - etc
      - home
      - opt
      - var
      - maybe others to come

…continued

[quote]You can use one rclone command to back lots of things up. I’d use the --include-from flag (or maybe the --filter-from flag) to set up a list of things you want in and out of the backup, then run rclone from the root of the file system.

So rclone sync --include-from backup-dirs / remote:backup
For backups, also check out the --backup-dir flag
[/quote]

This is certainly a different approach. I was hoping to put a number of rclone commands in the same crontab as I described above.
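Something like this, roughly, is what I had in mind (the times are arbitrary and the bucket layout is just the one I’ve been testing with):

 # root’s crontab on the server: one rclone sync per top-level directory, run nightly in sequence
 0 1 * * * rclone sync /etc remoteServer:MyB2Bucket/topdirectory/etc
 30 1 * * * rclone sync /home remoteServer:MyB2Bucket/topdirectory/home
 0 2 * * * rclone sync /opt remoteServer:MyB2Bucket/topdirectory/opt
 30 2 * * * rclone sync /var remoteServer:MyB2Bucket/topdirectory/var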

I’ll have to look at this (and I must admit I’m a bit apprehensive), but I think you’re saying I can just use the --include-from flag for the main directories like /etc, /home, /opt and /var, and it will do the same thing as I’m doing, just all at once? Must I be in the file system root for this to work?

After the “include-from,” would I just list the directories? Do you have any real-world examples of how this might work? (Sorry, I feel like a zombie trying to wade through the man page.)

You want

rclone sync /etc remoteServer:MyB2Bucket/etc

if you don’t want the contents of /etc dumped straight into the top of remoteServer:MyB2Bucket.

What errors are they? Permissions errors?

Yes that should work. The transfer will have to start from the root, yes.

The argument to --include-from is a file. In that file you put the directories you want included, like this:

/etc/**
/home/myuser/**

Like that.

(rsync works the same way - I stole the idea from there!)
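So, putting it together for the layout you described (the file name backup-dirs is just an example), the include file would contain:

 /etc/**
 /home/**
 /opt/**
 /var/**

and then on the server you’d run something like:

 rclone sync --include-from backup-dirs / remoteServer:MyB2Bucket/topdirectory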