One-time backup of a Linux server to Amazon Drive

Hi everyone.

I have a Debian Linux server with terabytes of data on it (approximately 40TB).
I would like to make a one-time backup of the directories /home, /etc, /root, /usr and /var to a subdirectory on Amazon Drive that I would call ‘backup2’. The backup task should run automatically in the background.

The backup doesn’t have to be encrypted, but encryption would be nice to have. Is this possible? If so, how?
I am completely inexperienced with rclone and I’m not sure if this is a good idea.

I personally would write a tar file for root/opt/usr/etc rather than attempting a direct clone, since those shouldn’t be that large anyway. Otherwise you’ll have to reconstruct permissions, links, ACLs, etc.
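A minimal sketch of what I mean, assuming you already have an rclone remote configured and named `acd` (the remote name, archive path, and directory list are just examples):

```bash
# Create a compressed tar archive of the system directories.
# -p preserves permissions; --acls and --xattrs (GNU tar) also
# capture ACLs and extended attributes. Run as root so all files
# are readable.
sudo tar --acls --xattrs -cpzf /tmp/system-backup.tar.gz /etc /root /opt /usr

# Upload the single archive into the 'backup2' folder on the remote.
rclone copy /tmp/system-backup.tar.gz acd:backup2
```

To satisfy the “run in the background” requirement, you could wrap the whole thing in `nohup ... &` or run it inside a tmux/screen session.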

For strictly data you can probably just copy the files if they’re not in use, but you still have the problem of permissions and such. So the real question is what kind of data you have.
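For plain files, a direct copy might look like this (again assuming a remote named `acd`; note that `rclone copy` preserves file contents and modification times, but not Unix permissions, ownership, or ACLs):

```bash
# Copy the data directories file-by-file into backup2 on the remote.
# -v logs progress; --transfers/--checkers control parallelism.
rclone copy /home acd:backup2/home --transfers 8 --checkers 16 -v
rclone copy /var  acd:backup2/var  --transfers 8 --checkers 16 -v
```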

Hello @Julian,

+1 for backing up to a tarfile first. If you don’t have the local disk space for that, consider piping the tar directly to your ACD remote (e.g. with `rclone rcat`).
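A minimal sketch of that, assuming the remote is named `acd` and accepts streamed uploads of unknown size (`rclone rcat` reads from stdin; note the upload cannot be resumed if it fails partway through):

```bash
# Stream the archive straight to the remote without ever writing
# it to local disk.
sudo tar --acls --xattrs -cpzf - /etc /root /opt /usr | \
  rclone rcat acd:backup2/system-backup.tar.gz
```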

I’ve had some (truly bad) experiences with Amazon when trying to back up local directories directly (file name length restrictions, banning/bottlenecking of my transfers, etc.). And to top it off, rclone itself does not seem to work very well when you are backing up a large number of relatively small files: even on GDrive, which has far fewer issues than ACD, I still haven’t managed to back up and verify my 7M-file / 5GByte local server directory tree (that’s my latest issue, but by no means the only one I’m struggling with).

rclone works wonderfully for small-to-medium-sized trees of relatively large files, but for large trees of relatively small files it’s simply not there yet, IME.

Cheers,

Durval.