Syncing S3 to multiple local drives

What is the problem you are having with rclone?

I am trying to back up a 50 TB S3 bucket to a set of local 16 TB drives. I do not have hardware RAID and would like rclone to handle spreading the data across the multiple disks. Is there a specific set of commands where I can append multiple disk/folder paths so that rclone treats them as one backup location when running the sync?

Run the command 'rclone version' and share the full output of the command.

rclone v1.59.1
  • os/version: darwin 12.6 (64 bit)
  • os/kernel: 21.6.0 (arm64)
  • os/type: darwin
  • os/arch: arm64
  • go/version: go1.19
  • go/linking: dynamic
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

S3

I'd make a set of include files to split the data over the disks.

You can use

rclone size --include-from disks.txt remote:

to see how much data each include file matches. (Note that rclone about reports the remote's overall usage and doesn't apply filters, so rclone size is the command to use here.)

Then, once you've split the data evenly, use rclone copy or rclone sync with the same --include-from file for each disk.
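As a sketch, the split might look like this (the drive mount points and filter patterns below are hypothetical, not from this thread):

```shell
# disk1.txt -- a filter file listing the prefixes destined for the first drive,
# e.g.:
#   photos/**
#   backups/2018/**

# Check how much data each filter file matches before settling on a layout
# (rclone size honours the filter flags):
rclone size --include-from disk1.txt remote:
rclone size --include-from disk2.txt remote:

# Once the subsets are roughly even, sync each one to its own drive:
rclone sync remote: /Volumes/disk1 --include-from disk1.txt
rclone sync remote: /Volumes/disk2 --include-from disk2.txt
```

Each drive then holds a disjoint subset of the bucket, and re-running the same commands keeps each subset up to date.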


I guess if you could mount all the disks at once, you could run an rclone union over them with a policy to fill the disks evenly. That would work too, but you'd have no control over which files went on which disk.
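If the drives were mounted at, say, /Volumes/disk1 through /Volumes/disk3, a minimal union remote in rclone.conf might look like this (the mount points and the choice of the mfs "most free space" create policy are illustrative assumptions):

```
[disks]
type = union
upstreams = /Volumes/disk1 /Volumes/disk2 /Volumes/disk3
create_policy = mfs
```

Then rclone sync remote: disks: would place each new file on whichever drive currently has the most free space, with rclone (not you) deciding the placement.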

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.