Backing up several partitions at once

Hey there!

What is the problem you are having with rclone?

I'd like to know if there's any way to run a single instance of rclone rcat while piping in data from dd for multiple disks in a live Ubuntu environment.

I would like to do something similar to this command (dd if=/dev/sda | rclone rcat yourremotename:path/to/disk.raw) while also running the Web GUI, mostly so that I can remotely check the progress of a full system backup. Ideally I would run a single command incorporating both rcat and rcd, but I'm not sure how I would go about piping several different streams at the same time with a one-liner (if that's even possible).

If this is not an option, is there any way to have a script that invokes the example above several times, but with a single Web GUI that can check on the progress of all the instances at once?

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.2

  • os/version: ubuntu 22.04 (64 bit)
  • os/kernel: 6.5.11-8-pve (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.6
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

BackBlaze B2

The command you were trying to run (eg rclone copy /tmp remote:tmp)

dd if=/dev/sda | rclone rcat Backblaze:sda.raw --rc-allow-origin --rc-web-gui-update --rc-web-gui-no-open-browser --rc-user=rclone --rc-pass=REDACTED --rc-addr=:5572 --syslog --cache-dir=/opt/rclone/cache --checkers=16 --human-readable --low-level-retries 20 --retries 20 --retries-sleep 5s --metadata --multi-thread-streams 1024 -vv

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[Backblaze]
type = b2
account = XXX
key = XXX
download_url = https://static.domain.tld
upload_cutoff = 5Gi
chunk_size = 5Gi
memory_pool_use_mmap = true

Why would you like to create one raw file from multiple disks? I'm not sure it would have any practical use.

I was thinking more along the lines of creating multiple raw files in a one-liner, one for each disk.

But given they are not files yet, I'm not sure if the filtering options would work in this case.

One liner - nope.

Even if it is possible to use rclone to send disk images to the cloud, IMO that is the last problem you have to worry about when creating images of a live system.

Why not look at some specialised software like Veeam?

Or, if you want to create your own solution (which rclone can be part of), I would use a ZFS filesystem -> stable snapshot -> zfs send | rclone rcat. Definitely not dd.
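A minimal sketch of that pipeline, assuming a pool/dataset named tank/data and the Backblaze remote from the config above (the names and snapshot label are just illustrative):

# take a stable, point-in-time snapshot of the dataset
zfs snapshot tank/data@backup-2024-02-01
# stream the snapshot and pipe it straight into rclone
zfs send tank/data@backup-2024-02-01 | rclone rcat Backblaze:backups/data.zfs

The snapshot gives you a consistent image even while the system keeps writing, which is exactly what dd against a disk in use cannot guarantee.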

In this case it would be used mostly to "quickly" get a one-time backup for analysis at another location. So while Veeam is probably the right solution long term, right now I only need to get the copy without even mounting the drives' filesystems.

From what I understand, the right (?) way to handle this would be to run one dd-to-rcat pipe per drive and adjust accordingly, but I'm still a bit worried about how to keep tabs on progress remotely if needed.

Is there any way to have the web GUI check on each of the rcat instances simultaneously?

Use a different --rc-addr= port for every instance if you need the web UI.
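For example, a per-drive set of invocations could look roughly like this (the ports, device names and the use of --rc alongside rcat are assumptions, not tested against this exact setup):

# one pipe per drive, each rclone instance on its own rc port
dd if=/dev/sda bs=4M | rclone rcat Backblaze:sda.raw --rc --rc-web-gui --rc-addr=:5572 --rc-user=rclone --rc-pass=REDACTED &
dd if=/dev/sdb bs=4M | rclone rcat Backblaze:sdb.raw --rc --rc-web-gui --rc-addr=:5573 --rc-user=rclone --rc-pass=REDACTED &
wait

Each instance then exposes its own web GUI on its own port.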

But you can just add the --progress flag and see progress in your SSH session (use tmux for long operations).
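For instance (the session name and dd options are just placeholders):

tmux new -s backup
# status=none keeps dd's own stats from mixing with rclone's progress output
dd if=/dev/sda bs=4M status=none | rclone rcat Backblaze:sda.raw --progress

Detach with Ctrl-b d and reattach later with tmux attach -t backup to check on it.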

