Best way to backup Web 3.0 blockchain node data using rclone?

Hi everyone,

I’m working in a Web 3.0 blockchain environment and need to regularly back up node data (including full node directories and some IPFS-related storage). The dataset size can grow significantly over time (50GB–200GB+), and the data changes frequently.

Use case:

  • Backing up blockchain node data directories

  • Syncing from local storage to cloud (S3-compatible and Google Drive)

  • Running backups periodically (cron-based)

Command being used:

rclone sync /data/blockchain-node remote:node-backup --progress -v
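For context, the cron wrapper around this command looks roughly like the following (the script path, log location, and schedule are just illustrative, not my exact setup):

```shell
#!/bin/sh
# node-backup.sh -- nightly rclone sync of the node data directory
# Paths and remote name are illustrative; adjust to your environment.
rclone sync /data/blockchain-node remote:node-backup \
  --log-file /var/log/rclone-node-backup.log \
  -v
```

with a crontab entry along the lines of:

```shell
# Run the backup script every night at 03:00
0 3 * * * /usr/local/bin/node-backup.sh
```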

What I’m looking for:

  • Best flags/settings for handling large and frequently updated datasets

  • Whether sync is appropriate for this scenario, or if copy would be safer

  • Recommendations to avoid partial transfers or inconsistencies

  • General strategies for structuring reliable long-term backups

Additional info:

  • I can provide logs with -vv and --log-file if needed

  • Config can be shared with sensitive info removed

I want to make sure the backup process is reliable and efficient over time, especially since the data is continuously evolving.

Thanks in advance for any guidance!

If you are planning backups, then my suggestion is to use the right tool. rclone is not backup software; it is a tool for shuttling data between different remotes. Of course it can be part of a backup solution, but why reinvent the wheel instead of using freely available backup software? I would recommend restic or rustic. They can even work with rclone as a backend to reach more exotic providers if needed.
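For example, restic can store its repository on any configured rclone remote. A minimal sketch, assuming a remote named "remote" and the paths from your post (retention numbers are just an example):

```shell
# Create a restic repository on an existing rclone remote
restic -r rclone:remote:node-backup init

# Take a deduplicated, snapshot-based backup of the node directory;
# restic only uploads changed data, which helps with large,
# frequently changing datasets
restic -r rclone:remote:node-backup backup /data/blockchain-node

# Expire old snapshots, e.g. keep 7 daily and 4 weekly, and prune
# unreferenced data from the repository
restic -r rclone:remote:node-backup forget \
  --keep-daily 7 --keep-weekly 4 --prune
```

Snapshots also give you point-in-time restores, which a plain sync cannot (a corrupted file synced to the remote silently replaces the good copy).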