Hi everyone,
I’m working in a Web 3.0 blockchain environment and need to regularly back up node data (including full node directories and some IPFS-related storage). The dataset size can grow significantly over time (50GB–200GB+), and the data changes frequently.
Use case:

- Backing up blockchain node data directories
- Syncing from local storage to cloud (S3-compatible and Google Drive)
- Running backups periodically (cron-based)
Command being used:

```
rclone sync /data/blockchain-node remote:node-backup --progress -v
```
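For reference, this is the direction I've been considering for larger runs. The flag values below are guesses on my part, not tuned recommendations, so I'd appreciate corrections:

```shell
# Possible variant for a large, frequently changing dataset
# (flag values are illustrative, not tuned):
#   --transfers / --checkers : more parallelism for many files
#   --fast-list              : fewer listing API calls on remotes that support it
#   --retries                : re-run the sync automatically on failure
rclone sync /data/blockchain-node remote:node-backup \
  --transfers 8 --checkers 16 \
  --fast-list \
  --retries 5 \
  --log-file /var/log/rclone-backup.log --log-level INFO \
  --progress
```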
What I’m looking for:

- Best flags/settings for handling large and frequently updated datasets
- Whether `sync` is appropriate for this scenario, or if `copy` would be safer
- Recommendations to avoid partial transfers or inconsistencies
- General strategies for structuring reliable long-term backups
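On the scheduling side, this is the kind of crontab entry I've sketched so far. The schedule, lock file, and paths are all placeholders, and I'm not sure this is the right approach:

```shell
# Hypothetical crontab entry (schedule and paths are placeholders):
# - flock -n skips the run if the previous sync is still in progress,
#   so overlapping syncs can't operate on the same files at once
# - --backup-dir moves deleted/overwritten files into a dated archive
#   directory on the remote instead of discarding them
# (note: % must be escaped as \% inside a crontab line)
0 2 * * * flock -n /tmp/rclone-node-backup.lock rclone sync /data/blockchain-node remote:node-backup --backup-dir remote:node-backup-archive/$(date +\%Y-\%m-\%d) --log-file /var/log/rclone-node-backup.log --log-level INFO
```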
Additional info:

- I can provide logs with `-vv` and `--log-file` if needed
- Config can be shared with sensitive info removed
I want to make sure the backup process is reliable and efficient over time, especially since the data is continuously evolving.
Thanks in advance for any guidance!