Rclone Best Practice Questions

Hi all,

I have discovered Rclone and am very impressed with its functionality. I'm getting ready to start using it properly and to copy large amounts of data, so I just wanted to make sure I am doing it correctly.

My setup is:

  • Google Drive (Master)
  • Local HDD (2 partitions: one is a backup sync of Google Drive, one is local media)
  • Amazon Cloud Drive (backup sync of Google Drive and backup sync of the HDD local media)

So, as you can see, Google Drive contains all my personal data, and I want that backed up locally and in the cloud in case of a Google account issue. I would also like my local media synced or copied to my Amazon Cloud Drive.

  1. For the initial data copy, is it best to use the rclone sync command or the copy command? And are there any extra parameters that could speed up the process? I assume that to keep everything in sync going forward, I would need to use the sync command? (I've put a rough sketch of what I'm thinking of running after this list.)

  2. Additionally, if I'm doing a cloud -> cloud transfer, I know Rclone will stream the data, but it will still end up downloading each file to my PC and uploading it again (or similar), so I'm limited by my down/up speeds. Would it be best to rent an AWS/DO server for a few hours to do the initial data transfer, and then destroy it?
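
For reference, here is a rough sketch of what I'm thinking of running. The remote names (gdrive:, acd:) and the local paths are just placeholders from my own config, so please correct me if the commands or flags are wrong:

    # Initial seed: copy everything from Google Drive to the local backup partition
    rclone copy gdrive: /mnt/backup/gdrive -v --transfers 8 --checkers 16

    # Going forward: keep the local backup identical to Google Drive
    # (sync will delete local files that were removed from Drive)
    rclone sync gdrive: /mnt/backup/gdrive -v --transfers 8 --checkers 16

    # Same idea for the cloud copies: Google Drive and local media into Amazon Cloud Drive
    rclone sync gdrive: acd:gdrive-backup -v
    rclone sync /mnt/media acd:media-backup -v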

Many thanks all

Not sure how much data you have, but I had about 500 GB in ACD that I wanted to transfer to Google Drive. The free version of multcloud gives you 2 TB of cloud-to-cloud transfer, although their speed was in the 500 kbps range. This worked fine for me to get through the initial transfer; now I just do incremental syncs with rclone for recent changes.
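
In case it helps, the incremental pass is just a plain sync between the two remotes. Something like this, assuming the remotes are named acd: and gdrive: in your config (adjust to your setup, and run with --dry-run first to see what it would do):

    # Only new or changed files get transferred on each run
    rclone sync acd: gdrive: -v --dry-run
    rclone sync acd: gdrive: -v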

Good point. Cloud to cloud, I have around 20 GB of data. Nothing compared to most people here!

If you only have 20 GB, then it's probably easiest to do a sync from your computer and let the data transfer through your connection. It should finish overnight if you have a 10-20 Mbit upload connection (20 GB is roughly 160 Gbit, so at 10 Mbit/s that's on the order of four to five hours).

If you had more, then a VPS would be the best option. I use Digital Ocean, and I've transferred 7 TB at one time. Very fast, and they didn't cut me off even once my bandwidth allotment had been used up.
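
To give a rough idea of the VPS route (the install method, remote names, and paths below are just examples, not exactly what I ran): install rclone on the box, get your rclone config onto it, and run the transfer from there so the traffic never touches your home connection.

    # On the VPS: one way to install rclone on Linux (see rclone.org/install for alternatives)
    curl https://rclone.org/install.sh | sudo bash

    # Either run 'rclone config' on the VPS, or copy your existing config file up
    # ('rclone config file' prints where rclone expects it to live)
    rclone config

    # Then the cloud-to-cloud transfer runs entirely on the VPS's connection
    rclone copy gdrive: acd:gdrive-backup -v --transfers 8

When it's finished, verify with rclone check (or rclone size on both sides) before you destroy the droplet.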

Okay, thanks @chriscraig.

In terms of the copy, I've heard I may have to disable traversing on Google Drive? Is that right?

I didn’t use that flag when I did my transfer, but it won’t hurt anything if you do use it. I’m not sure, but I believe it just traverses the entire directory structure on the remote before it starts a copy. I’m not really sure what purpose it serves.

--no-traverse can be used if you're copying a small number of files to a much larger remote, so rclone doesn't have to list the entire destination first. It is inefficient to use it to sync a whole remote.
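
As a rough example of where it helps (the paths and remote name here are made up): if you've just added a handful of files locally and want to push only those into an otherwise huge remote, --no-traverse saves rclone from listing the whole destination first.

    # Push a few new files into a large remote without listing the whole destination
    rclone copy /data/new-photos gdrive:Photos --no-traverse -v

    # For a full sync of a big remote, leave --no-traverse off and let rclone list normally
    rclone sync /data/photos gdrive:Photos -v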