Guidance: Uploading to Google Drive over a slow, flaky connection

Looking for any guidance (success stories, failure stories, recipes) for using rclone to sync a large folder of hundreds of thousands of 4-10 MB files to Google Drive over a slow (think 2G), flaky connection that drops out every 30-40 minutes.

  • Does RClone handle dropped connections gracefully?
  • Should I restart via cron every 15 minutes?
  • What connection settings are recommended for number of threads / checkers?
  • Can RClone handle starting the transfer, then recognizing when more files are added to the local directory? (or will it need to be run again?)
  • Would speed be helped (or hurt) by keeping all files in one directory, or in sub-directories? (New files will generally only be added to newer sub-directories, i.e. after a few months, older sub-directories will no longer get new files…)

Thanks!


Dropped connections: it should handle them gracefully, yes.

No need to restart via cron; rclone should detect timeouts and retry on its own.

For transfers and checkers, try the defaults first; they are reasonably conservative. If things are really going wrong, try --transfers 1 --checkers 1
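A sketch of what a conservative invocation might look like on a flaky link. The remote name gdrive: and the paths are placeholders, and the specific retry/timeout values are illustrative, not a recommendation from this thread:

```shell
# Conservative sync over a slow, flaky connection.
# "gdrive:" and the paths are placeholders - substitute your own.
rclone sync /path/to/local gdrive:backup \
  --transfers 1 \
  --checkers 1 \
  --retries 10 \
  --low-level-retries 20 \
  --timeout 30s
```

--retries controls how many times the whole sync is retried, while --low-level-retries governs retries of individual operations (uploads, listings) after transient errors.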

No, not at the moment: rclone won't pick up files added to the local directory after a sync has started, so it will need to be run again.
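Since a running sync won't notice newly added files, one common workaround (a sketch under the same placeholder names as above, not something from this thread) is to re-run rclone on a schedule, with flock preventing overlapping runs:

```shell
# Hypothetical crontab entry: attempt a sync every 15 minutes, but
# skip the run if the previous one still holds the lock (flock -n
# fails immediately instead of queueing a second rclone process).
*/15 * * * * flock -n /tmp/rclone-gdrive.lock rclone sync /path/to/local gdrive:backup --transfers 1 --checkers 1
```

Because a skipped run simply tries again 15 minutes later, this also doubles as the "restart via cron" safety net asked about above.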

rclone has to do a separate API call for each directory with Google Drive and it can only retrieve 1000 files in a listing at once. It is probably most efficient if you put all the files in one directory, but the directory listing phase isn’t usually the bottleneck so in practice you’ll probably find it makes very little difference.
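If the listing phase ever does become a bottleneck, --fast-list is worth knowing about: on backends that support it (including Google Drive in recent rclone versions) it fetches the whole remote listing in fewer API calls, at the cost of holding it in memory. This is a general rclone flag, not something recommended in the reply above:

```shell
# --fast-list trades memory for fewer listing API calls on
# backends that support it; paths and remote name are placeholders.
rclone sync /path/to/local gdrive:backup --fast-list
```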