Rclone Move - Tweak help

  1. The full command you’re attempting to use.
    "G:\Rclone\rclone.exe" --config C:/Users/Shenniko/.config/rclone/rclone.conf move --verbose --transfers 4 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --log-file="C:\Users\Shenniko.config\rclone\Rclone_Log_Files\Media_vfs2.txt" --fast-list --max-backlog=100000 --tpslimit=10 --delete-empty-src-dirs --delete-after --no-traverse --progress "F:/Plex/TV Shows/test" "gdrive_media_vfs:TV Shows\test"

  2. A logfile of rclone’s output with personal information removed.
The output works fine.

  3. The rclone config you’re using.
    https://pastebin.com/xS1ij9TX

  4. What version of rclone you’re using.
    G:\Rclone>rclone version
    rclone v1.48.0

  • os/arch: windows/amd64
  • go version: go1.12.3

So now that that's all out of the way, a few questions:

I have about 4-6TB of files to upload to my encrypted Gdrive, so I'm looking for the best way to move all of these and delete them from my local drive once they have been uploaded.

At the moment I'm testing one folder in particular, and it works. I'm just looking for some advice on how to tweak my command to upload a large folder structure. Is using rclone via the command line the best option? (I know there is Rclone Browser.) Or is there another preferred way of uploading large folder structures?

Thanks

Shenn

I'd recommend using the command line if it is a really big batch job, simply because it gives you full control and you won't need to run everything through a cache (as you would if you used the mount instead).

The webUI basically runs the same commands as the command line (it just talks to rclone's remote control API), so as long as it supports what you need (I don't know if it covers all options yet, probably not) it should be fine as an alternative. Be aware, though, that the webUI is still at the experimental stage, so if you can't afford anything to go wrong, now may not be the best time to test it on such a large job.

I would simply recommend you use an rclone move for this:
rclone move C:\sourcefolder\ Myremotename:targetfolder
I would probably also use:
--fast-list
--transfers 8
(helps a bit when you have many small files)
--drive-chunk-size 128M
(or 256M if you can afford 256 MB x 8 transfers of memory; this will increase upload bandwidth utilization greatly, especially on faster connections)
-P
(shows the progress of the transfer; this is the same as --progress)
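
Putting those together, a sketch of what the full move command could look like (the source folder and remote name here are placeholders, substitute your own):
rclone move "C:\sourcefolder" "Myremotename:targetfolder" --fast-list --transfers 8 --drive-chunk-size 128M -P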

I think that covers the most important flags for a job like this.

I would also urge you to keep in mind how many files you are uploading. Gdrive is slow at handling tons of tiny files, as it can only start writing new files at a rate of about 2 per second. Check the size of your folders and see if there are some obvious culprits with tens of thousands of small files that you don't really expect to need individual access to anyway. I'd suggest zipping those folders before upload, as it will make both the upload and the eventual re-download a hundred times faster.
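
If you want to try that, a minimal sketch using PowerShell's built-in Compress-Archive cmdlet (the folder path here is just an example):
Compress-Archive -Path "F:\Plex\SomeFolderOfTinyFiles" -DestinationPath "F:\Plex\SomeFolderOfTinyFiles.zip"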

Also, be aware that there is a 750GB daily upload quota, so if you can move more than that in a day you will hit a point where the transfer stalls. You can abort (CTRL-C) at any time and simply run the same command again to transfer the remainder. Files will never get lost this way; they don't get deleted locally until rclone confirms they have arrived on the Gdrive.
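
If you would rather not babysit it, one option is to cap rclone's bandwidth with --bwlimit so a full day of uploading stays under the quota, for example:
--bwlimit 8M
(8 MB/s works out to roughly 690 GB per day, comfortably under the 750 GB limit)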

Lastly, you may also want to be aware of:
--dry-run
(simulates the job and shows what would happen, but doesn't actually transfer or delete anything)
-v
(verbose output; you'd probably want to use this together with a dry run to get more information)
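
For instance, reusing the test paths from the command at the top of this post, a dry run would look like:
rclone move "F:/Plex/TV Shows/test" "gdrive_media_vfs:TV Shows/test" --dry-run -v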

I hope this gives you a place to start, and let me know if you need more help.
