How do --drive-upload-cutoff and --drive-chunk-size work?

I need to copy millions of files of varying sizes from Amazon Cloud Drive to Google Drive.
Some are 1KB, some are 3GB, but most are between 1MB and 700MB.

What command would be best?
How should I use --drive-upload-cutoff and --drive-chunk-size to make use of my full network bandwidth?

How do --drive-upload-cutoff and --drive-chunk-size work?
If the chunk size is, for example, 16M and the cutoff is 1B, does that mean rclone will take every file larger than 1B, load it into a 16M buffer, and then upload it to Google Drive?

If you have lots of small files then increasing --transfers and --checkers will help.
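As a sketch, a copy command with the concurrency raised above the defaults might look like this (the flag names are real rclone flags; the remote names `acd:` and `gdrive:` and the specific values are just illustrative assumptions, so tune them for your machine and connection):

```shell
# Copy from Amazon Cloud Drive to Google Drive with more parallelism.
# --transfers:  number of files uploaded at once (default 4)
# --checkers:   number of files checked at once (default 8)
rclone copy acd: gdrive: --transfers 16 --checkers 32
```

More transfers helps most when the bottleneck is per-file overhead on many small files rather than raw bandwidth.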

Above the upload cutoff rclone will use a multi-part upload, which is less efficient than uploading the file directly but better suited to big files. When doing a multi-part upload, the file is buffered into memory in chunks of the chunk size, so you don't want to make it too big or you will run out of memory! If you have lots of memory then you can increase the chunk size for increased performance with big files, e.g. from my tests
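To see why the chunk size matters for memory, here is a rough worst-case estimate (a back-of-the-envelope sketch, not an exact accounting of rclone's internals): each concurrent transfer above the cutoff can buffer one chunk at a time, so peak buffer usage is roughly transfers times chunk size. The values below are assumptions for illustration:

```shell
# Rough worst-case memory held in upload buffers:
# each of the --transfers concurrent uploads buffers one
# --drive-chunk-size chunk at a time.
transfers=16      # assumed --transfers value
chunk_mib=64      # assumed --drive-chunk-size of 64M
echo "peak buffer: $((transfers * chunk_mib)) MiB"
```

So with 16 transfers and 64M chunks you should budget on the order of 1 GiB of RAM for upload buffers alone before raising either value further.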
