How can I run 2 commands to transfer the same folder and continue the stats from the first command?

I wish to transfer a folder to my Google Drive, transferring 3 files at once while ignoring image files.

I then run a second command on the same folder, but by that point only image files are left, so it's only transferring images and I want to switch to transferring 8 files at once. I'd like the stats from the first command to carry over into the second command: if 100 GB of 110 GB has been transferred by the end of the first command, then when the transfers start in the second command the stats should continue from 100 GB of 110 GB transferred so far. How can I do that? Here's my script:

rclone move ~/private/rtorrent/data/"zTransfersToGoogleDrive7" SharedDrive7:/"" -v --transfers=3 --stats=5s --drive-use-trash --delete-empty-src-dirs --exclude "*.jpg" --exclude "*.jpeg" --exclude "*.png" --exclude "*.gif" --exclude "*.bmp" --exclude "*.tif" --exclude "*.tiff" --exclude "*.webp" --exclude "*.svg"

rclone move ~/private/rtorrent/data/"zTransfersToGoogleDrive7" SharedDrive7:/"" -v --transfers=8 --stats=5s --drive-use-trash --delete-empty-src-dirs

as per docs:

Important: Avoid mixing any two of --include..., --exclude... or --filter... flags in an rclone command. The results may not be what you expect. Instead use a --filter... flag.

so in your example:

rclone move ~/private/rtorrent/data/"zTransfersToGoogleDrive7" SharedDrive7:/"" -v --transfers=3 --stats=5s --drive-use-trash --delete-empty-src-dirs --exclude "*.jpg" --exclude "*.jpeg" --exclude "*.png" --exclude "*.gif" --exclude "*.bmp" --exclude "*.tif" --exclude "*.tiff" --exclude "*.webp" --exclude "*.svg"

it is better to use:

rclone move src dst --filter "- *.jpg" --filter "- *.gif"

or use the --filter-from flag and put all the rules into a file.
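For example (the filename here is just an illustration), a rules file could look like this:

# exclude-images.txt - one rule per line, lines starting with # are comments
- *.jpg
- *.jpeg
- *.png
- *.gif
- *.bmp
- *.tif
- *.tiff
- *.webp
- *.svg

and then:

rclone move src dst --filter-from exclude-images.txt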

Stats only apply to the current rclone command. There is no way to carry stats over from a previous command.

Thanks. Is there any way, in a single command, to do 3 transfers at once until only the images are left to transfer, and then switch to 8 transfers at once?

Nope - what is the problem you are trying to solve?

Just run two commands.

But you know how much you transferred - just add the two numbers together :)

I have multiple transfers queued, so I've no idea how much has been transferred so far, and I'm not watching the SSH window all the time to check.

When I run 2 different commands, the stats that tell me how much I've transferred so far reset. That's why I wanted to fit it all into 1 command. But I guess that's not possible?

I've noticed that when a transfer starts it will say there's 740 GB to transfer, for example. It will say that for hours in the stats updates, but then in the last hour or so it will occasionally and suddenly say there was 770 GB to transfer. Why couldn't it calculate that at the beginning of the transfer, and is there any way for it to be more accurate at the start?

That's why I don't want to run 2 different transfer commands. If the second command says there's only 200 GB to transfer and I get that glitch, I won't know whether the glitch is the reason my transfers stopped, i.e. because I actually exceeded the Google Drive daily transfer limit. Whereas if I transferred everything in 1 command, I could clearly see that if 770 GB was to be transferred then I exceeded the transfer limit, which is why my transfers stopped.

To make your life easier, run something like this:

rclone move src dst --max-transfer=740G --cutoff-mode=soft

and restart it every day (you can do this in crontab).

This way you have nothing to do. Just check from time to time that everything is working.
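For example, a daily crontab entry could look something like this (the schedule, paths and log file are just placeholders - adjust them to your setup):

0 4 * * * /usr/bin/rclone move /home/user/private/rtorrent/data/zTransfersToGoogleDrive7 SharedDrive7: --max-transfer=740G --cutoff-mode=soft --drive-use-trash --delete-empty-src-dirs --log-file=/home/user/rclone-move.log

Note that cron runs with a minimal PATH, so it is safest to use the full path to the rclone binary, and --log-file lets you check afterwards how much was transferred.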

You can also run your customized moves with --dry-run to get an estimate of the transfer size in advance. Then you can use --max-transfer to make sure you do not hit any limits.
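For example, something like this (re-using the example rules file from above) will list what would be moved without actually transferring anything:

rclone move src dst --dry-run --filter-from exclude-images.txt -v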

Actually it can calculate everything from the beginning:)

rclone move src dst --max-backlog=-1 --check-first

It will first calculate the total number of files and their size, and only then start transferring.

You will need about 1 KiB of RAM per file to do this, so if you have 1 million files you need about 1 GiB of RAM, and if you have 50 million files you need about 50 GiB of RAM.
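Putting that together with your first command, it could look something like this (the 740G limit and the exclude-images.txt rules file are just the examples from earlier in the thread):

rclone move ~/private/rtorrent/data/"zTransfersToGoogleDrive7" SharedDrive7:/"" -v --transfers=3 --stats=5s --drive-use-trash --delete-empty-src-dirs --filter-from exclude-images.txt --check-first --max-backlog=-1 --max-transfer=740G --cutoff-mode=soft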
