Command to check how much data needs to be downloaded from remote

What is the problem you are having with rclone?

I am writing a script to automate the rclone sync process, and I need to see the amount of data that needs to be synced between the cloud and my local server. I thought about running the rclone size command on the cloud and on my local server and comparing the results, but that doesn't accurately reflect the amount of data that needs to be synced. Let me explain:

Say I have 2TB on my cloud remote and 2TB on my local server, fully synced, but then I delete 200GB from the cloud and upload 300GB to it from a different server. Running rclone size on the cloud and the local server will yield 2.1TB and 2TB respectively, suggesting only 100GB to sync, when in reality 300GB needs to be downloaded and 200GB needs to be deleted.

I need a command that gives me the 300GB to be downloaded.

What is your rclone version (output from rclone version)

rclone v1.54.1
- go version: go1.15.8

Which OS you are using and how many bits (eg Windows 7, 64 bit)

centos 8 on the offsite server

Which cloud storage system are you using? (eg Google Drive)

google drive via crypted remote

The command you were trying to run (eg rclone copy /tmp remote:tmp)


Thanks in advance!

If you do rclone sync --dry-run -v it will tell you at the end how much data needs to be transferred - is that the info you are looking for?

That is what I'm looking for, and I actually tried it, but it doesn't seem to show the amount of data to be transferred. It lists all of the files to be transferred, but the data total stays at 0 when --dry-run is used. See below:

I think that feature went into rclone v1.55

$ rclone sync --dry-run . /tmp/rclone
Transferred:   	    1.461G / 1.461 GBytes, 100%, 10.169 GBytes/s, ETA 0s
Transferred:         3616 / 3616, 100%
Elapsed time:         0.1s
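Since the goal is to script this, one way to extract the figure is to parse the summary line from the dry-run output. This is just a sketch: it assumes rclone v1.55+ (where --dry-run reports the data total) and that the Transferred summary line keeps the format shown above; the remote and local paths in the comment are placeholders.

```shell
# Hypothetical helper: pull the "to transfer" amount out of rclone's
# end-of-run summary (the second field of the bytes "Transferred:" line).
parse_transferred() {
  grep -m1 'Transferred:.*Bytes' | awk '{print $2}'
}

# In a real script you would feed it live output, for example:
#   rclone sync --dry-run -v remote:backup /local/backup 2>&1 | parse_transferred

# Demonstration on the summary line shown above:
sample='Transferred:        1.461G / 1.461 GBytes, 100%, 10.169 GBytes/s, ETA 0s'
echo "$sample" | parse_transferred
```

Note that the stats format can change between rclone versions, so a scripted parser like this should be re-checked after upgrades.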

This is exactly what I needed. Thank you as always.

BTW, I am using rclone to transfer all of my data to a new backup server that I built. The data is coming from Google Drive. It was pretty incredible to transfer 20TB with one simple terminal command and not have to worry about it. You have made an incredible tool!



This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.