Ok so this command does the job, but it was processing for 9 hours before finally starting to check the files. It then checked the files, and 2 hours later it is up to 1,000,000 files. I would like it to be faster. Is there a command that would speed this up?
Also, if I split it up, is there a way to sync multiple directories in one command? So maybe d:\docs\lib, d:\docs\pol, and d:\docs\rog all in one command?
no, rclone ls would just list the files that would be copied, so you can make sure the command does what you want.
to copy the files, rclone copy
to sync the files, rclone sync
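for example, a minimal sketch assuming the d:\docs source from your post and a remote named gdrive: (that remote name and the docs path on it are just placeholders, substitute your own):

    REM list the files under the source
    rclone ls d:\docs

    REM copy new/changed files to the dest, never deletes anything on the dest
    rclone copy d:\docs gdrive:docs

    REM make the dest an exact mirror of the source, deletes extras on the dest
    rclone sync d:\docs gdrive:docs

adding --dry-run to copy or sync also shows what would happen without transferring or deleting anything.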
is this a one-time transfer or do you plan to run it on a schedule?
if you run it on a schedule:
the first time, run the command as is for a full sync
let's say you run the sync every 24 hours; you can speed up the process with --max-age=25h
for each local file, if it is older than 25 hours, rclone will not check that file against gdrive.
yes, i chose 25 hours to have a little overlap.
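so a daily run could look something like this, again with the example gdrive:docs path from above, adjust to your setup:

    REM first run: full sync, no --max-age
    rclone sync d:\docs gdrive:docs

    REM every 24 hours after that: only files modified in the last 25 hours are considered
    rclone sync d:\docs gdrive:docs --max-age=25h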
if you want a forever forward incremental sync, read about --backup-dir
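roughly, it looks like this; the dated archive folder name here is just an example you would generate in your scheduler:

    REM anything sync would overwrite or delete on the dest is moved into the archive folder instead
    rclone sync d:\docs gdrive:docs --backup-dir gdrive:docs-archive/2024-05-01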
So it looks like --backup-dir is not what I want. So what you are saying is that if I run rclone sync with --max-age=25h it will sync any files that were changed in the last 25 hours?
rclone will sync any file that has changed within the last 25 hours.
for each such file, rclone will check with gdrive and decide what to do.
since this greatly reduces the number of checks, the sync should be quicker.
note: about sync with --max-age
if, on the source, a file is moved from one folder to another and you then sync, rclone will not notice the move. moving a file does not change its modification time, so the moved file falls outside the --max-age window and is skipped.
the dest keeps the file in the old folder and never gets it in the new one, so the source will no longer match the dest.
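one way to cover that, not from this thread but a common pattern, is to occasionally run the sync without --max-age so rclone re-checks everything and picks up moves:

    REM occasional full sync, no --max-age, catches moved files the quick daily runs miss
    rclone sync d:\docs gdrive:docs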