How to sync quickly

Yes

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10

Which cloud storage system are you using? (eg Google Drive)

Google

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone sync "D:\DOCS" "Gdrive:\Docs" --max-transfer 500G --buffer-size=1G --transfers=4  --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt
start notepad "D:\Logs\Docs.txt"
exit

Ok, so this command does the job, but it spent 9 hours processing before it finally started checking the files. Two hours after that it was up to 1,000,000 files checked. I would like it to be faster. Is there a command that would speed this up?

Also, if I split it up, is there a way to sync multiple directories in one command? So maybe d:\docs\lib, d:\docs\pol and d:\docs\rog all in one command?

Sorry if this is the wrong place to post.

Create a file, for example list.txt, containing:

/lib/**
/pol/**
/rog/**

and run a command like this to list the files that would be copied:
rclone ls d:\docs --include-from=list.txt

Sorry, it was there, just a problem with the formatting. I fixed it.

See my last post...

So my command would be like:

rclone sync ls "D:\DOCS" "Gdrive:\Docs" --include-from=list.txt --max-transfer 500G --buffer-size=1G --transfers=4 --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt

No, rclone ls would just list the files that would be copied, to make sure the command does what you want.
To copy the files, use rclone copy.
To sync the files, use rclone sync.
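
So, assuming list.txt is in the folder you run the command from, the sync would look something like this (your original command with the filter added and the stray ls removed):
rclone sync "D:\DOCS" "Gdrive:\Docs" --include-from=list.txt --max-transfer 500G --buffer-size=1G --transfers=4 --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt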

Is this a one-time transfer, or do you plan to run it on a schedule?

Run it on a schedule. Oh, I think I see my issue. Is this what I would use?

rclone sync "D:\DOCS" "Gdrive:\Docs" --include-from=list.txt --max-transfer 500G --buffer-size=1G --transfers=4 --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt

If you run it on a schedule:
the first time, run the command as-is for a full sync.

Let's say you run the sync every 24 hours. You can speed up the process with --max-age=25h.
For each local file that is older than 25 hours, rclone will not check the file in Gdrive.
Yes, I chose 25 hours to have a little overlap.

If you want a forever-forward incremental sync, read about --backup-dir.
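
As a rough example (the archive folder name and date are just placeholders), --backup-dir points at a dated folder on the same remote as the destination:
rclone sync "D:\DOCS" "Gdrive:\Docs" --backup-dir "Gdrive:\Docs-archive\2021-01-01" --log-level INFO --log-file D:\Logs\DOCS.txt
Anything that would be overwritten or deleted in Gdrive:\Docs is then moved into that dated folder instead of being lost.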

So it looks like --backup-dir is not what I want. So what you are saying is that if I run rclone with --max-age=25h it will sync any files that were changed in the last 25 hours?

So I would run this command:

rclone sync "D:\DOCS" "Gdrive:\Docs" --max-age=25 --max-transfer 500G --buffer-size=1G --transfers=4  --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt
start notepad "D:\Logs\Docs.txt"
exit

By the way, thanks for your prior and future help!

Rclone will sync any file that has changed within the last 25 hours.
For each such file, rclone will check with Gdrive and decide what to do.
Since this greatly reduces the number of checks, the sync should be quicker.
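
If you want to check what a --max-age run would do before relying on it, you can add --dry-run, which makes rclone report what it would transfer without changing anything, for example:
rclone sync "D:\DOCS" "Gdrive:\Docs" --max-age=25h --dry-run --log-level INFO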

Just ran my first sync with --max-age 25h. It sped things up from 21 minutes to 15 minutes. Thanks! If you find anything else to speed it up even more, please post.

Can I mark this as solved?

Sure, that is up to you.

Note about sync with --max-age:
if, on the source, a file is moved from one folder to another folder and then you sync, rclone will not notice that (moving a file does not change its modification time, so the file falls outside --max-age).
The source will no longer match the dest.
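
One way to pick up such moves: keep the daily --max-age run, and every so often (say once a week) run the full sync without --max-age so the source and dest line up again, for example:
rclone sync "D:\DOCS" "Gdrive:\Docs" --max-transfer 500G --buffer-size=1G --transfers=4  --fast-list --max-backlog 20000 --log-level INFO --log-file D:\Logs\DOCS.txt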
