I was advised in the thread above that I should open a new thread.
My backup rclone copy command needs to check 1.5 million files each day, and for about 2 hours it seems to be checking whether there are new files. Only after those 2 hours does it start transferring new stuff.
Is there a way to speed this process up?
2018/11/30 16:03:57 INFO :
Transferred:   0 / 0 Bytes, -, 0 Bytes/s, ETA -
Checks:        0 / 0, -
Transferred:   0 / 0, -
Elapsed time:  2h15m2.2s
This is my command:
rclone copy /share/CACHEDEV1_DATA gcrypt:shared --filter-from /share/CACHEDEV1_DATA/rclone/uploadfilter.txt --copy-links --checkers 3 --fast-list --log-file /share/CACHEDEV1_DATA/rclone/backupdata.log -v --tpslimit 3 --transfers 3
How could I speed things up?
Try --fast-list (which your command already includes). It will use a lot more memory, but it will be much faster.
You could also try the --no-traverse branch and run a command like this just to send over recently changed files:
rclone copy --no-traverse --max-age 24h /path/to/local gdrive:path
That will be very quick if not many files have changed.
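Putting the two ideas together, one way to schedule this is a cheap hourly --no-traverse run for recent changes plus one full daily pass that catches anything the age filter misses. This is only a sketch adapted to the paths and filter file from your command above; the schedule times and log file names are assumptions, not anything rclone requires:

# Hypothetical crontab sketch: frequent cheap runs + one full daily sweep.
# Hourly: only look at files modified in the last 24h, skip listing the remote.
0 * * * *  rclone copy --no-traverse --max-age 24h --filter-from /share/CACHEDEV1_DATA/rclone/uploadfilter.txt --copy-links /share/CACHEDEV1_DATA gcrypt:shared --log-file /share/CACHEDEV1_DATA/rclone/quick.log -v

# Daily at 03:00: full check of all 1.5M files as a safety net.
0 3 * * *  rclone copy --fast-list --filter-from /share/CACHEDEV1_DATA/rclone/uploadfilter.txt --copy-links /share/CACHEDEV1_DATA gcrypt:shared --log-file /share/CACHEDEV1_DATA/rclone/backupdata.log -v

The trade-off: --max-age relies on local modification times, so a file restored with an old timestamp would only be picked up by the daily full run.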