welcome to the forum,
that is a small amount of data and a lot of small files.
maybe switch from stone-age sftp to a modern protocol such as s3, using a minio server, which will be much faster and does support --fast-list
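a minimal sketch of what that could look like, assuming a minio server is already running; the endpoint, keys, and remote name here are hypothetical placeholders
# hypothetical endpoint and credentials - adjust for your minio server
rclone config create minio s3 provider=Minio endpoint=http://minio.example.com:9000 access_key_id=EXAMPLEKEY secret_access_key=EXAMPLESECRET
# s3 supports recursive listing, so --fast-list cuts the listing down to a handful of api calls
rclone sync minio:repo /local/repo --fast-list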
note: --fast-list does nothing on sftp, as the sftp backend does not support recursive listing, which is a big part of why sync is slow
rclone backend features sftp: | grep "ListR"
"ListR": false,
not sure there is a great solution, but here is a quick idea, off the top of my head.
to create a list of files, do something like
rclone lsf jenkins-vm:${REMOTE_REPO_PATH} --files-only --absolute --recursive > file.lst
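with --absolute, each path in file.lst is rooted at ${REMOTE_REPO_PATH}, which is the form --files-from expects; the output will look something like this (hypothetical paths)
/dir1/artifact-1.0.jar
/dir2/metadata.xml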
from each machine, to sync the files, download file.lst, then do something like
rclone sync jenkins-vm:${REMOTE_REPO_PATH} . --files-from=file.lst
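a minimal sketch of the per-machine step; the url for fetching file.lst is hypothetical, and the --transfers / --checkers values are guesses to tune for many small files
# hypothetical url - publish file.lst however you like
curl -sO https://jenkins.example.com/file.lst
# many small files, so raise the parallelism
rclone sync jenkins-vm:${REMOTE_REPO_PATH} . --files-from=file.lst --transfers 16 --checkers 32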
maybe, to work around sftp being slow with sync, can try the ideas in
https://forum.rclone.org/t/big-syncs-with-millions-of-files/40182
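one workaround in that spirit, as a rough sketch, not necessarily the exact recipe from that thread: cheap, frequent top-up copies of recently changed files, plus an occasional full sync; the 24h window is an assumption to tune
# frequent top-up: only files modified in the last 24h, no destination listing
rclone copy jenkins-vm:${REMOTE_REPO_PATH} . --max-age 24h --no-traverse
# occasional full sync to pick up deletes and anything the top-ups missed
rclone sync jenkins-vm:${REMOTE_REPO_PATH} .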