In rclone sync or copy, the checkers do their work, but at times this can take longer than the transfers themselves.
E.g. an rclone sync over millions of files where only about 50 new files actually need to be transferred.
Would it be possible to create a new rclone wrapping backend for checking, similar to hasher?
Role: index the files in the base backend (e.g. S3), recording details like location/filename, size, timestamp, etc. The index could be stored in .cache/rclone/s3.index
Usage: wraps the base remote and is used like hasher.
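For comparison, this is roughly how hasher wraps a remote today, followed by what an analogous config for the proposed backend might look like (the `type = index` backend and its name are made up for illustration; only the hasher section reflects an existing backend):

```ini
# Existing hasher backend wrapping an S3 remote
[s3hasher]
type = hasher
remote = s3:mybucket
hashes = md5
max_age = off

# Hypothetical index backend, configured the same way
[s3indexed]
type = index
remote = s3:mybucket
```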
Advantage: if two index remotes are created, e.g. one for S3 and the other for Dropbox:
rclone sync -Pv s3indexed: dbindexed:
The above command would quickly read both index files in .cache/rclone/*.index and look up which files are new in S3, transferring only those to Dropbox.
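The core of that lookup is just a diff of the two indexes. A minimal sketch of the idea (not rclone code; the index format of `path,size,modtime` lines is an assumption for illustration):

```python
import csv
import io

def load_index(text):
    """Parse an index of 'path,size,modtime' lines into a dict."""
    index = {}
    for path, size, modtime in csv.reader(io.StringIO(text)):
        index[path] = (int(size), modtime)
    return index

def files_to_transfer(src_index, dst_index):
    """Return source paths that are missing or different on the destination."""
    return sorted(
        path for path, meta in src_index.items()
        if dst_index.get(path) != meta
    )

# Example: b.txt exists only in the S3 index, so only it is transferred.
s3 = load_index("a.txt,10,2024-01-01T00:00:00Z\n"
                "b.txt,20,2024-01-02T00:00:00Z\n")
db = load_index("a.txt,10,2024-01-01T00:00:00Z\n")
print(files_to_transfer(s3, db))
```

Since both indexes live locally, this comparison involves no remote API calls at all, which is where the speedup over per-file checkers would come from.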