I’m seeing very high memory usage when performing a sync from the local filesystem to a remote encrypted filesystem (which itself wraps an Azure Blob Storage account). I suspect this is because I have a large number of smallish files to sync. Key info:
Command being run: rclone --config /path/to/my/version/controlled/rclone.conf --checksum --retries 20 --suffix 2018-10-04 --backup-dir azure-encrypted:/backups.history/ --quiet --bwlimit 384k sync /home/backup-user/backups/ azure-encrypted:/backups
Rclone Version: 1.42
Total size of files being synced: 855 GB
Number of files being synced: 1,098,342
The backup process is using 5.325 GB of physical RAM (12.408 GB virtual), which seems pretty extreme. Is it caching all the file checksums in memory, or something like that?
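For what it’s worth, a back-of-envelope calculation with the numbers above suggests it’s holding on to roughly 5 KiB of state per object, which seems like a lot more than just a checksum:

```python
# Back-of-envelope: resident memory divided by object count,
# using the figures from my post above.
rss_bytes = 5.325 * 1024**3   # 5.325 GB physical RAM
num_files = 1_098_342         # objects being synced
per_object = rss_bytes / num_files
print(f"~{per_object / 1024:.1f} KiB per object")  # → ~5.1 KiB per object
```

For comparison, an MD5 checksum is only 16 bytes, so checksums alone shouldn’t account for anywhere near this much.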
Any help or insights would be much appreciated.
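In case it’s relevant: one workaround I’m considering (a sketch only, not yet tested) is splitting the sync per top-level directory, so each rclone invocation only has to list a fraction of the ~1.1M objects at once. The per-directory suffix on the destination and backup-dir paths is my own addition; here I just generate the commands so I can sanity-check them before running anything:

```shell
# Sketch (untested): one rclone sync per top-level source directory,
# built as strings first so the generated commands can be reviewed.
src=/home/backup-user/backups
cmds=""
for dir in "$src"/*/; do
  name=$(basename "$dir")
  cmds="$cmds
rclone --config /path/to/my/version/controlled/rclone.conf --checksum --retries 20 --suffix 2018-10-04 --backup-dir azure-encrypted:/backups.history/$name --quiet --bwlimit 384k sync $dir azure-encrypted:/backups/$name"
done
printf '%s\n' "$cmds"
```

I don’t know whether rclone’s memory use actually scales with the size of the listing for a single run, so I’d welcome confirmation before I restructure the backup this way.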