Reducing upload sizes


Would it please be possible to consider reducing upload sizes by implementing:

  1. Deduplication (complicated, admittedly, since rclone works with whole files rather than chunks).
  2. Compression (this would be easier, and zstd at level 9 gives me excellent results).

This is already done by some backup programs, but they don't have rclone's sync or other features for uploading to the cloud. Some include:

  3. This one can use rclone as a backend for backups, but can't sync, and has the worst file-size reduction compared to borg or duplicati.

I know there have been other compression requests, but combining compression with deduplication, and looking at how borg and duplicati do it, may help. They produce excellent results, but unfortunately offer no sync and support fewer backends.
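To show why the chunk-level deduplication used by borg and duplicati beats file-level dedup, here is a minimal content-addressed store sketched in Python. The fixed 4 KiB chunk size and the `ChunkStore` name are my own simplifications for illustration — borg actually uses content-defined (rolling-hash) chunking:

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real tools use content-defined chunking

class ChunkStore:
    """Stores each distinct chunk exactly once, keyed by its SHA-256 digest."""

    def __init__(self):
        self.chunks: dict[str, bytes] = {}

    def add_file(self, data: bytes) -> list[str]:
        """Split data into chunks, store unseen ones, return the file's chunk list."""
        keys = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            key = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(key, chunk)  # "upload" only if not seen before
            keys.append(key)
        return keys

    def restore(self, keys: list[str]) -> bytes:
        """Reassemble a file from its chunk list."""
        return b"".join(self.chunks[k] for k in keys)
```

Two files that share most of their content then cost only the chunks that differ, which is exactly what file-level dedup cannot capture.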


A compression remote is already in the works:

De-duplication in the way I assume you mean does not currently exist; however, you may find --copy-dest and --compare-dest useful, even if they aren't quite the overarching de-duplication you are probably thinking of.

You can make formal feature requests here (so they aren't just forgotten in a forum thread that grows old):

But you should absolutely check first whether an issue already exists for it, and if so, upvote that rather than creating a duplicate issue.

Of course, you can use any third-party dedupe software via a mount if you wish, but that's more about cleaning up files after the transfer than saving space during it.
