Rclone md5sum ignores big files

Hi,
I noticed that the rclone md5sum command ignores big files. "Big" is not really big here, just a couple of MB.
Why?
Thank you.

Which cloud storage?

Some cloud storage providers, e.g. S3, don't support MD5 for files which were uploaded in chunks.
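
The reason is that the ETag S3 returns for a multipart upload is not the MD5 of the whole object: it is (roughly) the MD5 of the concatenated per-part MD5 digests with the part count appended, so there is no whole-file MD5 for rclone to read back. A quick Python sketch of that calculation (the 5 MB part size is just for illustration):

```python
import hashlib

def multipart_etag(data: bytes, part_size: int = 5 * 1024 * 1024) -> str:
    """ETag S3 reports for a multipart upload: the MD5 of the
    concatenated per-part MD5 digests, plus "-<number of parts>"."""
    digests = [hashlib.md5(data[i:i + part_size]).digest()
               for i in range(0, len(data), part_size)]
    return hashlib.md5(b"".join(digests)).hexdigest() + f"-{len(digests)}"

data = b"x" * (12 * 1024 * 1024)       # a 12 MB object -> three 5 MB parts
print(multipart_etag(data))            # ends in "-3": not the file's MD5
print(hashlib.md5(data).hexdigest())   # the plain MD5 rclone would want
```

Note that two identical files uploaded with different part sizes can even end up with different ETags.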

Hi, ncw,
You are right, it's S3; sorry for forgetting to specify it. It's a pity that this fact is not mentioned in the rclone docs, though I may have missed it.
But if so, another question: I synchronized ("rclone sync") two buckets. As far as I understood, the default upload chunk size is 96M, so I suppose that small files (up to this size) are copied not in chunks.
Is there a way to tell rclone (either sync or copy) to upload files without chunking?
Thank you.

https://rclone.org/s3/#multipart-uploads mentions the missing md5sum.

There isn't a chunk size or threshold option for S3 like there is for the other remotes.

I think the threshold that rclone uses is 5MB. Note that the S3 uploader uploads parts in parallel and needs to buffer each one in memory.
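
To make the memory point concrete, here is a rough sketch of that buffering pattern; the part size, the concurrency and the upload_part stub are made up for illustration and are not rclone's actual code:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

PART_SIZE = 5 * 1024 * 1024   # illustrative part size
CONCURRENCY = 4               # 4 parts in flight -> roughly 20 MB buffered

def upload_part(number: int, buf: bytes) -> str:
    # Stand-in for the real network call; returns the part's MD5 as its "ETag".
    return hashlib.md5(buf).hexdigest()

def multipart_upload(path: str) -> list[str]:
    etags = []
    with open(path, "rb") as f, ThreadPoolExecutor(CONCURRENCY) as pool:
        in_flight, number = [], 1
        while True:
            buf = f.read(PART_SIZE)   # each in-flight part sits fully in memory
            if not buf:
                break
            in_flight.append(pool.submit(upload_part, number, buf))
            number += 1
            if len(in_flight) == CONCURRENCY:        # cap the buffered parts
                etags.append(in_flight.pop(0).result())
        etags.extend(fut.result() for fut in in_flight)
    return etags
```

Peak memory is therefore roughly the part size times the number of parts in flight, regardless of the file size.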

I could make a threshold where it reverts to the standard single part uploader, which will be less efficient but will have correct md5sums.
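
Roughly this shape, where the cutoff value and the helper names are hypothetical, not real rclone options:

```python
import os

UPLOAD_CUTOFF = 5 * 1024 * 1024   # hypothetical cutoff, not a real rclone setting

def single_part_upload(path: str) -> None:
    print(f"PUT {path} in one request - S3 stores a real MD5 as the ETag")

def multipart_upload(path: str) -> None:
    print(f"multipart upload of {path} - the ETag is not the file's MD5")

def upload(path: str) -> None:
    # Below the cutoff, fall back to the plain single-part PUT so the
    # stored ETag is a usable MD5; above it, go multipart for speed.
    if os.path.getsize(path) < UPLOAD_CUTOFF:
        single_part_upload(path)
    else:
        multipart_upload(path)
```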

If you'd like to see this, can you make an issue? Thanks.

Well, I think I understand. So, if it's possible to implement a less efficient but more comprehensive command, then I'd prefer you to implement copying the MD5 checksum within the HTTP headers. :slight_smile:
Is it doable?
Thank you.
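
By "within HTTP headers" I mean something like the standard Content-MD5 header, which carries the base64 encoding of the raw 16-byte MD5 digest rather than the usual hex string, e.g.:

```python
import base64
import hashlib

body = b"example object contents"

# Content-MD5 is the base64 encoding of the raw 16-byte MD5 digest,
# not the hex string that tools like md5sum print.
digest = hashlib.md5(body).digest()
print(base64.b64encode(digest).decode("ascii"))  # value for a Content-MD5 header
print(digest.hex())                              # the familiar hex form
```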

Hi again, ncw,
I understand that the answer to my question is "no".
Thank you anyway.