Hi there guys, I’m having some trouble comparing hashes (MD5 or ETag) of local files against a remote S3 bucket.
My files are bigger than 5 GB, which is why I’m uploading as multipart, but the flag:
seems not to be working in the current version (I guess it may have been an independent branch from the original version…).
I have tried the following command, as the uploaded files are bigger than 5 GB and every sync to S3 is slow:
rclone sync -vv --dump-headers --s3-chunk-size 100M origin_folder/ s3:s3-bucket/upload/
I’m on Windows with many, many files; I’ve found plenty of Python functions to achieve this, but no simple PowerShell way to automate it.
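For reference, the Python snippets I found all boil down to the same idea: the ETag S3 reports for a multipart upload is not the plain MD5 of the file, but the MD5 of the concatenated per-part MD5 digests, followed by `-<part count>`. Here’s a minimal sketch of that (my own code, not from rclone) — it only matches the remote ETag if the part size passed in equals the chunk size used at upload time (e.g. `--s3-chunk-size 100M`):

```python
import hashlib

def multipart_etag(path, part_size=100 * 1024 * 1024):
    """Reproduce the ETag S3 assigns to a multipart upload:
    MD5 of the concatenated per-part MD5 digests, plus "-<part count>".
    part_size must match the chunk size used for the upload."""
    part_digests = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(part_size)
            if not chunk:
                break
            part_digests.append(hashlib.md5(chunk).digest())
    if len(part_digests) == 1:
        # Files uploaded in a single part get a plain MD5 ETag.
        return part_digests[0].hex()
    combined = hashlib.md5(b"".join(part_digests)).hexdigest()
    return f"{combined}-{len(part_digests)}"
```

So in principle I could compute this locally and compare it against the ETag on the bucket, but I’d rather use whatever the recommended rclone way is.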
Experienced rclone users, how do you check the data integrity of a local file against the remote object in an S3 bucket?
Thank you very much, folks!