`Rclone check` output not expected

What is the problem you are having with rclone?

when rclone check cannot check the hash:

  1. DEBUG : file.txt: OK - could not check hash
    should be
    ERROR : file.txt: OK - could not check hash

  2. NOTICE: S3 bucket zork path dest: 1 matching files
    should be
    NOTICE: S3 bucket zork path dest: 0 matching files

  3. exit code = 0
    should be
    exit code > 0

  4. --combined output
    = file.txt
    should be
    ! file.txt

same behavior on Windows and Linux.

Run the command 'rclone version' and share the full output of the command.

rclone v1.57.0
- os/version: ubuntu 20.04 (64 bit)
- os/kernel: 5.10.60.1-microsoft-standard-WSL2 (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.17.2
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Wasabi

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone check remote:zork/source remote:zork/dest -vv --combined=./combined.txt
echo exitcode=$?

The rclone config contents with secrets removed.

[remote]
type = s3
provider = Wasabi
access_key_id = 
secret_access_key = 
endpoint = 

A log from the command with the -vv flag

+ rclone check remote:zork/source remote:zork/dest -vv --combined=./combined.txt
DEBUG : rclone: Version "v1.57.0" starting with parameters ["rclone" "check" "remote:zork/source" "remote:zork/dest" "-vv" "--combined=./combined.txt"]
DEBUG : Creating backend with remote "remote:zork/source"
DEBUG : Using config file from "/home/user01/.config/rclone/rclone.conf"
DEBUG : Creating backend with remote "remote:zork/dest"
INFO  : Using md5 for hash comparisons
DEBUG : S3 bucket zork path dest: Waiting for checks to finish
DEBUG : file.txt: OK - could not check hash
NOTICE: S3 bucket zork path dest: 0 differences found
NOTICE: S3 bucket zork path dest: 1 hashes could not be checked
NOTICE: S3 bucket zork path dest: 1 matching files
INFO  : 
Transferred:   	          0 B / 0 B, -, 0 B/s, ETA -
Checks:                 1 / 1, 100%
Elapsed time:         0.4s

DEBUG : 7 go routines active
+ echo exitcode=0
+ exit
exitcode=0

combined.txt

= file.txt

For some reason that file doesn't have a hash - how was it uploaded? I suspect it was a multipart upload but not via rclone, or maybe using rclone mount.

Rclone checks size and hash if possible, and if it can't check the hash it still checks the size.

And it warns here that it couldn't check 1 hash.
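
If you want to see which objects are missing a stored hash, something like this should work (just a sketch, using rclone lsjson --hash with the path from this thread):

# sketch: list objects in the dest and report any that have no stored hashes,
# i.e. the ones rclone check can only compare by size
import json
import subprocess

out = subprocess.run(
    ["rclone", "lsjson", "--hash", "-R", "remote:zork/dest"],
    capture_output=True, text=True, check=True,
).stdout

for item in json.loads(out):
    if not item.get("IsDir") and not item.get("Hashes"):
        print("no hash stored for", item["Path"])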

i have been doing some testing in preparation for the --bulletproof enhancements for s3.

one of my test cases is: the source or dest file

  1. is a multipart upload
  2. has no X-Amz-Meta-Md5chksum header (see the sketch after this list).
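
for reference, this is roughly how such an object comes about: a plain multipart upload done outside rclone does not add rclone's md5 metadata, and the multipart ETag is not a plain md5. a minimal boto3 sketch (bucket, endpoint and file names are placeholders, credentials come from the environment):

# sketch only: a multipart upload via boto3, which does not set the
# rclone-specific X-Amz-Meta-Md5chksum metadata, so rclone check has
# no md5 to verify afterwards
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3", endpoint_url="https://s3.wasabisys.com")

# lower the multipart threshold to 5 MiB (the s3 minimum part size)
# so anything above that goes up as a multipart upload
cfg = TransferConfig(multipart_threshold=5 * 1024 * 1024,
                     multipart_chunksize=5 * 1024 * 1024)

s3.upload_file("file.txt", "zork", "dest/file.txt", Config=cfg)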

imho, the docs are not clear.
if you agree, perhaps we could change
"It compares sizes and hashes"
to
"rclone compares by hash (MD5 or SHA1). If the hash is missing, then it compares by size."

so
file.txt: OK - could not check hash
means
file.txt: OK - could not check hash, size match

in my dreams, --check-hash-only-else-fail-with-ERROR-and-exit-code=10

tho the problem is with the --combined output:
= could mean hash match or size match, but not, for certain, a hash match.
again, in my dreams, there would be
== hashes match

in the real world, i can script around it.
my python script that scans the rclone debug log
already complains if "0 differences found" is not found.

so i need to update the script to also complain if
"could not check hash" or "hashes could not be checked" is found,
something like the sketch below.
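
a minimal sketch of that updated check, assuming the debug log was saved to a file (rclone-check.log is just an example name):

# sketch: scan an rclone check debug log and complain if differences were
# reported or if any hash could not be checked
import sys
from pathlib import Path

log = Path("rclone-check.log").read_text(encoding="utf-8")

errors = []
if "0 differences found" not in log:
    errors.append("differences found (or the check did not finish)")
if "could not check hash" in log or "hashes could not be checked" in log:
    errors.append("one or more hashes could not be checked")

if errors:
    for e in errors:
        print("ERROR:", e)
    sys.exit(1)

print("check ok: sizes and hashes verified")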

thanks

A flag like this would be easy to implement if we can think of a sensible name for it :slight_smile:

Say rclone check --error-if-hash-missing?

This would then cause the = to become ! and rclone to exit with an error if any files did not have a hash to check.
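
Until then, you could get roughly the same effect by post-processing the output yourself. A rough sketch (the log and combined file names are just examples):

# sketch: flip "=" to "!" in the --combined output for files whose hash could
# not be checked (found by scanning the debug log), and exit non-zero if any
import re
import sys
from pathlib import Path

log = Path("rclone-check.log").read_text(encoding="utf-8")
combined = Path("combined.txt").read_text(encoding="utf-8").splitlines()

# debug lines look like: DEBUG : file.txt: OK - could not check hash
no_hash = set(re.findall(r" : (.+): OK - could not check hash", log))

fixed = []
for line in combined:
    flag, _, path = line.partition(" ")
    if flag == "=" and path in no_hash:
        flag = "!"
    fixed.append(flag + " " + path)

Path("combined.txt").write_text("\n".join(fixed) + "\n", encoding="utf-8")
sys.exit(1 if no_hash else 0)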

thanks for the offer, but i will pass for now, given that i:
--- can script around the issues
--- have not seen other rcloners request these features
--- know you have limited time
--- prefer to dream about having --s3-bulletproof fully implemented
