How do I generate permanent hashes for a crypt remote to compare against a source Google Drive?

Let's say I have GDrive1, and Crypt-GDrive1. Crypt-GDrive1 has files that were copied from GDrive1.

How can I generate hashes (permanently) for the crypt drive files and have them compared against the native MD5 hashes on GDrive1?

There is a command called cryptcheck, but that's a one-time, on-the-fly comparison - I want to keep the hashes cached for the future.

Hasher does not seem to work. Should I be using the hashsum command instead?

Really lost here, could use some help.

There are two ways I know:

  1. Hasher - stores hashes in a local database; a good solution if used from one computer only. Supports any hash known to rclone.

  2. "Creative" use of chunker

As per docs:

If your storage backend does not support MD5 or SHA1 but you need consistent file hashing, configure chunker with md5all or sha1all. These two modes guarantee given hash for all files. If wrapped remote doesn't support it, chunker will then add metadata to all files, even small. However, this can double the amount of small files in storage and incur additional service charges. You can even use chunker to force md5/sha1 support in any other remote at expense of sidecar meta objects by setting e.g. hash_type=sha1all to force hashsums and chunk_size=1P to effectively disable chunking.

I am using chunker with chunk_size set to 1P.
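As an illustration of that setup, a chunker remote wrapping the crypt remote might look like this in rclone.conf (the remote names here are hypothetical, matching the example names from the question):

```
[Chunked-Crypt-GDrive1]
type = chunker
# Wrap the existing crypt remote
remote = Crypt-GDrive1:
# Effectively disable chunking, per the docs
chunk_size = 1P
# Force MD5 support via sidecar metadata objects
hash_type = md5all
```

You would then copy/sync through Chunked-Crypt-GDrive1: instead of Crypt-GDrive1: so the sidecar hash metadata gets written.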

Depends what you mean by "permanent", but for some purposes a helpful workflow is to generate a "sumfile" using rclone hashsum with --output-file, save that file, and use it in the future to check against a live remote with the --checkfile option of rclone check. (You may also want to add the --download option if detecting bit rot is your goal.)
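That workflow might look something like this (remote name hypothetical; note that a crypt remote has no native hashes, so generating the sumfile needs --download to hash the decrypted data locally):

```shell
# Generate an MD5 sumfile for the crypt remote and save it for later.
# --download is needed because crypt does not support hashes natively.
rclone hashsum md5 Crypt-GDrive1: --download --output-file crypt.md5

# Later: verify the live remote against the saved sumfile.
# --checkfile takes the hash type; the sumfile is given as the source.
rclone check --checkfile md5 crypt.md5 Crypt-GDrive1: --download
```

Without --download, check can only compare against hashes the remote itself reports, which a plain crypt remote cannot provide.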

What is not working about it? If you post details maybe we can help.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.