Is it possible to hash a remote file in-memory?

e.g. use rclone hashsum to hash a file without downloading it to the local disk - just by streaming it into memory in chunks then discarding each chunk?

rclone hashsum

Are you just linking it for convenience or are you telling me something? :stuck_out_tongue:

I was hoping you'd take a minute and read it.

      --download             Download the file and hash it locally; if this flag is not specified, the hash is requested from the remote

It's not entirely clear whether that hashes it in memory or not.

If the remote has a hash, it gets it.
If the remote has no hash, it downloads it and hashes it locally.
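
As a sketch, the two behaviours look like this (assuming a configured remote named `remote` and a file `big.iso`, both placeholders):

```shell
# Without --download, rclone asks the backend for a stored hash;
# with --download, rclone fetches the data and hashes it itself.
if command -v rclone >/dev/null 2>&1; then
  rclone hashsum md5 remote:big.iso             # hash from remote metadata, if supported
  rclone hashsum md5 remote:big.iso --download  # fetch the data and hash it locally
fi
```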

It seems unproductive to keep repeating this exchange.

Ok, closing out the topic based on your feedback.

Is there something I didn't answer or that doesn't make sense? I'm happy to help here, as I've answered you as thoroughly as I can.


In my testing, rclone streams the file in memory and does not use local disk space.

Part of the reason the whole help and support template exists is to avoid low quality questions like this one.

The OP asked a generic question without any details and is annoyed that there wasn't a perfect, easy answer.

Low quality in == low quality out.

With some remotes, you might get the hash and download nothing at all, because it is stored in the metadata.

That same remote would act differently if it's wrapped in a crypt remote, and would require another command.

With some remotes, if streaming isn't supported, the file has to be downloaded to a temporary location and hashed there.

So the answer to the high-level question is "it depends". Given context (a command, a remote), there can be specific answers.

Sorry for being unclear. I understand that some remotes don't support hashes on their end. I guess I should be asking whether there is a way to check/ensure that a file is streamed from a remote, rather than downloaded to disk.

Is there a particular remote you have a question about rather than going the other way?

Like the SFTP remote will run the checksums on the other side so it won't download.
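
For example (a sketch, assuming a configured SFTP remote named `mysftp`, a placeholder):

```shell
# SFTP remotes can run md5sum/sha1sum on the server itself,
# so no file content needs to be downloaded.
if command -v rclone >/dev/null 2>&1; then
  rclone hashsum md5 mysftp:path/to/file
fi
```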

I think @asdffdsa was testing with S3 and that should chunk in memory. I tested GD/Dropbox and they chunk in memory.

I'd say majority do with some outliers.

uknow me so well......

some do, some do not.
i tested on a do not remote using --download --temp-dir,
rclone did not download to that temp dir.

tried a bunch of backends, sftp, onedrive, crypt, even mega.
used sysinternals process explorer to monitor rclone disk/network usage.
so i wonder if there is such a backend where rclone will download to local disk and then hash
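
that check can be reproduced with something like this (remote name and paths are placeholders):

```shell
# If rclone streams in memory, nothing should appear in the temp dir
# while the hash is computed.
if command -v rclone >/dev/null 2>&1; then
  mkdir -p /tmp/rclone-hash-test
  rclone hashsum md5 remote:big.iso --download --temp-dir /tmp/rclone-hash-test
  ls -A /tmp/rclone-hash-test   # stays empty if no temp files were written
fi
```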

There's no particular remote. The idea is I want to create a script that can be pointed at any large file on the web (hosted wherever) and generate hash(es) for it, without using up my hard disk (whether by consuming space, slowing down performance for other scripts, or wearing it out).
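
For what it's worth, the streaming idea itself (hash chunks as they arrive, never touch disk) can be sketched with a plain pipe; here `printf` stands in for a downloader such as `curl -s "$URL"`:

```shell
# Data flows through the pipe in chunks and is hashed incrementally;
# nothing is written to disk.
printf 'hello' | sha256sum | cut -d' ' -f1
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```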

That's the HTTP remote :slight_smile:

That should work fine and hash in memory.
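
A sketch of what that could look like, assuming rclone's on-the-fly connection-string syntax (the URL and file name are placeholders; the quoting is needed because the URL contains a colon):

```shell
# Point rclone at a plain web URL via an on-the-fly HTTP remote;
# with --download the data is streamed through memory and hashed.
if command -v rclone >/dev/null 2>&1; then
  rclone hashsum sha256 --download ":http,url='https://example.com':big.iso"
fi
```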

Okay, that's good, but I also mean file hosters like MEGA, Google Drive, 1Fichier, Dropbox, OneDrive, etc.