Is there any way to get the already calculated checksum directly from Google Drive?
I mean, I know I can calculate the hash of each file locally through the rclone connection, but I'd like to know if there is a property I can check to get the checksum of the files without calculating them myself.
lsjson can do that
For single file:
rclone lsjson --hash gdrive:testfile.bin
For all files in a directory:
rclone lsjson --hash gdrive:mydir
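If you want to consume that output programmatically, here's a minimal sketch (assuming Python; the sample JSON and hash value are illustrative, but the `Path`/`Hashes` field names match what `lsjson` emits):

```python
import json

def hashes_from_lsjson(lsjson_output: str) -> dict:
    """Map each file path to its hash dict from `rclone lsjson --hash` output."""
    entries = json.loads(lsjson_output)
    return {e["Path"]: e.get("Hashes", {}) for e in entries}

# Illustrative output of: rclone lsjson --hash gdrive:mydir
sample = '[{"Path":"testfile.bin","Size":4,"Hashes":{"md5":"d41d8cd98f00b204e9800998ecf8427e"}}]'
print(hashes_from_lsjson(sample))
```

In a real script you would feed in the stdout of a `subprocess.run(["rclone", "lsjson", "--hash", ...])` call instead of the hard-coded sample.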
You can also use `rclone md5sum`, which gives output in the same format as the standard md5sum tool.
Cool, but the truth is I was looking to get the CRC32 checksum, because that's what WinRAR and https://www.srrdb.com/ use.
For example, suppose I don't have the file locally: I could look up the release's hash on that page and compare it directly against the file in Google Drive.
Actually, I can't, because those are different hash types.
Rclone doesn't currently fetch the CRC32 hash - it isn't a great integrity check for large files.
Rclone does support CRC-32 hashes for putio so it could be added for google drive if you really wanted!
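Until then, if you really need a CRC32 you can compute it yourself while streaming the file down (e.g. piping `rclone cat` into a small script, so the whole file never has to sit on disk). A sketch using Python's `zlib`; the streaming source is whatever you wire up, here just an in-memory buffer:

```python
import io
import zlib
from typing import BinaryIO

def crc32_of_stream(stream: BinaryIO, chunk_size: int = 1 << 20) -> str:
    """Compute a CRC32 incrementally so large files are never fully in memory."""
    crc = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        crc = zlib.crc32(chunk, crc)
    return f"{crc & 0xFFFFFFFF:08X}"

# Standard CRC-32 check value: the CRC of b"123456789" is CBF43926.
print(crc32_of_stream(io.BytesIO(b"123456789")))  # -> CBF43926
```

The uppercase-hex result matches the format srrdb displays, so it can be compared directly.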
Do you mean copying a file from Google Drive to Put.io just to check its CRC32 hash?
It's not worth it, but it's a workaround.
I was hoping rclone could provide the CRC32 hash directly, but it seems that's not possible.
Thanks anyway, @ncw.
So when you do a rclone move / copy / sync with
-c, you don't download the whole file and calculate the checksum?
That is correct: it reads the checksum from the remote cloud storage system.
So is there any reason one shouldn't use
-c, since it's the only option that can guarantee file integrity, and it appears to have no downsides?
It takes longer, as it has to checksum the local files, which can take a bit of time. That is why modtime is the default.