How To Rename Rclone crypt Folder on Gdrive?

I have a Gdrive crypt. Let's say I want to rename Gdrive:/Sleep/files to Gdrive:/Sleepover/files. How can I do this?

Rename it in a mount or use rclone moveto, so

rclone moveto Gdrive:/Sleep/files Gdrive:/Sleepover/files

should do it.
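
If you go the mount route instead, it looks roughly like this (a sketch; the mount point /mnt/gdrive and running the mount in the background are just examples):

rclone mount Gdrive: /mnt/gdrive &
mkdir -p /mnt/gdrive/Sleepover
mv /mnt/gdrive/Sleep/files /mnt/gdrive/Sleepover/files

moveto is simpler though, and does the move server-side where the backend supports it.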

I have another question: when I use cryptcheck to check my crypted Gdrive, it shows this error:
Error computing hash: failed to hash data: unexpected EOF
Do you have any idea about this?

This is normally some kind of network error. Does it do it consistently for a given file or randomly?

Consistently for a given file. When I rerun cryptcheck on the same file, the same error shows up again. I'm using 1.50.1 (this is teamdrive (uncrypted) --> gdrive crypt).

But I assume it was a successful copy, because those are large files and the file sizes are correct.

Can you have a go with -vv and post any logs which mention the troublesome file name?

After doing some searching, I found out that rclone can't check hashes for crypted files.

It can check them with rclone cryptcheck though.
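
For reference, a cryptcheck run looks roughly like this, assuming teamdrive: holds the plain files and GdriveCrypt: is the crypt remote (both names are placeholders for your own remotes):

rclone cryptcheck teamdrive:path GdriveCrypt:path

It reads the nonce from each encrypted file, encrypts the plain copy with it, and compares the resulting checksums, so it works even where a plain rclone check can't compare hashes.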

I reran with -vv and this is the result:

2019/11/13 13:19:18 ERROR : Error computing hash: failed to hash data: unexpected EOF

That directory has 10 files. Sometimes only 1 file shows this error, sometimes a few files show it (files that already passed cryptcheck before).

This is almost certainly some kind of network issue. The "unexpected EOF" message means the stream broke earlier than expected.
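
If the link really is flaky, raising rclone's low-level retry limit sometimes lets it recover mid-stream (the default is 10; 20 here is just illustrative, and the remote names are placeholders as above):

rclone cryptcheck teamdrive:path GdriveCrypt:path -vv --low-level-retries 20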

Yeah, I agree with this, because some troublesome files are fine again after a rerun, while files that had no problem before start showing the EOF. So I'll count this as a network issue; maybe my wifi card gets too hot and stops working?

Thanks for your answer :handshake::handshake:

If you use the cache backend, an unexpected EOF can very often indicate that you at some point changed the chunk size but did not clear the cache afterwards (this is required but often missed by users). This leaves behind chunks of inappropriate sizes that confuse the cache backend, leading to this exact error on specific files. The solution is to delete the cache and let it rebuild.
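
For what it's worth, clearing it usually just means deleting the cache backend's data directory while rclone is not running. On Linux the defaults live under ~/.cache/rclone; your path may differ if you set --cache-db-path or --cache-chunk-path:

rm -rf ~/.cache/rclone/cache-backend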
