I ran rclone copy "G:\FOLDER\FOLDER\FOLDER X\" gdrive_crypt:\ -P --transfers 5 overnight and woke up to a series of errors, more than the command prompt buffer could hold (i.e., I cannot even see all of the errors). Folder X has 3 subfolders, containing 16, 19 and 344 split .rar archives respectively. Rclone copied 16, 19 and 240 archives.
2020-03-28 08:05:48 ERROR : 003-subfolder3/archive.part344.rar: Failed to copy: couldn't list directory: Get https://www.googleapis.com/drive/v3/files?a[deleted]=true: dial tcp: lookup www.googleapis.com: getaddrinfow: The requested name is valid, but no data of the requested type was found.
2020-03-28 08:05:48 ERROR : Attempt 1/3 failed with 104 errors and: couldn't list directory: Get https://www.googleapis.com/drive/v3/files?[deleted]=true: dial tcp: lookup www.googleapis.com: getaddrinfow: The requested name is valid, but no data of the requested type was found.
Just now, I re-uploaded one of the missing archives (.part241.rar) and it uploaded without errors.
(1) What is causing those errors and what to do to prevent them?
(2) Are the files that were uploaded to the cloud (and are visible in rcloneBrowser) uncorrupted for certain?
(3) From what I understand, you cannot view the hashes of files in encrypted remotes... Is there any way to check file integrity of files in encrypted remotes other than re-downloading them? If there is, what resources does that operation use (I assume that it's all done in RAM, rather than on HDD)?
(4) This is unrelated to my main question, but I wanted to know this for future reference. Is it safe to use Ctrl+S to pause file transfers (in terms of potential data corruption)? Can I safely pause transfers in that way for several hours for example (if I wanted to free up my bandwidth during that time)?
(5) Another unrelated question. Is it safe to do other stuff on my PC and online, while rclone is transferring files in the background or is there risk of data corruption?
Windows 7 x64; rclone v1.51.0 (os/arch: windows/amd64; go version: go1.13.7); I have already created an individual client ID for rclone.
Those look like some kind of networking problem - did your internet connection drop or something like that?
If they've arrived then they should be fine.
Use rclone cryptcheck - that will read the start of each file online (to get the nonce) and use that to encrypt and hash your local file. That means reading each of the local files, but the data isn't stored in RAM, so it is a memory-unintensive operation.
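A sketch of what that invocation might look like, reusing the poster's paths from above (substitute your own local folder and crypt remote):

```shell
# Check the integrity of local files against their encrypted copies on the
# remote, without downloading the remote files in full.
rclone cryptcheck "G:\FOLDER\FOLDER\FOLDER X" gdrive_crypt: -P
```

Mismatched or missing files are reported as errors in the output, so a clean run (0 differences) means the uploads match the local copies.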
Yes, it is safe to do that. The transfer might break, in which case rclone will retry the segment or file it was working on. You'll never get corrupted files on Google Drive, as files that aren't finished properly will never appear.
Rclone will check the checksum (using a method like cryptcheck) after it is uploaded.
It is fine! rclone can be a bit disk and bandwidth intensive but it shares nicely and won't corrupt your files.
Data integrity is my number one concern with rclone, so I appreciate your questions.
Thank you very much, Nick, for addressing all my concerns. I'll sleep much better now!
This could very well be the case. Are there any preventive measures that I can take so that rclone doesn't go into a loop of errors? Are there perhaps any flags that would tell rclone to wait a specified amount of time (say, 30 minutes) before making another attempt? Ideally, this would happen after the first connection error, rather than after cycling through all the files to transfer.
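For reference, rclone does expose retry-related flags that can be combined with the original command; the values below are illustrative, and --retries-sleep waits between whole retry passes rather than pausing immediately after the first connection error:

```shell
# Retry the whole run up to 5 times, sleeping 30 minutes between passes,
# and allow more low-level retries per operation for flaky connections.
rclone copy "G:\FOLDER\FOLDER\FOLDER X" gdrive_crypt: -P --transfers 5 ^
    --retries 5 ^
    --retries-sleep 30m ^
    --low-level-retries 20
```

(The `^` is the Windows cmd line-continuation character.)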
That's great. Is data written to the HDD/SDD in this process at all?
BTW, I'm currently trying to move files that contain the word "Monkey" from a folder to a subfolder (on the same remote), but rclone gives me errors. It seems that I can only copy files, but not move. Is this correct? The move code is as follows.
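For comparison, a hypothetical sketch of a filtered move on one remote (the folder names are made up; note that rclone may reject a destination that sits inside the source as overlapping, so a sibling folder is shown here):

```shell
# Move only files whose names contain "Monkey" into another folder
# on the same remote, showing progress.
rclone move gdrive_crypt:Folder gdrive_crypt:Sorted --include "*Monkey*" -P
```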