I am running a crypt+VFS rclone mount of my Google Drive, and I think some files exist on my Drive that don’t appear in the mount. I mistook the size command for a different one: I thought it was info, so I ran that on my VFS remote instead, and it ended up creating files on Drive that weren’t seen by the mount (which makes me think they weren’t encrypted and are corrupted). I deleted them manually since they were in the main directory, but it makes me wonder if there are more in subdirectories, e.g. gdrive:folder1/folder2/folder3.
Is it possible to find files that exist on Google Drive but aren’t in the crypt? Or empty files, either on the crypt or on the raw gdrive remote? And is there a way to list files that fail to decrypt with a “bad password” or bad-blocks error?
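One rough way to spot a mismatch, assuming the underlying remote is named `gdrive:` and the crypt remote on top of it is named `gcrypt:` (adjust to your own remote names), is to compare totals and then try decoding any suspicious raw filenames:

```shell
# Compare object counts/sizes between the raw remote and the crypt
# layer; a higher count on the raw side suggests files that were
# written outside the crypt.
rclone size gdrive:
rclone size gcrypt:

# Try to decode a suspicious encrypted filename seen on the raw
# remote. A name that was not written through the crypt layer
# should fail to decode. (Replace the placeholder with a real name.)
rclone cryptdecode gcrypt: some-encrypted-filename
```

This won’t directly surface “bad password” read errors, but a filename that `cryptdecode` rejects almost certainly didn’t come through the crypt remote.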
I haven’t seen an issue where a file goes missing, but I tend to deal with Plex-related files rather than small ones. The longest delay you should see before something appears is 1 minute, as that’s the polling interval.
I wouldn’t recommend anyone repeat this, but I ran that (incorrect) info command instead of the correct size command, and it created empty files with random-character names in the root of the crypt remote. They appear on Google Drive but not in the mount, even after clearing the cache and remounting. If I can search through the regular gdrive remote (beneath the crypt) and find empty files, maybe that would fix the problem?
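One sketch of that search, assuming the raw remote is named `gdrive:`: `rclone ls` prints size then path, so a simple awk filter picks out the zero-byte entries.

```shell
# List every file on the raw (unencrypted) remote and keep only
# the zero-byte ones. "gdrive:" is an assumed remote name.
rclone ls gdrive: --fast-list | awk '$1 == 0 {print $2}'

# Demonstration of the same filter on sample "rclone ls"-style
# output (size, then path, one file per line):
printf '0 junkfile1\n52428800 Movies/film.mkv\n0 junkfile2\n' \
  | awk '$1 == 0 {print $2}'
# prints:
# junkfile1
# junkfile2
```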
My bad, I meant refreshing the VFS cache, not clearing it. I run a VFS refresh over the rc, and it would still show the files. Now, after they have been deleted, I get listing errors from the API on every refresh, which makes me think residual files may still exist from that command mistake.
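For reference, the refresh I run looks like this (assuming the mount was started with `--rc` so the remote-control API is listening on the default port):

```shell
# Ask the running mount to re-read directory listings from the
# backend into the VFS directory cache, recursively.
rclone rc vfs/refresh recursive=true
```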
I don’t have a log of it, unfortunately. I remember that when I ran it, it came back with some debug info, so I quickly Ctrl-C’d out of it and ended up with the files. I’ll try ls and see if any empty files appear, or possibly delete all files created on the day of the error.
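A sketch of the delete-by-day approach, assuming the raw remote is `gdrive:` and the mistake happened roughly a day ago: rclone’s `--min-age`/`--max-age` filters (which match on modification time) can bracket the window, and `--dry-run` previews before anything is removed.

```shell
# Preview files last modified between 1 and 2 days ago; nothing is
# deleted while --dry-run is set. Adjust the ages to bracket the
# day of the mistake.
rclone delete gdrive: --min-age 1d --max-age 2d --dry-run -v

# Once the dry-run listing shows only the junk files, re-run the
# same command without --dry-run to actually delete them.
```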
No errors when executing that command, oddly enough. Although before adding a --tpslimit, I did get many “Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota.” messages. I have the default quota settings. I’m wondering if that’s the cause of the errors: going over the API limit and making other things, like playback, slow to a crawl. I also use the analyze feature in Radarr and Sonarr, which is nice to have, but not if it raises transactions per second enough to go over the limit.
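For illustration, a mount invocation with the rate capped might look like the following. The remote name, mount point, and the value 10 are assumptions: 10 TPS is a commonly suggested starting point with default Drive project quotas, not a documented requirement.

```shell
# Cap Google Drive API transactions per second so the mount stays
# under the project quota; tune the number based on how many 403s
# you still see.
rclone mount gcrypt: /mnt/media \
    --tpslimit 10 \
    --tpslimit-burst 10 \
    --vfs-cache-mode writes
```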
403 rate limits are very common. Are you using your own API key? If you don’t limit the request rate to Google a bit, you get those; rclone just retries, though.
If you go over, you get rate limited, which tells rclone to back off and slows everything down. You have to find the sweet spot: enough requests, but not too many. More isn’t always better.