I mentioned on GitHub that I’m having trouble uploading files via rclone mount (1.46, and also tested with the latest beta today) to an encrypted gdrive on Windows 10. The drive becomes inaccessible for reading until the upload task finishes. When the problem happens, the drive simply shows up as unreadable in Explorer, and there’s nothing you can do about it.
My mount command:
rclone mount gcrypt: G: --allow-other --bind 192.168.1.123 --dir-cache-time 72h --fast-list --cache-chunk-path=X:\rclone-cache --cache-db-path=X:\rclone-cache --log-level INFO --vfs-cache-mode writes --cache-tmp-upload-path=X:\rclone-upload --cache-dir=X:\rclone-cache -P --drive-chunk-size 32M --buffer-size 256M --vfs-read-chunk-size 128M --vfs-read-chunk-size-limit off --rc
(I know some of those flags are redundant when I’m just using plain crypt+drive with the VFS — the cache-backend flags in particular — but I don’t think they’d be the problem.)
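For what it’s worth, here is a trimmed-down version of the same mount with the flags that only apply to the cache backend removed (the `--cache-chunk-path`, `--cache-db-path`, and `--cache-tmp-upload-path` flags are no-ops without a `cache` remote, and `--fast-list` has no effect on mount). This is just a sketch of what I believe is the equivalent command, not a fix:

```shell
rclone mount gcrypt: G: ^
  --allow-other ^
  --bind 192.168.1.123 ^
  --dir-cache-time 72h ^
  --cache-dir=X:\rclone-cache ^
  --vfs-cache-mode writes ^
  --drive-chunk-size 32M ^
  --buffer-size 256M ^
  --vfs-read-chunk-size 128M ^
  --vfs-read-chunk-size-limit off ^
  --log-level INFO ^
  --rc
```

The behavior is identical for me with either command, so the redundant flags don’t seem to be the cause.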
rclone.conf is default other than the client key and `use_trash = false`.
Sonarr downloads into a temporary folder, then moves completed files into the GDrive mount. Plex then streams from that same mount. But whenever Sonarr is uploading (particularly several files at once), the mount almost always locks up until the uploads finish, so nothing can be streamed in Plex in the meantime.
There is nothing useful in the -vv log when this happens. It simply keeps printing a line like

2019/03/22 10:59:16 DEBUG : <encrypted filename> : Sending chunk 1409286144 length 33554432

every 3 seconds. Once the upload finishes, everything resumes working, but it’s a huge problem for me that the mount can’t serve reads while it’s uploading.
Is there any real workaround for this that I could try? (I would prefer to avoid hacking up something that uses `rclone copy` with Sonarr.) I have a 1000/50 connection, so the uplink gets quite bogged down when uploads happen — but otherwise everything works great. Thanks for any ideas!
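One thing I’m considering trying in the meantime: capping the mount’s upload bandwidth with `--bwlimit` so the 50 Mbit/s uplink isn’t fully saturated during uploads. This is only a guess on my part that the lockup is bandwidth starvation rather than a locking issue in the VFS layer — the 4M figure below is an assumption (50 Mbit/s ≈ 6 MB/s, so this leaves roughly a third of the uplink free):

```shell
rclone mount gcrypt: G: --vfs-cache-mode writes --bwlimit 4M --transfers 2 --cache-dir=X:\rclone-cache --rc
```

`--transfers 2` (also an assumption on my part) would additionally limit how many files upload concurrently. Uploads would take longer, but if the cause really is a saturated uplink, reads might stay responsive.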