What is the problem you are having with rclone?
I've got some files stored on Google Drive, encrypted with gocryptfs from before it introduced the deterministic file names option. This means that listing directories incurs some overhead whenever the gocryptfs.diriv files aren't cached. For this reason, when I use the Google Drive desktop app I set the .diriv files to always be available offline. What would be the best way to keep these diriv files cached when using rclone mount on Windows instead? One workaround I was considering is running a scheduled task that lists the directories in the background before --vfs-cache-max-age expires.
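The scheduled-task idea could be sketched roughly like this: walk the mount and read every gocryptfs.diriv so the VFS cache has touched them recently. This is only a sketch; warm_diriv_cache is a hypothetical helper name, and whether a plain read is enough to keep an entry fresh under --vfs-cache-max-age is an assumption, not confirmed rclone behaviour.

```python
import os

def warm_diriv_cache(root):
    """Walk root (e.g. the Z: mount) and read each gocryptfs.diriv,
    so the rclone VFS cache sees them as recently accessed.
    Hypothetical warm-up helper; returns how many files were read."""
    warmed = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        if "gocryptfs.diriv" in filenames:
            path = os.path.join(dirpath, "gocryptfs.diriv")
            with open(path, "rb") as f:
                f.read()  # assumption: a read refreshes the cache entry
            warmed += 1
    return warmed

if __name__ == "__main__":
    # Hypothetical usage from a scheduled task; Z:\ is the mount point
    # from the service command below.
    print(warm_diriv_cache(r"Z:\\"))
```

A script like this could then be run from Windows Task Scheduler at an interval shorter than the 24h --vfs-cache-max-age used in the mount command.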
Run the command 'rclone version' and share the full output of the command.
rclone v1.60.1
- os/version: Microsoft Windows 11 Pro 22H2 (64 bit)
- os/kernel: 10.0.22621.819 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.19.3
- go/linking: static
- go/tags: cmount
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
Drive mounted via a Windows service set to:
C:\ProgramData\chocolatey\bin\rclone.exe mount gdrive: Z: --config C:\rclone\rclone.conf --vfs-cache-mode full --cache-dir J:\Cache --vfs-cache-max-age 24h0m0s --log-file "C:\Users\Nter\AppData\Local\Temp\ZDriveMapperLog.txt"
The rclone config contents with secrets removed.
Editing existing "gdrive" remote with options:
- type: drive
- client_id:
- client_secret:
- scope: drive
- token:
- team_drive:
A log from the command with the -vv flag
N/A