Command "rclone rc vfs/refresh" fails with an error: Unknown key "fs"
I need to supply an fs, because there are multiple cloud drives. What am I missing here?
What is your rclone version (output from rclone version)
rclone v1.53.2
os/arch: linux/amd64
go version: go1.15.3
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Docker on QNAP
Which cloud storage system are you using? (eg Google Drive)
Various (google drive, dropbox, pcloud)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone rc vfs/refresh fs=dropbox: recursive=true
Adding a dir parameter does not make a difference either.
And the fs does exist. It is the same name as used in the mount, and when I use a name that does not exist I get a different error: no VFS found with name
A log from the command with the -vv flag
2020/11/06 12:41:54 DEBUG : rclone: Version "v1.53.2" starting with parameters ["rclone" "rc" "vfs/refresh" "fs=dropbox:" "recursive=true" "--log-file=log.txt" "-vv"]
2020/11/06 12:41:54 DEBUG : 4 go routines active
2020/11/06 12:41:54 Failed to rc: Failed to read rc response: 500 Internal Server Error: {
"error": "unknown key \"fs\"",
"input": {
"fs": "dropbox:"
},
"path": "vfs/refresh",
"status": 500
}
And when I run the command as you suggested (without the fs parameter), I get this error:
2020/11/06 14:09:59 Failed to rc: Failed to read rc response: 500 Internal Server Error: {
"error": "more than one VFS active - need \"fs\" parameter",
"input": {
"recursive": "true"
},
"path": "vfs/refresh",
"status": 500
}
The dir option is for a path on a mounted remote, not for specifying a remote.
I'm not sure you can specify a remote, but maybe @darthShadow knows for sure. I don't think you can, but I've been wrong before and I'm sure I'll be wrong again.
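For example, against a single mounted remote the intended use is something like this (the Media directory is just a made-up example path):

rclone rc vfs/refresh recursive=true
rclone rc vfs/refresh dir=Media recursive=true

The first refreshes the whole directory tree of the one active VFS; the second limits the refresh to the Media sub-directory of that mount.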
I already suspected it was a bug. Thanks for confirming it.
It will be hard for me to test, as I am currently using it inside Docker on a QNAP. So I'll need a little patience there...
I'll try it out on an Ubuntu machine somewhere next week.
I just mounted a single cloud drive to test vfs/refresh, but no matter what I try, the cache is not prefetched (e.g. I added all kinds of longer timeout options and vfs cache read options, but nothing I tried seems to make a difference). This cloud drive has GBs of data, but nothing is prefetched.
I tried the following with the _async option as @Animosity022 suggested.
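Something along these lines (the exact invocation isn't quoted here; _async=true just makes the rc call return a job id immediately instead of waiting for the refresh to finish):

rclone rc vfs/refresh recursive=true _async=true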
The command completes without error in roughly 5 seconds, but nothing seems to happen.
I thought the files would be pre-cached in the folder .cache/vfs/googledrive/, since when I browse through files they also appear there (and once they are there, you can quickly browse over them).
Any clues what I could have forgotten?
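For reference, the mount being tested is roughly along these lines (the mount point and exact flags here are assumptions, not the literal command used):

rclone mount googledrive: /mnt/googledrive --rc --vfs-cache-mode full -vv

The .cache/vfs/googledrive/ folder mentioned above is the file cache that --vfs-cache-mode full fills as files are read.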
EDIT: the log with the -vv flag also shows dreadfully little:
2020/11/06 19:05:36 DEBUG : rclone: Version "v1.53.2" starting with parameters ["rclone" "rc" "vfs/refresh" "recursive=true" "--log-file=log.txt" "-vv"]
2020/11/06 19:05:40 DEBUG : 4 go routines active
I've merged this to master now, which means it will be in the latest beta in 15-30 minutes and released in v1.54. If we make a 1.53.3 then it will go there too!
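Once you are on that beta (or v1.54) the original command should be accepted as-is, e.g.

rclone rc vfs/refresh fs=dropbox: recursive=true

Recent versions also have rclone rc vfs/list, which shows the names of the active VFSes if you're unsure what to pass as fs.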