First off: I'm new here, just tried rclone, and I'm excited. Great tool.
I work in the film industry, and the company I work for may have us continue working remotely for a very extended period. I've been educating myself on all the cloud has to offer, so I have some answers when we eventually discuss cutting out (or minimizing) our on-prem IT.
Cloud storage on its own (AWS, Azure, Wasabi...) is not immediately useful for us, because it's our workstation applications that need to access the many, many files (frame sequences). Rclone and the VFS cache let us use cloud storage in the way that makes the most sense for us.
But since we are dependent on large file sets, I figure we'll likely dedicate an SSD (500 GB-1 TB) to the file cache. I admit I have not run a test scenario, because I do not have a large cloud storage account to test with. So this is mostly theoretical for now, but something I hope to realize soon.
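For reference, this is roughly how I picture the mount being configured. The flags are from the rclone docs; the remote name, cache path, and sizes are just placeholders for our setup:

```shell
# Hypothetical workstation mount with the VFS file cache on a dedicated SSD.
# "studio:" is a placeholder remote name; adjust paths and sizes as needed.
rclone mount studio:projects /mnt/projects \
  --vfs-cache-mode full \
  --cache-dir /ssd/rclone-cache \
  --vfs-cache-max-size 500G \
  --vfs-cache-max-age 168h \
  --vfs-read-ahead 512M
```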
I would like to find a way to pre-cache large folders the night before the user needs them (initiated by the end user).
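My rough idea of how a user could kick off an overnight pre-cache job, based on my reading of the docs: `rclone rc vfs/refresh` warms the directory listings (it needs the mount running with the remote control enabled via `--rc`), and then reading every file through the mount should pull the actual data into the VFS cache. The folder and mount paths here are placeholders:

```shell
# Sketch of an end-user "pre-cache tonight" job (assumes the mount was
# started with --rc so the remote control API is available).
# 1. Recursively refresh the directory cache for the folder.
rclone rc vfs/refresh dir=shots/ep101 recursive=true
# 2. Read every file through the mount so the VFS cache downloads the data.
find /mnt/projects/shots/ep101 -type f -exec cat {} + > /dev/null
```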
I am also looking to achieve something like the Windows "always available offline" feature: a cache that does not expire for that same large folder. Because what's the most devastating problem every remote worker will eventually experience? Loss of internet. Or you know you're going to be traveling, so you cache your project files before you go.
So maybe I'm thinking of something like a vfs-cache-mode "sync"... but it can't be for everything on the cloud, of course, only select folders.
Anyway, this feature (or the ability to configure rclone to work this way) would be the key to having a solution to bring to our remote pipeline. I have checked out LucidLink, and they don't appear to have as granular control over file caching as you do. Would love to hear your thoughts.