ls -al /home/xx/.cache
total 16
drwxrwxrwx 4 xx xx 4096 Mar 31 10:21 .
drwxr-xr-x 22 xx xx 4096 Apr 3 14:35 ..
drwx------ 2 xx xx 4096 Mar 30 11:38 dconf
drwxrwxrwx 2 xx xx 4096 Mar 31 15:15 rclone
Yeah, I am not sure what you are trying to do still.
If you are trying to set --vfs-cache-mode to full, rclone has to download the whole file before streaming it, so that's not a good option to use.
I am not sure what you mean by "the cache one works well."
Why are you trying to minimize API hits? You get 1 billion per day.
@ncw is rewriting the vfs cache backend as the 'cache backend' has no maintainer so I would not really use that.
There are multiple Windows users here (I'm not one of them) who stream via rclone without issues. Perhaps @VBB can chime in, as he's got his settings working quite well, to my understanding.
If the goal is to keep recently written files in the VFS cache, you can use 'writes' as the mode, since anything written would be kept for 72 hours per your settings. Keep in mind that when you write a file, the close doesn't return until the file has been uploaded to your remote.
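As a sketch, a mount using writes mode with a 72-hour cache age might look like the following. The remote name "gdrive:" and the mount point are placeholders for your own setup, not a recommendation:

```
# Hypothetical example: "gdrive:media" and /mnt/media are placeholders.
# Files are cached locally when written and kept for up to 72 hours.
rclone mount gdrive:media /mnt/media \
  --vfs-cache-mode writes \
  --vfs-cache-max-age 72h
```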
--buffer-size controls how much rclone reads ahead, but how much it helps depends on how the application opens and closes the file, since the buffer is discarded when a file is closed.
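For illustration only (the value here is arbitrary, not a recommendation), the flag is set on the mount like this:

```
# The buffer is per open file and thrown away on close,
# so repeated open/close cycles get little benefit from it.
rclone mount gdrive:media /mnt/media --buffer-size 64M
```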
You can try out the cache backend as that keeps things local, but again, that does not have a maintainer so it has not been fixed/updated for some time.
That's also the cache backend, with the same caveats as above. You can use --vfs-cache-mode writes in the VFS, but that only applies when a file is written and does nothing for reading. You do not want to use anything above writes, as that means rclone has to download the entire file before giving you a single byte of data back.
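For reference, a minimal cache backend remote in rclone.conf looks roughly like this. The remote names and values are placeholders, and again, the backend is unmaintained, so treat this as a sketch rather than a recommended setup:

```
# Hypothetical rclone.conf fragment wrapping an existing "gdrive:" remote.
[gcache]
type = cache
remote = gdrive:
chunk_size = 10M
info_age = 1d
chunk_total_size = 10G
```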
Google doesn't really ban, so I'm not sure what you mean by that. You only get banned if you violate their TOS.
Google does have daily API quotas per user, which is 1 billion per day. They also have a daily download quota of 10TB and an upload quota of 750GB per user. You can't see how you are doing against these anywhere, and there is little to no information on how these numbers are actually generated.
There isn't a magic bullet for any of this as it's very dependent on your setup and your clients. I've personally never hit the download quota per day and hit the upload quota here and there.
@DJWESTY1985 What's your Internet speed? Like @Animosity022 said, I've been using his settings for a long time now, and I have zero issues streaming even the largest UHD files. This is with a 150/150 connection, so certainly not the fastest.
I have a very large Plex library, which I scan once a day. I don't believe the scraping has ever resulted in an API ban.
Buffering, especially with large files, can easily be the result of latency, as I experienced not too long ago, when my ISP had major peering issues with Google services. For a couple of weeks, I was not able to play anything 4K, and even some of the HD stuff wouldn't play.