Google Drive with Plex and local read-only cache


I was looking at this topic and the result is great, but I would like to know if it is possible to enable a local SSD cache for this approach.

I was following this topic: Google Drive, Plex, Windows 10

I liked the settings, friend, but how do I enable a cache on my SSD?


hello and welcome to the forum,

you can add this to your mount command

https://rclone.org/commands/rclone_mount/
--cache-dir string Directory rclone will use for caching.

for example, if the ssd is z: then this would work:
--cache-dir=z:\cache

note: you can change the directory to whatever location you need
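
a minimal sketch of a complete mount command using that flag; the remote name gdrive: and the drive letter x: are just placeholders, adjust them to your setup:

@echo off
title Rclone Mount with SSD cache
rclone mount gdrive: x: --read-only --cache-dir=z:\cache -v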

Hi,
I set up my script like this; will it work?

first script:

@echo off
title Rclone Mount READ ONLY
rclone mount --attr-timeout 1000h --dir-cache-time 1000h --poll-interval 0 --rc --read-only -v drive-tiago: M: --cache-dir=c:\rclone_cache

second script:

@echo off
title Rclone Prime
rclone rc vfs/refresh recursive=true --timeout 10m

looks good,

about the second script: for testing, i would add a line at the end, so you can see the output:
pause

Perfect, friend, thank you very much. I will see how the performance goes.

I am purchasing an Nvidia Shield. I still don't know how I'm going to use rclone with it. Do you know if it's a good option for running Plex/Kodi on the Shield?

you might want to add this to the mount.
@VBB does not use it, but most users, including myself, do.
https://rclone.org/commands/rclone_mount/#vfs-cache-mode-full


would it look like this? or do I need to remove a parameter?
@echo off
title Rclone Mount READ ONLY
rclone mount --attr-timeout 1000h --dir-cache-time 1000h --poll-interval 0 --rc --read-only -v drive-tiago: M: --vfs-cache-mode full --cache-dir=c:\rclone_cache

yes, that looks good.
let me know if it works for you.

yes, it looks good. it created files as I use them in Plex.

my second script returned this:

C:\Windows\system32>cd C:\rclone
{
        "result": {
                "": "list: failed to resolve shortcut: googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=290374489880, userRateLimitExceeded"
        }
}

i do not use gdrive much, but gdrive has a lot of rate limiting.

if you add -vv to the command, you will get debug output for more detail.

i would log in to the link provided and take a look at what exactly the problem is.

keep in mind that a plex scan can download a tremendous amount of data.
one of our experts on gdrive has posted this guide:
https://github.com/animosity22/homescripts#reducing-api-usage--save-download-quota
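
on the rclone side, you can also slow the request rate with the global --tpslimit flag. for example, adding it to your existing mount; 10 requests per second is just a conservative starting point, not a value from that guide, so tune it for your quota:

@echo off
title Rclone Mount READ ONLY
rclone mount --attr-timeout 1000h --dir-cache-time 1000h --poll-interval 0 --rc --read-only -v drive-tiago: M: --vfs-cache-mode full --cache-dir=c:\rclone_cache --tpslimit 10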

I will apply these things to my collection to avoid these problems.

Sorry for the amount of questions, friend. What other drives do you recommend besides Google Drive?

hard to recommend without knowing what your use-case is.

gdrive is by far the most used backend here in the forum.

i use rclone mostly for backups of valuable data, not media streaming:
wasabi for hot storage, aws s3 glacier and glacier deep archive for cold storage.

for streaming, i have a vps in the cloud and a vpn for rclone to mount from it.

I understand, friend. I have been using Google with Plex for over a year and I have no problems. I was using RaiDrive and never had any problems either, but I want to learn how to use rclone, which has a wide range of configuration options.

rclone can be overwhelming at first, but you seem to have gotten the hang of it in just one post....

Yes, friend, I believe I understand how it works now, thankfully. I need a way to re-run my script on a schedule; I had to run it manually to pick up the changes I made. Usually I update my collection every 6 hours, or every 4 hours. I need to see how to do that.

you can write a batch script and run it using task scheduler (see the sketch after this list).

  1. prime the cache
  2. plex scan
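
a rough sketch of such a script; the plex scanner path and the library section id are assumptions, so check them on your install (running the scanner with --list prints your section ids):

@echo off
title Rclone Prime and Plex Scan
rem 1. prime the dir cache (the mount must already be running with --rc)
rclone rc vfs/refresh recursive=true --timeout 10m
rem 2. kick off a plex scan of library section 1
"C:\Program Files\Plex\Plex Media Server\Plex Media Scanner.exe" --scan --section 1

to run it every 6 hours, you can create the task from a command prompt (assuming you saved the script as c:\rclone\prime_and_scan.bat):

schtasks /create /tn "Rclone Prime and Scan" /tr c:\rclone\prime_and_scan.bat /sc hourly /mo 6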

To prime the cache, it would be exactly this, correct?

@echo off
title Rclone Prime
rclone rc vfs/refresh recursive=true --timeout 10m
pause

yes, we call it priming the cache....