Google Drive with Plex and local read-only cache

C:\rclone>cd C:\rclone
{
        "result": {
                "": "OK"
        }
}
Priming worked perfectly now. How many times a day is it safe to run this command?

Up to you. I do it once, sometimes twice a day. That's usually all I need, as I only do one Plex scan.

Do you scan Plex manually? I saw that you can update a collection via the command line on Windows.

Yes, I prefer to do everything manually.

As I mentioned up above, you can write a simple script (something like the sketch below) to:

  1. prime the cache
  2. run the Plex scan
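In Python it could look roughly like this (a minimal sketch, assuming rclone is on the PATH with the remote control enabled on the mount, and that Plex Media Server is in its default install location; the library section ID is a placeholder):

import subprocess

# Prime the rclone VFS directory cache via the remote control API.
subprocess.run(
    ["rclone", "rc", "vfs/refresh", "recursive=true", "--timeout", "30m"],
    check=True,
)

# Kick off a Plex library scan from the command line.
# Install path and section ID are placeholders for your own setup.
PLEX_SCANNER = r"C:\Program Files (x86)\Plex\Plex Media Server\Plex Media Scanner.exe"
subprocess.run([PLEX_SCANNER, "--scan", "--section", "1"], check=True)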

I understand. I've already set up some automations.

Ombi handles requests, which go to Sonarr and Radarr. From there another server encodes the media with an encoding configuration I made and then uploads it to Google Drive. This process runs 24/7, day and night, and once the media shows up in Plex, a Python script of mine sends the Plex links to a WhatsApp group.

Each task runs on a different device in my network; no processing happens on the Plex server. The Plex server now only runs rclone, Plex and Tautulli.
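The last step, the WhatsApp notification, is roughly this shape (a simplified sketch, not the real script; the webhook URL is a placeholder for whichever WhatsApp gateway you use):

import requests

# Placeholder endpoint for a WhatsApp gateway/bridge; swap in your own.
WEBHOOK_URL = "https://example.invalid/whatsapp/send"

def notify(title, plex_link):
    # Post a short "now available" message with the Plex link to the group.
    payload = {"message": f"Now on Plex: {title}\n{plex_link}"}
    resp = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    resp.raise_for_status()

notify("Example Movie (2020)", "https://app.plex.tv/desktop/...")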

Yes, I'll follow this path. Now that it looks like I won't be getting API errors anymore, I hope everything keeps going well.

Oh, for sure everything can be automated. Would save me hours every day, but I'm a masochist :stuck_out_tongue:. I don't do torrents, never have. Don't really have a need for Ombi either. In fact, I don't use any add-ons with Plex. I'm more of a vanilla kind of guy :wink:

Laughs, I understand. Yes, automation helps a lot, I like it. Now I just need to get used to these new procedures with rclone. The response time with these settings is great; the time to start watching a movie is very fast.

hmmm,
a masochist who is a vanilla kind of guy

You make it sound all kinky!

Yeah, automation with Python is the way to go.

I have a 400+ line Python script that automates VSS (Volume Shadow Copy), rclone, FastCopy and 7-Zip, and then scans the log files for errors.
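Stripped way down, the skeleton looks something like this (nowhere near the real 400+ lines; the tool paths and arguments are placeholders, and the VSS snapshot step is omitted):

import re
import subprocess

LOG = r"C:\scripts\backup.log"

def run(step, cmd):
    # Run one external tool and append its combined output to the shared log.
    with open(LOG, "a", encoding="utf-8") as f:
        f.write(f"\n===== {step} =====\n")
        subprocess.run(cmd, stdout=f, stderr=subprocess.STDOUT, check=False)

# Placeholder commands: the real paths, flags and destinations differ.
run("rclone", ["rclone", "sync", r"C:\data", "gdrive:backup", "--log-level", "INFO"])
run("fastcopy", [r"C:\tools\FastCopy\fastcopy.exe", "/cmd=sync", r"C:\data", r"/to=D:\mirror"])
run("7zip", [r"C:\Program Files\7-Zip\7z.exe", "a", r"D:\archive\data.7z", r"C:\data"])

# Scan the combined log for anything that looks like an error.
with open(LOG, encoding="utf-8") as f:
    errors = [line for line in f if re.search(r"error|failed", line, re.IGNORECASE)]
print(f"{len(errors)} suspicious log lines found")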

That's also my tag line on LinkedIn :smiley:

I don't do much, but I'm stubborn and I keep trying to get things working, even if it gives me a headache. lol :rofl:

Friends, a question: my films are four folder levels deep inside the drive. Does that influence anything (slow reading of the files)?
Example:
M:\_Minha.Biblioteca\Biblioteca\Filmes\Otimizados
M:\_Minha.Biblioteca\Biblioteca\Filmes\OtimizadosLeg

Makes no difference. What does matter is how many files and/or folders you have inside the same folder. For example, I have a folder called Movies, which has close to 12,000 subfolders. That slows things down a lot when scanning with Plex. Better to have \Movies\A\subfolders, \Movies\B\subfolders, and so on, but I'm too lazy to reorganize :slight_smile:
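If you ever do reorganize, a short script can do the bucketing. A rough sketch (the drive path is a placeholder, it assumes one flat Movies folder of per-title subfolders, and I'd try it on a test copy first):

import os
import shutil

# Placeholder path to the flat Movies folder full of per-title subfolders.
MOVIES = r"M:\Movies"

for name in sorted(os.listdir(MOVIES)):
    src = os.path.join(MOVIES, name)
    if not os.path.isdir(src):
        continue  # skip loose files
    # Letter bucket (A, B, C, ...); anything not starting with a letter goes to "0-9".
    bucket = name[0].upper() if name[0].isalpha() else "0-9"
    if name == bucket:
        continue  # already a bucket folder from a previous run
    dest = os.path.join(MOVIES, bucket)
    os.makedirs(dest, exist_ok=True)
    shutil.move(src, os.path.join(dest, name))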

Got it. I saw a good setup that uses a maximum of 10 subfolders, so I'm thinking of splitting my folders too:
/movie/A/subfolders
/movie/B/subfolders

It's a good idea.

I modified my priming so it only refreshes the folders where I want an updated listing on that server:

rclone rc vfs/refresh dir=_Minha.Biblioteca/Biblioteca/Filmes/Otimizados --timeout 30m
rclone rc vfs/refresh dir=_Minha.Biblioteca/Biblioteca/Filmes/OtimizadosLeg --timeout 30m
rclone rc vfs/refresh dir=_Minha.Biblioteca/Biblioteca/Series/Otimizadas --timeout 30m
rclone rc vfs/refresh dir=_Minha.Biblioteca/Biblioteca/Series/OtimizadasLege --timeout 30m

C:\Windows\system32>cd C:\rclone
{
        "result": {
                "_Minha.Biblioteca/Biblioteca/Filmes/Otimizados": "OK"
        }
}
{
        "result": {
                "_Minha.Biblioteca/Biblioteca/Filmes/OtimizadosLeg": "OK"
        }
}
{
        "result": {
                "_Minha.Biblioteca/Biblioteca/Series/Otimizadas": "OK"
        }
}
{
        "result": {
                "_Minha.Biblioteca/Biblioteca/Series/OtimizadasLege": "OK"
        }
}
Press any key to continue . . .

(Insert funny and very witty Linux joke here where folders and scanning don't matter)

Don't even get me started, I'm being stubborn about sticking with Windows. :rofl: