I’m currently working on a standalone Google FUSE solution. The Google API is quite capable of delivering the stats of all files in a directory in only one request, so scanning currently takes just one request per library. By watching the “changes” endpoint of the API it is quite easy to flush the correct caches.
Also, my solution saves all created files in a separate local folder, automatically uploads files with an mtime older than 1 h to Google, flushes the internal caches, and deletes the local file after a successful upload. So this should be a perfect fit for the Plex optimizer.
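The upload-after-1h policy could be sketched in shell roughly like this; the staging path and the `remote:` name are placeholders, not the actual config of my app:

```shell
#!/bin/sh
# Sketch of the staging policy described above (paths are placeholders).

# List files in the staging folder whose mtime is older than 60 minutes.
list_stale() {
  find "$1" -type f -mmin +60
}

# Upload each stale file; "rclone moveto" deletes the local copy only
# after a successful upload, matching the behaviour described above.
# "remote:" is a placeholder remote name.
upload_stale() {
  staging=$1
  list_stale "$staging" | while IFS= read -r f; do
    rclone moveto "$f" "remote:${f#"$staging"/}"
  done
}
```

Run `upload_stale /path/to/staging` from cron and only files untouched for an hour get moved.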
If its not against any board rules I’ll post a github link here when I think the app is ready for a beta test.
But it seems that I don’t fully understand the Google API restrictions. Yesterday I managed to get my drive locked, but the API console shows a total of approx. 950 API requests for the whole day, so the 1000 requests / 100 seconds limit cannot be the problem.
Currently I need one request per directory for names, size, MIME type etc., and one request per file to start a stream and consume the data.
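For reference, the one-request-per-directory listing maps onto the Drive v3 `files.list` call with a `fields` selector. A hedged curl sketch, where `FOLDER_ID` and `ACCESS_TOKEN` are placeholders and the request only fires if a token is actually set:

```shell
# Placeholders: substitute your own folder ID and OAuth token.
FOLDER_ID=${FOLDER_ID:-your-folder-id}
QUERY="'$FOLDER_ID' in parents and trashed = false"
FIELDS="files(id,name,size,mimeType,modifiedTime)"

# One files.list request returns name, size and MIME type for every
# file in the folder, which is what makes one request per directory work.
if [ -n "${ACCESS_TOKEN:-}" ]; then
  curl -s -G "https://www.googleapis.com/drive/v3/files" \
    --data-urlencode "q=$QUERY" \
    --data-urlencode "fields=$FIELDS" \
    --data-urlencode "pageSize=1000" \
    -H "Authorization: Bearer $ACCESS_TOKEN"
fi
```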
@ncw Maybe you have more information about this problem?
I am currently using Plex with rclone and Google Drive on Synology DSM 6 with no issues whatsoever, and my library has more than 6000 files (that number increases by about 100 files every day or two).
When I first started playing with rclone I was getting disconnected very often. Instead of asking rclone to list and remount, I created a task that runs every 5 minutes (ls /foldermounted) and changed the way Plex updates the library to check only for changes in folders with new files. That’s been working for 10 days now with 0 issues: no bans and no connection problems.
This is my current command to mount:
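The 5-minute keep-alive task would look something like this as a crontab entry (assuming /folderNAS is the mount point; add it with `crontab -e`):

```
# Touch the mount every 5 minutes so the FUSE connection never goes idle.
*/5 * * * * ls /folderNAS > /dev/null 2>&1
```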
rclone mount --allow-non-empty --allow-other --read-only remote:/ /folderNAS/ &
I think the big problem with Google Drive is scanning a library from scratch. Once you have it up and running, Plex will only scan the new files.
As far as I understand it, during a scan Plex opens every file at 2-4 different locations for probing, and I don’t see a way of avoiding or caching this. Throttling API requests didn’t help me either (one request every 250 ms).
Next time my drive unlocks I’ll try to monitor the total data consumed by a Plex scan and see whether there might be a daily download limit.
Or does anyone have any clues?
This won’t work on mounted drives, as it expects the OS to send file-change notifications, and those only work on local disks.
At the moment I’m updating via CLI only the library I uploaded stuff to, e.g. Movies if I uploaded a new movie, but I’m thinking about extending the script to update only the newly added folders.
I’m waiting for my new server, since on the current one I don’t have nginx, and since I’m using a different server to upload stuff I need to make some kind of web call for it (the Plex default web service does not support passing folders, just section updates).
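For per-folder updates from the CLI, the Plex Media Scanner binary (as opposed to the web API) does accept a directory. A hedged sketch, where the install path, the section number, and the folder are all assumptions you’d need to adapt (check your section numbers with `Plex Media Scanner --list`):

```shell
# Path and section number are assumptions, not a universal default;
# the scan is simply skipped if the binary is absent on this machine.
SCANNER="/usr/lib/plexmediaserver/Plex Media Scanner"
NEW_DIR="/folderNAS/Movies"   # folder that just received new files

if [ -x "$SCANNER" ]; then
  # --directory limits the scan to one folder inside the section.
  LD_LIBRARY_PATH=/usr/lib/plexmediaserver \
    "$SCANNER" --scan --refresh --section 1 --directory "$NEW_DIR"
fi
```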
Actually, this does work if you’re using a UnionFS to join the local media folder and a mounted GDrive/ACD together.
New videos are put in the local folder and this triggers the OS notifications. This is how I’ve been doing it for a few months. It’s great because Plex detects changes immediately so new media shows up almost instantaneously.
This of course only works if your downloader and your Plex server are the same machine.
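A minimal sketch of that UnionFS layout, assuming unionfs-fuse and placeholder paths; the mounts only run if the tools are actually installed:

```shell
LOCAL=/local/media    # writable branch where new downloads land
GDRIVE=/mnt/gdrive    # read-only rclone mount
UNION=/mnt/union      # merged view that Plex points at

if command -v rclone >/dev/null && command -v unionfs-fuse >/dev/null; then
  mkdir -p "$LOCAL" "$GDRIVE" "$UNION"
  rclone mount --read-only --allow-other remote:/ "$GDRIVE" &
  # cow = copy-on-write: new files land in the RW branch, so the OS emits
  # inotify events there and Plex picks up changes almost immediately.
  unionfs-fuse -o cow,allow_other "$LOCAL"=RW:"$GDRIVE"=RO "$UNION"
fi
```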
That’s interesting, I checked and you are right, it does scan the whole library.
I have followed one of your cronjobs to remount and it works great.
Now I would like to create a task to update the Plex library via CLI, scanning only newly added files to reduce the number of requests. How should I do it? My Plex server is mounted on Synology, if that helps.