This should, from what I'm hearing, absolutely avoid bans when doing a Plex scan.
Grep to see if there are any folders that it couldn't parse. Do you have any 0-length folders or files?
The code indicates that it won't parse things that don't have a content length.
I would grep to see if you have any other log messages along those lines. I bet it's garbage files, honestly. How organized and clean are your files? I mean, 30 TB? That's a lot… haha
You could alter the code to log the output of `items_to_parse_from_google` to see what it is that's not parsing.
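If it helps, here's a rough sketch of that kind of grep. The log path and the message wording are assumptions on my part (check your own log for the exact phrasing), and the sample log is fabricated just so the commands have something to match:

```shell
# Hypothetical log path and messages: substitute your own
LOG=/tmp/example-gdrive.log
cat > "$LOG" <<'EOF'
info: parsed folder Movies
warn: file has no content length, skipping
warn: folder not yet parseable: 0B3xEXAMPLE
info: parsed folder TV
EOF

# Count log entries that look like parse failures
grep -ci "parseable\|content length" "$LOG"   # prints 2 for the sample above

# To hunt for 0-length files in a local copy of your media:
# find /path/to/media -type f -size 0
```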
Hmm, I'm quite rusty at JavaScript and have little experience with Node.
I'm mounting my Google Drive, but the Media folder is encrypted with rclone crypt; not sure if it's balking at that.
For example, I recently added Planet Earth II (the documentary) and it isn't showing at all when I mount the folder. When I mount via rclone mount, it shows fine (and plays fine).
Also, every time I run `node client.es6.js` to rebuild the index, I get a slightly different number of files/folders that are not parseable, which makes me think it's an API limitation, timeout, lack of retries, or something. One time I'll get 165 folders; if I re-run immediately, it's 164…
Very odd.
Yeah… I checked your issue to see if I had the same problem. I blew away my database and set everything up on a testing machine. I got 1 file that was not yet parseable. It was a subtitle (.srt) file. But I checked back and found that a few seconds later it had parsed and shown up. So it might be that you are being rate limited. You can `tail -f` the log file and see if new files are being picked up every 60 seconds. I suspect that because your library is so large, it might take a while for the initial scan to complete. Just my guess.
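To make that concrete, here's a minimal sketch of comparing counts between scan passes; the log path and message format are invented for illustration (for live watching you'd do something like `tail -f <logfile> | grep parsed` instead):

```shell
# Fabricated sample log standing in for the real one
LOG=/tmp/example-scan.log
printf 'parsed: a.mkv\nparsed: b.mkv\n' > "$LOG"

before=$(grep -c '^parsed' "$LOG")
printf 'parsed: c.srt\n' >> "$LOG"   # simulate the next 60-second pass
after=$(grep -c '^parsed' "$LOG")

echo "new files picked up: $((after - before))"   # prints: new files picked up: 1
```

If that count keeps climbing on every pass, the files are coming in and you're just waiting out the initial scan.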
I decided not to do an encrypted volume on Google Drive. I know that Google encrypts the contents of Drive automatically. So… they say they can't read it. We know that Amazon does NOT do this, so I did have an encrypted mount when I was using ACD. Also, I have Plex Pass, and it's nice to use Plex Cloud. If I get a DMCA takedown from Google… meh, I'll just use ACD until I replicate back to GDrive encrypted.
You can just add this line after line 298 in client.es6.js:
`logger.debug(items_to_parse_from_google);`
Then cat your log file; you will have tons to see… hahaha.
I may go that route, but here's a question:
For the initial scan, it tells me it can't find 3000+ files and 165 folders, so there's probably lots of stuff missing.
Now, if I've missed the initial scan, what are the chances it will find them later on? My understanding of the process is that the initial scan (`node client.es6.js`) is used to build the initial database, and then it just looks for "changes" from the API.
If these are not really changes and should have been picked up in the initial scan, these folders won't necessarily change, so I got the impression they wouldn't appear in the changes API.
Not sure if that makes sense.
Also, I did edit the file to add the output of the `notFound` variable and did get a lot of data. Unfortunately, I can't match any of the names with my `rclone lsl crypt: --crypt-show-mapping` and `rclone lsd crypt: --crypt-show-mapping` output.
Huh… interesting. Remember, node-gdrive-fuse mounts to the root of your Google Drive account, where rclone gives you the option to mount within a folder off the root. Is your Plex stuff off the root? Is it possible those files are outside your rclone root? Make sense?
Did your Planet Earth stuff eventually show up? I've never seen a problem of missing files on my end, so I wonder if the rclone crypt is the problem.
So here's my setup so we're on the same page:
- `~/.media/gd-encrypted`: raw GDrive mount (via node-gdrive-fuse)
- `~/.media/gd-decrypted`: rclone local crypt mount, which points at the Media subdir of the above (`~/.media/gd-encrypted/Media`), where the encrypted content is
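For reference, the second mount is just a crypt remote whose backing remote is a local path. A hypothetical `rclone.conf` entry (the path matches my layout above; the passwords are the obscured values `rclone config` generates, not literals):

```
[crypt]
type = crypt
remote = /home/user/.media/gd-encrypted/Media
filename_encryption = standard
password = <obscured password>
password2 = <obscured salt>
```

which I then mount with `rclone mount crypt: ~/.media/gd-decrypted`.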
When I do it this way, there are folders I can tell right away don't appear in the gd-decrypted folder, specifically that Planet Earth one and some others (I have a Plex plugin/channel called FindMissing.bundle that will find files that are missing, so this makes it easy).
If I mount the crypt directly with rclone (instead of the way I mentioned above), the folders are definitely there (and I can play from them).
So it's weird.
Looks like there's a lot of talk about node-gdrive-fuse in here. The biggest downside is the fact that it is no longer being developed. My main Plex server uses node-gdrive-fuse with a script that runs every 30 seconds and checks the mount status. I last tweaked it about two weeks ago, and I'm happy to report that the setup is completely hands-off. I can't even remember the last time someone bugged me about any media being unavailable.
Right now the only thing I'm using rclone for is uploading media; I haven't used ACD in quite a while because I wasn't happy with how seeking/startup time worked. Google Drive just seems to be a superior service, and that's no fault of rclone's; node-gdrive-fuse works because it caches the directory.
Also, for reference's sake, my library is just about 70 TB. If you have issues with node-gdrive-fuse, you need to clear the cache and try again.
I've set up a couple of download servers over the last week, and it takes me right around 30 minutes to get Sonarr/Radarr/NZBGet/node-gdrive-fuse etc. set up and going without a hitch.
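For anyone curious, the mount-status check in a script like that can be sketched roughly as follows. This isn't my actual script; the mount directory and the restart step are placeholders, and it assumes a Linux host with /proc/mounts:

```shell
#!/bin/sh
MOUNT_DIR="$HOME/.media/gd-encrypted"   # placeholder path

is_mounted() {
    # /proc/mounts lists each active mount point as the second field
    grep -qs " $1 " /proc/mounts
}

if ! is_mounted "$MOUNT_DIR"; then
    # Drop any stale FUSE handle before restarting (errors ignored)
    fusermount -uz "$MOUNT_DIR" 2>/dev/null
    echo "mount is down; restart node-gdrive-fuse here"
else
    echo "mount is up"
fi
```

Run it from cron, or wrap it in a `while :; do ...; sleep 30; done` loop to match the 30-second interval.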
Thanks for sharing your setup. This is the way I'm going. Just doing some preliminary tests with my test GDrive account (unencrypted); playback has been rock solid with node-gdrive-fuse.
Playback is consistent and always starts within 5-10 seconds. Fast forward/rewind work so much better than on ACD.
I think what I'm going to do is have a primary and backup GDrive with unencrypted content and keep a backup of my media on ACD encrypted (that's regularly synced). That way, if there's ever a DMCA or my accounts get shut down, I can just copy things back from the encrypted ACD.
This is what I do: I have a throwaway Gmail account with unlimited storage that is used for downloading and primary playback. It's completely unencrypted, and I think that's the way to go as far as performance goes. I have a personal edu account that I use as an encrypted backup, alongside a secondary (throwaway) Gmail account that houses a copy of the unencrypted media as well as the encrypted media. (Both the primary encrypted and unencrypted accounts share to this throwaway so I can utilize server-side copies.) My personal edu account copies the encrypted data over to ACD.
So in short, I utilize 4 different accounts and have 2 copies of my data unencrypted and 3 copies of the encrypted data.
Unencrypted Google Drive with an SSD is a dream though.
I did quite a bit of testing with encrypted vs. unencrypted (I also have an unencrypted GDrive) and performance-wise they were on par, no difference at all (besides rclone's CPU usage when using the crypt remote).
What was the CPU usage like on encrypted vs not?
Around 5% on my server, but since my last testing ncw has done quite a bit of optimization; I've seen much better RAM and CPU usage in the last few betas.
P.S. My unencrypted GDrive is in use now with Plex Cloud.
Oh, gotcha. I doubt I'll change my setup for the time being. I just remember that when I first started using rclone, the crypt mount seemed to use more resources than even encfs, so I opted to go without it so long as I've got some encrypted backups I can fall back on. I may have to do some testing and see if any of my users complain.
Plex Cloud via Google unencrypted has actually come a long way. Things run pretty smoothly now in my testing.
If rclone supported a Google implementation similar to node-gdrive-fuse, I'd be more inclined, but unless you're building crazy scripts to update your library and have to do a special dance just to avoid being banned, it's not really something I want to get involved with.
My ideal world would be to use Google Drive encrypted with rclone and Plex without the fear of being banned. Since that's not really feasible now unless you're willing to put in a bunch of hack-job scripts (which you have to support, maintain, ensure they don't break, etc.), you're pretty much forced to use Google + node-gdrive-fuse, unencrypted (or encrypted). This is really the fastest way to vanilla Plex + Google.
I completely agree! It's like having unlimited local storage. The performance is unbelievable.
Yeah, it's working great for me too, but I've never tested it with tons of concurrent streams.