Google Drive vs. ACD for Plex

If rclone supported a Google implementation similar to node-gdrive-fuse, I’d be more inclined to use it, but as it stands you have to build crazy scripts to update your library and do a special dance just to avoid being banned, and that’s not really something I want to get involved with.

My ideal world would be to use Google Drive encrypted with rclone and Plex without the fear of being banned. Since that’s not really feasible right now unless you’re willing to put in a bunch of hack-job scripts (which you have to support, maintain, make sure don’t break, etc.), you’re pretty much forced to use Google + node-gdrive-fuse, encrypted or not. That’s really the fastest way to a vanilla Plex + Google setup.
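For reference, a minimal sketch of what that “Google Drive encrypted with rclone” setup looks like, assuming remotes named “gdrive” and “gdrive-crypt” have already been created with `rclone config` (a “drive” remote, then a “crypt” remote layered on top of it); the remote names and mount point are placeholders:

```bash
# Assumed remotes (placeholders): "gdrive" (type = drive) and "gdrive-crypt"
# (type = crypt, pointing at gdrive:media with standard filename encryption),
# both created interactively:
rclone config

# Mount the encrypted remote read-only for Plex to scan:
rclone mount gdrive-crypt: /mnt/media --read-only --allow-other &
```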

I completely agree! It’s like having unlimited local storage. The performance is unbelievable.

Yea, it’s working great for me too, but I’ve never tested it with tons of concurrent streams.

I think direct play shouldn’t be an issue. It was written somewhere that 3 transcode sessions was all they were going to support.

Seems like it wouldn’t be a super ideal solution for people with larger user bases then, right? I think something like Plex Cloud would be perfect for smaller libraries/single-household use, which is probably why Google limits the number of transcodes. Wouldn’t that be a thing, unlimited storage alongside unlimited transcoding; what a sight that would be…

I think the way Plex Cloud works is that Plex (the company) has hosted servers, most likely on EC2, that provide the computing (transcode) power, and then you hook your cloud library into that. The 3-transcode limit was something they said they were going to implement when it comes out of beta; right now there is no limit while it’s still in beta.

With that said, there appear to be many Plex Pass users who still haven’t gotten an invite, and there is wild speculation that Plex Cloud is dead in the water.

What I think the future should be is a way to hook into the transcoding engine that Google and Amazon already use for videos. When you upload videos to Google Drive, Google uses its YouTube engine behind the scenes to process and convert your video into MP4 and FLV formats at various resolutions. This is why, when you go into Google Drive and hit ‘preview’, you are able to play the video in your browser, on your phone, etc. All of these formats are H.264 and AAC encoded.

I submitted a feature request post on this forum where we are talking about trying to get rclone to parse these files and show them in the mount, or provide a way to download them via a command.

I believe Plex Cloud is actually using Azure. It makes API-specific calls to the different cloud providers to do what it needs to do; it doesn’t “mount” anything. During a direct play the client is actually talking directly to the cloud provider and not going through the Plex Cloud server to play the item.

There is a feature request to have that same logic implemented on normal Plex servers, so users can run their own server yet talk to cloud providers directly via API.

The key thing in all this is encryption, which none of the above supports.

I use Plex Cloud as a backup.
My main server runs through Nginx, and if my mount drops (only when Amazon bans me), users are automatically redirected to Plex Cloud.
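Roughly, that kind of failover can be scripted as a mount watchdog run from cron; the sketch below is generic (not the exact setup described above), and the paths, nginx site names, and health check are all placeholders:

```bash
#!/bin/bash
# Placeholder sketch: check whether the ACD mount is alive and, if not,
# switch nginx over to a config that redirects users to Plex Cloud.

MOUNT=/mnt/acd
SITE=/etc/nginx/sites-enabled/plex

if mountpoint -q "$MOUNT" && ls "$MOUNT" > /dev/null 2>&1; then
    # Mount looks healthy: serve the main Plex server.
    ln -sf /etc/nginx/sites-available/plex-main "$SITE"
else
    # Mount is down: serve a config that redirects to Plex Cloud.
    ln -sf /etc/nginx/sites-available/plex-cloud-redirect "$SITE"
fi

nginx -s reload
```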

My sync scripts run a Plex library scan after the sync is complete, so most of the content is there within 20 to 30 minutes of it first being available on the main server through ACD.
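As a rough illustration of that sync-then-scan pattern (just a sketch: the remote name, paths, library section ID, and Plex token below are placeholders, and for a Plex Cloud server the address would differ):

```bash
#!/bin/bash
# Placeholder sketch: push new unencrypted content to the disposable GDrive,
# then trigger a library scan over the Plex HTTP API.

rclone sync /local/plexcloud-media gdrive-disposable:media --bwlimit 8M

# Kick off a scan of library section 1 once the sync finishes:
curl -s "http://localhost:32400/library/sections/1/refresh?X-Plex-Token=YOUR_TOKEN"
```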

And yea, I agree having unencrypted content is a risk, but I don’t really mind the 10€ disposable GDrive account I’m using for it. I keep my main library encrypted (crypt) on ACD and my business GDrive, and just sync unencrypted content for Plex Cloud usage to the disposable one.

The only thing is that if the disposable one gets removed, it will take around 6 days at the moment to re-sync my 46TB library.

P.S. I’m using gsuites.org; if you guys have another good provider (i.e. not those scammers where the account works for a month and then gets removed), let me know. (I’m considering syncing everything to an additional disposable GDrive account from some other seller, since once a seller gets banned all of his accounts get deleted.)

Set up a domain for yourself. Register it with Google for their business plan of $10/month per user at https://gsuite.google.com/, then start uploading. They technically have a limit of 1TB per user, but users over at /r/datahoarder are saying that Google doesn’t actually enforce that limit.

I already have the “official” one; the disposable ones are a one-time payment of 10–15€, e.g. gsuites.org, since I know a few people who have been using it for months.

Those disposable ones come from EDU (and some organization) accounts, as they don’t pay anything monthly for them.

Interesting. How do you deal with Plex’s friends/invite system using a proxy? Curious as to how you made that work.

I used this as a base config: https://github.com/toomuchio/plex-nginx-reverseproxy/blob/master/nginx.conf

@Ajki
I used something similar to that with Cloudflare. I would occasionally run into problems where my server was no longer accessible remotely; did you ever run into anything like that?

It seems like blocking 32400 and running Plex exclusively on 443 was causing me issues; I’d either have to restart nginx or just ‘Enable Remote Access’ and then disable it within Plex to get everything working normally again.

Hello everyone, I am the owner of gsuites.org.

If you have any questions about the unlimited Google Drive service we provide, please don’t hesitate to ask. :slight_smile:

Our website is www.gsuites.org; you can find our contact info there.

What do you do about the download cache directory? My understanding is that there is a bug where it won’t delete outdated cache chunks when delete_files() is called. It takes care of the database, but the files still stay in the directory, which could fill up your disk. Do you have this problem?

See this post as a reference:

Apparently the mod time issue is fixed.

The cache not being deleted is a problem, but a simple cron job to remove data from the cache that is older than a few hours should fix it. Pretty simple.
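For example, a crontab entry along these lines (the cache path is a placeholder for wherever your download cache actually lives):

```bash
# Every hour, delete cache chunks older than ~3 hours.
0 * * * * find /path/to/download-cache -type f -mmin +180 -delete
```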

The other issue mentioned I haven’t seen… yet.

Thanks for the great service

@jmacul2

Thanks for this post. I hadn’t seen that specific information before, but it’s nice to see that some of the issues I’ve run into are shared by others; misery loves company and all that.

As for the issues listed, I’ve never had problems with mod time at all. I didn’t do anything specific to prevent it, either. I’m on Ubuntu 16.04 but also have a server running 14.04 and haven’t had problems there either.

I’ve also never had any issues with the cache filling up and not being deleted. I have two servers; one, running 16.04, is my main Plex server. It has a 450GB SSD for storage and that is it. I have downloading set up on it, but its only purpose is to download currently airing TV shows so they’re available immediately. When something is imported via Sonarr, it auto-updates my Plex libraries and then runs a script that checks disk usage. If disk usage is at 75% or greater, downloading is paused and media is uploaded to GDrive with a conservative bwlimit.

The goal of my primary Plex server is Plex performance above all else, so I really want to limit the downloading on it, not only because of bandwidth concerns but also because I want to limit disk I/O, seeing as I only have one HD. My secondary server deals with my backlog of movies that need to be added to Plex, as well as downloading new series. The only downside to this is that I have to go in and manually add continuing series to my Plex server once I’m ready to download individual episodes as they air.
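For illustration only, a loose sketch of that kind of post-import disk-usage check (this is not the actual script described above; the paths, the 75% threshold, and the pause step are placeholders):

```bash
#!/bin/bash
# Placeholder sketch: if the storage disk is getting full, stop feeding it
# and push completed media up to GDrive with a conservative bandwidth cap.

USAGE=$(df --output=pcent /storage | tail -1 | tr -d ' %')

if [ "$USAGE" -ge 75 ]; then
    # Pause the download client here (client-specific, left as a stub).

    # Upload media that has sat locally for at least 15 minutes:
    rclone move /storage/media gdrive:media --bwlimit 4M --min-age 15m
fi
```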

However, after seeing the link you posted I’m going to have to move the cache over to RAM. That’s actually a pretty awesome idea and should help limit disk I/O even more; really appreciate the link, even though you didn’t mean to inspire me, haha.
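For anyone wanting to try the same thing, moving the cache to RAM is just a tmpfs mount; the size and mount point below are placeholders:

```bash
# Mount a RAM-backed tmpfs over the download cache directory.
sudo mount -t tmpfs -o size=512M tmpfs /path/to/download-cache

# Example /etc/fstab line to make it survive reboots:
# tmpfs  /path/to/download-cache  tmpfs  size=512M  0  0
```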

Honestly, the only issue I’ve had is with automatic download handling in Sonarr/Radarr, which I see is mentioned in that post as well.

I noticed that Sonarr would do perfectly fine importing episodes if the folder structure didn’t already exist, so new series were fine, and I end up keeping empty folders for my TV shows as I have no real need to clean them up. It’s only when the folder structure doesn’t exist locally and is only visible through node-gdrive-fuse that you run into a problem. So all I did was copy my drive’s folder structure and mirror it into my download directory. Once mirrored, you can just fuse them together and everything should just work (so far I haven’t been doing this for long).
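A rough sketch of that mirror-then-union approach, assuming the GDrive mount is at /mnt/gdrive and the download directory is /downloads (both placeholders), and using unionfs-fuse here as one possible union layer (the post doesn’t name the exact tool):

```bash
#!/bin/bash
# Placeholder sketch: mirror remote show/season folders locally, then union
# the two trees so Sonarr always sees the existing folder structure.

# Mirror the remote folder structure locally (directories only, no files):
cd /mnt/gdrive/TV && find . -type d -exec mkdir -p /downloads/TV/{} \;

# Local download dir (read-write) layered over the GDrive mount (read-only):
unionfs-fuse -o cow /downloads/TV=RW:/mnt/gdrive/TV=RO /mnt/union/TV
```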

As for Radarr, I’ve found that ignoring deleted files and just setting Radarr’s import directory to my download directory (as opposed to a union mount of Google Drive and the download directory, like I do for TV shows) works fine for now. I recently started re-downloading my movie library, so it’s grabbing everything and immediately uploading it. This is pretty low priority because I already have the majority of the stuff I’m downloading, just in a lower quality, so once my new library is at a point where I can make it accessible to Plex, I’ll probably just copy the setup I have with Sonarr.

Anyway, apologies for the wall of text. Long story short: I acknowledge the issues node-gdrive-fuse has, but so far it’s worked the best for my needs. I’d prefer to use rclone since it’s an actively developed project, comes with great documentation, and is hugely useful outside of just acting as a mount; but until there’s some sort of database or other measure in place to prevent the bans, it’s just not feasible for my current setup, sadly.

Edit: I forgot to mention, my cache size is set really, really small; I think it’s at 20MB or so. The default is 700+GB, I’m pretty sure, which caused me a few headaches when I first set it up since my HD is about half that size.

Is there an easy way to sync ACD to GDrive using rclone (i.e. a direct transfer between ACD and GDrive without having to re-download everything to my server first)?

Not with rclone. You can check out multicloud, cloudhq, or offcloud, but copying from one remote to a different remote with rclone requires a re-download.

I’d suggest a cheap VPS with unlimited bandwidth and syncing that way.
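For example, running something like this on the VPS itself keeps the download/re-upload traffic off your home connection (the remote names are placeholders for your configured ACD and GDrive remotes):

```bash
# Copy everything from the ACD remote to the GDrive remote; the data still
# passes through the machine running rclone, which is why it runs on the VPS.
rclone copy acd: gdrive: --transfers 8 --checkers 16 -v
```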