Google Drive vs. ACD for Plex


#128

I already have the “official” one. The disposable ones are a one-time payment of 10-15€, e.g. gsuites.org; I know a few people who have been using them for months.

Those disposable ones come from EDU (and some organization) accounts, since they don’t pay anything monthly for them.


#129

Interesting. How do you deal with Plex’s friends/invite system using a proxy? Curious as to how you made that work.


#130

I used this as a base config: https://github.com/toomuchio/plex-nginx-reverseproxy/blob/master/nginx.conf


#131

@Ajki
I used something similar to that with Cloudflare. I would occasionally run into problems where my server was no longer accessible remotely; did you ever run into anything like that?

It seems like blocking 32400 and running Plex exclusively on 443 was causing me issues; I’d either have to restart nginx, or just hit ‘Enable Remote Access’ and then disable it again within Plex, to get everything working normally again.
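By “blocking 32400” I just mean firewalling Plex’s default port from the outside so remote clients have to come in through nginx on 443; for example with ufw (loopback stays open, so nginx can still proxy to 127.0.0.1:32400):

sudo ufw deny 32400/tcp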


#132

Hello everyone, I am the owner of gsuites.org.

If you have any questions about the unlimited Google Drive service we provide, please don’t hesitate to ask. :slight_smile:

Our website is www.gsuites.org; you can find our contact info there.


#133

What do you do about the download cache directory? My understanding is that there is a bug where it won’t delete outdated cache chunks when delete_files() is called. It takes care of the database, but the files still stay in the directory, which could fill up your disk. Do you have this problem?

See this post as a reference:


#134

The mod time issue is apparently fixed.

The cache not being deleted is a problem, but a simple cron job that removes cache data older than a few hours should fix it. Pretty simple.
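Something like this in a crontab would do it (the cache path here is just a placeholder; point it at wherever your cache actually lives):

# every hour, delete cache chunks that haven't been touched in over 3 hours
0 * * * * find /path/to/gdrive-cache -type f -mmin +180 -delete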

The other issue mentioned I haven’t seen… yet.


#135

Thanks for the great service.


#136

@jmacul2

Thanks for this post. I hadn’t seen that specific information before, but it’s nice to see that some of the issues I’ve run into are shared by others; misery loves company and all that.

As for the issues listed, I’ve never had problems with mod time at all. I didn’t do anything specific to prevent it, either. I’m on Ubuntu 16.04 but also have a server running 14.04 and haven’t had problems there either.

I’ve also never had any issues with the cache filling up and not being deleted. I have two servers; the one running 16.04 is my main Plex server. It has a 450GB SSD for storage and that is it. I have downloading set up on it, but its only purpose is to download currently airing TV shows so they’re available immediately. When something is imported via Sonarr, it auto-updates my Plex libraries and then runs a script that checks disk usage. If disk usage is at 75% or greater, downloading is paused and media is uploaded to GDrive with a conservative bwlimit. The goal of my primary Plex server is Plex performance above all else, so I really want to limit the downloading on it, not only because of bandwidth concerns but also to limit the amount of disk I/O, seeing as I only have one drive.

My secondary server deals with my backlog of movies that need to be added to Plex, as well as downloading new series. The only downside to this is I have to go in and manually add continuing series to my Plex server once I’m ready to download individual episodes as they air.
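The disk-usage check mentioned above is nothing fancy; a rough sketch of the idea (not my exact script, and the mount point, threshold, and bwlimit are just placeholders):

#!/bin/bash
# if the download disk is 75% full or more, offload finished media to GDrive
# with a conservative bandwidth cap
USAGE=$(df --output=pcent /mnt/downloads | tail -1 | tr -dc '0-9')
if [ "$USAGE" -ge 75 ]; then
    # pausing the download client goes here (depends on the client's API)
    rclone move /mnt/downloads/finished gdrive:Media --bwlimit 8M
fi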

However, after seeing the link you posted I’m going to have to move the cache over to RAM. That’s actually a pretty awesome idea and should help limit disk I/O even more; really appreciate the link, even though you didn’t mean to inspire me, haha.
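If anyone wants to do the same, mounting a small tmpfs over the cache directory is one way to do it; the size and path here are just examples:

# back the cache with 1 GB of RAM instead of the disk
sudo mount -t tmpfs -o size=1g tmpfs /path/to/gdrive-cache

An equivalent entry in /etc/fstab makes it survive reboots.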

Honestly, the only issue I’ve had is with automatic download handling with Sonarr/Radarr, which I see is mentioned in that post as well.

I noticed that Sonarr would do perfectly fine importing episodes if the folder structure didn’t already exist at all. So new series were fine, and I end up keeping empty folders for my TV shows since I have no real need to clean them up. It’s only when the folder structure exists on Google Drive (through node-gdrive-fuse) but not locally that you run into a problem. So all I did was copy my drive’s folder structure and mirror it in my download directory. Once mirrored, you can just fuse the two together and everything should just work (so far I haven’t been doing this long).
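Roughly, the mirroring and union steps look like this (the paths are placeholders, and unionfs-fuse is just one example of a union filesystem; mergerfs or similar works the same way):

# recreate the directory tree (folders only, no files) from the node-gdrive-fuse mount
cd /mnt/gdrive/TV && find . -type d -exec mkdir -p /mnt/downloads/TV/{} \;

# then union the local download dir (read-write) over the gdrive mount (read-only)
unionfs-fuse -o cow /mnt/downloads/TV=RW:/mnt/gdrive/TV=RO /mnt/union/TV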

As for Radarr, I’ve found ignoring deleted files and just setting Radarr’s import directory to my dl directory (as opposed to a union mount of Google Drive and the dl directory like I do for TV shows) works fine for now. I recently started re-downloading my movie library so it’s grabbing everything and immediately uploading it. This is pretty low priority because I already have the majority of the stuff I’m downloading, just in a lower quality, so once my new library is at a point where I can make it accessible to Plex I’ll probably just copy the setup I have with Sonarr.

Anyway, apologies for the wall of text. Long story short is that I acknowledge the issues node-gdrive-fuse has but so far it’s worked the best for my needs. I’d prefer to use rclone since it’s an actively developed project, comes with great documentation, and is hugely useful outside of just acting as a mount; but until there’s some sort of database or measure in place to prevent the bans it’s just not feasible for my current setup sadly.

Edit: I forgot to mention, my cache size is set really, really small. I think it’s at 20 MB or so. The default is 700+ GB, I’m pretty sure, which caused me a few headaches when I first set it up since my drive is about half that size.


#137

Is there an easy way to sync ACD to GDrive using rclone, i.e. a direct transfer between ACD and GDrive without having to re-download everything to my server first?


#138

Not with rclone. You can check out MultCloud, cloudHQ, or Offcloud, but going from one remote to a different remote with rclone still requires re-downloading the data.

I’d suggest a cheap VPS with unlimited bandwidth and syncing that way.


#139

I’m using DigitalOcean at 6 cents per hour. It’s running at 56 MB/s doing a direct acdcrypt-to-gcrypt copy. Once it’s done moving my 2 TB I will just delete the droplet.

rclone copy acdcrypt: gcrypt:

That’s it. I have a limit of 4 on there; I just can’t remember the flag for that.
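It’s probably rclone’s --transfers flag, which happens to default to 4 parallel transfers anyway, e.g.:

rclone copy acdcrypt: gcrypt: --transfers 4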

Edit*
14 hours later and 42 cents spent, I have uploaded all 1.6 TB from ACD to GDrive.


#140

You said gsuites.org has one-time-payment accounts. I checked out their site and it only lists 9€/month ones?
Looks like you have to go to eBay.


#141

Gsuites was offering accounts on eBay; not sure if they still are.


#142

Yup. DigitalOcean is a cheap way to do it.


#143

I just bought one off eBay from them.


#144

Yep. Just bought one :slight_smile:


#145

Can’t seem to find them on Google. Anyone able to provide a link, please?


#146

http://m.ebay.ca/itm/UNLIMITED-Google-Drive-LIFETIME-cloud-storage-account-from-100-super-admin-/142291146174?nav=SEARCH


#147

http://www.ebay.com/itm/UNLIMITED-Google-Drive-LIFETIME-cloud-storage-account-from-100-super-admin-/142291146174
Bought it a few hours ago and it gave a link where to submit my information. Haven’t gotten my username or password yet, though.