Thanks for this post. I hadn't seen that specific information before, but it's nice to see that some of the issues I've run into are shared by others; misery loves company and all that.
As for the issues listed, I’ve never had problems with mod time at all. I didn’t do anything specific to prevent it, either. I’m on Ubuntu 16.04 but also have a server running 14.04 and haven’t had problems there either.
I’ve also never had any issues with the cache filling up and not being deleted. I have two servers; the one running 16.04 is my main Plex server, and its only storage is a 450GB SSD. I have downloading set up on it, but its only purpose is to download currently airing TV shows so they’re available immediately. When an episode is imported via Sonarr, it auto-updates my Plex libraries and then runs a script that checks disk usage. If disk usage is at 75% or greater, downloading is paused and media is uploaded to GDrive with a conservative bwlimit. The goal of my primary Plex server is Plex performance above all else, so I really want to limit downloading on it, not only because of bandwidth concerns but also to limit disk I/O, seeing as I only have one drive. My secondary server deals with my backlog of movies that need to be added to Plex, as well as downloading new series. The only downside is that I have to go in and manually add continuing series to my primary server once I’m ready to download individual episodes as they air.
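For what it’s worth, the disk-check step only takes a few lines of shell. This is just a sketch of the idea, not my actual script; the media path, the SABnzbd pause call, the API key, and the 4M bwlimit are all placeholders you’d swap for your own setup:

```shell
#!/bin/sh
# Sketch of a post-import disk-usage check. Paths, the download-client
# pause call, and the bandwidth cap are placeholders, not a real setup.
MEDIA_DIR="${MEDIA_DIR:-/mnt/media}"
THRESHOLD=75

# df -P prints usage as e.g. "42%"; strip the % so we can compare numerically.
usage=$(df -P "$MEDIA_DIR" 2>/dev/null | awk 'NR==2 {gsub("%",""); print $5}')

if [ -n "$usage" ] && [ "$usage" -ge "$THRESHOLD" ]; then
    # Pause the download client (SABnzbd-style API shown, hypothetical key),
    # then move finished media to GDrive with a conservative bandwidth cap.
    curl -s "http://localhost:8080/sabnzbd/api?mode=pause&apikey=API_KEY" >/dev/null
    rclone move "$MEDIA_DIR" gdrive:media --bwlimit 4M
fi
```

Something like this can be hooked up as a Sonarr Connect custom script so it fires on import.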
However, after seeing the link you posted, I’m going to have to move the cache over to RAM. That’s actually a pretty awesome idea and should help limit disk I/O even more. Really appreciate the link, even if you didn’t mean to inspire me, haha.
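For anyone else wanting to try it, moving the cache to RAM is just a tmpfs mount; the path and the 512M size below are assumptions, so size it to whatever your cache limit is:

```shell
# Back the cache directory with RAM (tmpfs) instead of the SSD.
# Path and size are examples only.
sudo mkdir -p /mnt/gdrive-cache
sudo mount -t tmpfs -o size=512M tmpfs /mnt/gdrive-cache

# fstab entry to make it survive reboots:
# tmpfs  /mnt/gdrive-cache  tmpfs  size=512M  0  0
```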
Honestly the only issue I’ve had is with automatic download handling with Sonarr/Radarr which I see is mentioned in that post as well.
I noticed that Sonarr imports episodes perfectly fine if the folder structure doesn’t already exist. So new series were fine, and I end up keeping empty folders for my TV shows since I have no real need to clean them up. You only run into a problem when the folder structure exists solely through node-gdrive-fuse and not locally. So all I did was copy my drive’s folder structure and mirror it to my download directory. Once mirrored, you can just fuse the two together and everything should just work (so far, at least; I haven’t been doing this long).
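If anyone wants to replicate the mirroring, something along these lines should do it. The remote name "gdrive:", the local paths, and the union mount point are examples, not my exact layout:

```shell
# List every directory on the remote (directories only, recursively),
# then recreate each one under the local download directory.
rclone lsf --dirs-only -R "gdrive:TV Shows" | while IFS= read -r dir; do
    mkdir -p "/home/user/downloads/TV Shows/$dir"
done

# Union-mount the local dir (writable) over the remote mount (read-only)
# so Sonarr sees one combined tree. Example using unionfs-fuse:
unionfs-fuse -o cow "/home/user/downloads/TV Shows"=RW:"/mnt/gdrive/TV Shows"=RO /mnt/union/tv
```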
As for Radarr, I’ve found ignoring deleted files and just setting Radarr’s import directory to my dl directory (as opposed to a union mount of Google Drive and the dl directory like I do for TV shows) works fine for now. I recently started re-downloading my movie library so it’s grabbing everything and immediately uploading it. This is pretty low priority because I already have the majority of the stuff I’m downloading, just in a lower quality, so once my new library is at a point where I can make it accessible to Plex I’ll probably just copy the setup I have with Sonarr.
Anyway, apologies for the wall of text. Long story short is that I acknowledge the issues node-gdrive-fuse has but so far it’s worked the best for my needs. I’d prefer to use rclone since it’s an actively developed project, comes with great documentation, and is hugely useful outside of just acting as a mount; but until there’s some sort of database or measure in place to prevent the bans it’s just not feasible for my current setup sadly.
Edit: I forgot to mention, my cache size is set really, really small; I think it’s at 20MB or so. The default is 700+GB, I’m pretty sure, which caused me a few headaches when I first set it up since my HD is about half that size.