My Recommended Google Drive and Plex Mount Settings - Updated 6-Mar-2019


Here is my Systemd service file.
I have not been able to spot what caused the last scan. I guess it is related to a deep scan from either Plex, Radarr or Sonarr. I have no other services on my server.

sorg@asteroid:~$ cat /etc/systemd/system/gmedia-rclone.service
Description=RClone Service

ExecStart=/usr/bin/rclone mount CryptSansCache: /media/GD \
   --allow-other \
   --fast-list \
   --buffer-size 256M \
   --dir-cache-time 72h \
   --drive-chunk-size 32M \
   --log-level DEBUG \
   --syslog \
   --timeout 1h \
   --umask 002 \
   --vfs-read-chunk-size 128M \
   --vfs-read-chunk-size-limit off
ExecStop=/bin/fusermount -uz /media/GD
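The unit above is missing its section headers, which were probably lost when pasting. A minimal sketch of what the complete file might look like, assuming a standard mount unit (the [Unit]/[Install] contents, the User= line, and the network dependency are my assumptions, not from the original post):

```ini
[Unit]
Description=RClone Service
# Wait for the network before mounting (assumption; adjust to taste)
After=network-online.target
Wants=network-online.target

[Service]
# Run as your own user so the --umask 002 permissions line up (hypothetical user)
User=sorg
ExecStart=/usr/bin/rclone mount CryptSansCache: /media/GD \
   --allow-other \
   --buffer-size 256M \
   --dir-cache-time 72h \
   --drive-chunk-size 32M \
   --log-level DEBUG \
   --syslog \
   --timeout 1h \
   --umask 002 \
   --vfs-read-chunk-size 128M \
   --vfs-read-chunk-size-limit off
ExecStop=/bin/fusermount -uz /media/GD
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Note that --fast-list has no effect on a mount (it only applies to listing-heavy operations like rclone ls/sync), so it is omitted here.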



I have "Analyze video files" turned off in Sonarr/Radarr and deep analysis turned off in Plex.



For your amazing openvpn config, is there a way to achieve the same thing using ufw and iptables?

I could disable WAN access with ufw, but if I install your configs, the VPN connects and everything seems to be okay; however, the vpn user still has no connection.

I commented out these lines:

since I don’t want to drop all my other ufw rules. I also had to comment out this line since I don’t have a local network.

There I put my WAN IP since that’s the only one I have.

And here my GATEWAYIP is… Is that okay?

Thank you



I’m not as familiar with UFW but I would assume it’s possible.

The use case of the VPN config is:

  • have a user called ‘vpn’
  • have that user route all traffic out the VPN interface only

I don’t think it matters much if you have a local network or not as I just have a few local rules for blocking my LAN stuff and the Plex server itself from taking connections on 32400.

I had that one line to allow my box to talk to my router, which you commented out, but it can be removed as you noted.

I’m not sure though if UFW allows for user specific rules or not.



ufw builds iptables rules and it’s just an interface to make the config easier. Plus that’s the only supported firewall by some services I use.

I tried to copy everything you had, however I still have no internet on the vpn user… :thinking:



In my searching, I don’t think you can do a per user rule in UFW, which is why I recall having to use iptables instead.
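For reference, per-user egress rules in iptables use the owner match module, which UFW does not expose. A hedged sketch of the kind of rules involved (the 'vpn' user name comes from the thread; the interface names and the REJECT policy are assumptions, and the commands require root):

```shell
# Let the 'vpn' user send traffic only via the VPN tunnel and loopback;
# reject anything it tries to send out of any other interface.
# tun0 is the usual OpenVPN interface name (assumption).
iptables -A OUTPUT -o tun0 -m owner --uid-owner vpn -j ACCEPT
iptables -A OUTPUT -o lo   -m owner --uid-owner vpn -j ACCEPT
iptables -A OUTPUT         -m owner --uid-owner vpn -j REJECT
```

The owner match only works in the OUTPUT chain, since incoming packets have no local owning user.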



I kept this file as it is with some modifications (see #562):

And ufw just adds a bunch of rules however it still doesn’t seem to work.



Yeah as I’m saying, mixing in UFW won’t work.

My setup relies on iptables and being able to save and reload them.
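On Debian/Ubuntu, saving and reloading iptables rules across reboots is typically handled by the iptables-persistent package; a sketch, assuming the Debian/Ubuntu default paths:

```shell
# Persist the current ruleset so it survives reboots (Debian/Ubuntu)
sudo apt-get install iptables-persistent
sudo sh -c 'iptables-save > /etc/iptables/rules.v4'
# Reload later by hand (the netfilter-persistent service also does this at boot)
sudo sh -c 'iptables-restore < /etc/iptables/rules.v4'
```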



UFW is just an interface to modify iptables, so it should be possible. On startup it keeps the already existing rules and adds its own, but those are translated into iptables commands, so the result is the same. It’s just an easier way of managing common rules.

However, for some reason it still doesn’t work. I am now wondering what the issue can be.

Would you mind executing this? echo $(ifconfig tun0 | egrep -o '([0-9]{1,3}\.){3}[0-9]{1,3}' | egrep -v '255|(127\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3})' | tail -n1)

I took it from the linked thread; it outputs the address correctly for me.
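To illustrate what that pipeline does, here it is run on a sample ifconfig-style line (the addresses are made up): the first egrep pulls out every IPv4-looking token, the second drops netmasks (anything containing "255") and loopback addresses, and tail keeps the last survivor.

```shell
# Hypothetical tun0 ifconfig output
sample='inet 10.8.0.6  netmask 255.255.255.0  destination 10.8.0.5'
echo "$sample" \
  | egrep -o '([0-9]{1,3}\.){3}[0-9]{1,3}' \
  | egrep -v '255|(127\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3})' \
  | tail -n1
# prints 10.8.0.5 (the last match left after filtering)
```

Note that on a line like this, with both a local and a destination address, the pipeline returns the last one, i.e. the peer address.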



Oh yes, it worked! :tada:

That was my bad, since I had initially chosen to drop the packets in a different way.



Depending on your OS, ifconfig might not be there and need to be installed.



I’m running Plex on Ubuntu 16.04.5 LTS in a VM running under VMware Fusion on an i7 Hackintosh running High Sierra. I’m mounting my Google Drive using rclone.

After reading through this thread & other similar ones I just did some tweaking to the rclone mount parameters, which has improved performance enormously. Previously it could take 20-30 seconds for a file to start playing with both Plex & Emby; with my new mount command it now takes 3-5 seconds. I’m not sure if all the parameters are necessary, or whether there are any others that would improve performance even more, but there has been such a dramatic improvement that I thought it worth sharing. Here is the rclone.service script that I am now using:-

Description=Mount and cache Google drive to /mnt/Gdrive
ExecStartPre=-/bin/mkdir /mnt/Gdrive

ExecStart=/usr/sbin/rclone mount \
   GD-GSuite:PLEX /mnt/Gdrive \
   --config /home/nigel/.rclone.conf \
   --umask 002 \
   --dir-cache-time 96h \
   --drive-chunk-size 32M \
   --buffer-size 256M \
   --timeout 1h \
   --vfs-read-chunk-size-limit off

ExecStop=/bin/fusermount -u -z /mnt/Gdrive
ExecStop=/bin/rmdir /mnt/Gdrive



That seems almost identical to what I use.

Do you have your client ID/API key setup?



I think that I probably copied most of the code from your rclone.service on GitHub.

Previously I had, amongst other parameters:


So clearly buffer-size & vfs-read-chunk-size are now much larger.

I’m running Plex in an Ubuntu VM under VMware Fusion on an i7 Hackintosh running High Sierra. 4GB of RAM is allocated to the VM out of the total 16GB of RAM available to the system. I don’t really use the system for anything other than Plex & occasionally Emby. It’s only serving to local clients either an AppleTV 4K or an Amazon FireTV 4K but only ever one client at a time.

I’m really pleased with this performance breakthrough. While I’ve always been pretty happy with mounting Google Drive on my local system, I found the 20-30 second delay before a file would start playing irksome, especially as it was often accompanied by freezing & stuttering for another 20-30 seconds before the file would then carry on playing smoothly. Basically I must have had it misconfigured all along & now it’s performing much closer to optimum, but I suspect that I could allocate more RAM to the VM & tweak the mount command to give rclone a lot more buffers or cache & improve performance even more. Does anyone have any suggestions?



No I don’t use my own key. I must get round to it sometime.



I would recommend doing that ASAP, as that’s the biggest performance improvement you can give yourself.
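For context, once you have created your own client ID and secret in the Google API Console, they go into your rclone remote’s configuration. A hedged sketch of what the relevant entry in ~/.rclone.conf (or ~/.config/rclone/rclone.conf) looks like — the remote name matches the one in this thread, but all credential values shown are placeholders:

```ini
[GD-GSuite]
type = drive
# Your own OAuth client from the Google API Console (placeholder values)
client_id = 123456789.apps.googleusercontent.com
client_secret = your-client-secret
scope = drive
token = {"access_token":"...","expiry":"..."}
```

Running rclone config and editing the existing remote will prompt for client_id and client_secret and write these fields for you.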



I just looked at the rclone docs page describing how to set this up & it looks like Google, as usual, have been changing their webpages & interface. What I see bears little or no relation to what is described in the rclone docs, so I cannot figure out how to create my own ID for rclone.

Can anyone provide a step by step guide to the current method of setting this up?



Can you confirm these look to be good @nigelbb and I’ll send a pull request to update that part of the page.

1. Log into the Google API Console with your Google
account. It doesn't matter what Google account you use. (It need not
be the same account as the Google Drive you want to access.)

2. Select a project or create a new project.

3. Under "ENABLE APIS AND SERVICES" search for "Drive", and enable the
"Google Drive API".

4. Click "APIs & Services -> Credentials" in the left-side panel.

5. Click "Create Credentials" and select "OAuth client ID". It will prompt you to set the OAuth consent screen product name, if you haven't set one already.

6. Choose an application type of "Other", and click "Create"; any name such as "rclone" is fine.

7. It will show you a client ID and client secret. Use these values
in rclone config to add a new remote or edit an existing remote.


Sorry, I just walked through this & almost none of your instructions are correct (at least for my account when I log in). It’s kind of similar, but different enough that your instructions are impossible to follow, although I think that I have succeeded purely by accident & would hate to have to try & repeat the exercise.

The word project does not appear anywhere on the page.

There is however a tab “ENABLE APIS AND SERVICES” & if I click on that I can enable the “Google Drive API”

“APIs & Service -> Credentials” is not present in the left-side panel.

If I click on a blue button labelled MANAGE I then find “Credentials” as the last item in the list in the left-side panel.

If I click on “Credentials” I get

A button labelled “+ CREATE CREDENTIAL” and the text:

“Credentials compatible with this API

To view all credentials or create new credentials visit Credentials in APIs & Services”

I did luck out & eventually arrived at a screen where I found displayed my Client ID & Client secret but have no idea how I got there or any way of retracing my steps to find it again. Also it displayed this message along with ID & secret:-

OAuth is limited to 100 sensitive scope logins until the OAuth consent screen is published. This may require a verification process that can take several days.

Why do Google make it so mind-blowingly complex to access their applications? The UI is just awful, & why do they keep changing it, so that when you want to repeat the operation you did last month you can’t because the UI has changed? I am an experienced developer & system administrator, but this stuff gets me tied in knots.

BTW thanks for your help. My complaints are with Google, not you :grinning:



So did you click on the very first link?

If you have a project already it auto selects that and you get this screen:

That’s step 1 and maybe step 2. You get the screen above if you already have a project created.

Step 3 is the box at top that says ENABLE APIS and SERVICES on the screenshot.

Step 4 brings you to the API Library and you pick google drive from there and enable it:

Once enabled, you hit the left side menu to bring it up and get this screen:

and you get this screen for Step 5/6

Can you please share some screenshots of what yours look like as I can follow these verbatim?
