Buffering / Pausing when streaming from Plex off Gcache/Gcrypt

I have a setup that I am almost comfortable with (encrypted gdrive > cache > crypt); however, when playing content back I experience frequent buffering and playback issues.

System Information

  • CPU: AMD Ryzen 7 1700 (4 cores allocated)
  • RAM: 4 GB (can increase if necessary)
  • DISK: 112 GB SSD (37 GB used, 71 GB free)
  • OS: Ubuntu 18.04.2
  • NET: 1 Gbps down, 35 Mbps up - Comcast Xfinity

When I look at the log file with debug turned on I see the following:

2019/04/30 18:41:33 ERROR : TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4: ReadFileHandle.Read error: unexpected EOF
2019/04/30 18:41:33 DEBUG : &{TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4 (r)}: >Read: read=0, err=unexpected EOF
2019/04/30 18:41:33 DEBUG : &{TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4 (r)}: Flush:
2019/04/30 18:41:33 DEBUG : &{TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4 (r)}: >Flush: err=<nil>
2019/04/30 18:41:33 DEBUG : &{TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4 (r)}: Release:
2019/04/30 18:41:33 DEBUG : TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4: ReadFileHandle.Release closing
2019/04/30 18:41:33 ERROR : TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4: ReadFileHandle.Release error: file already closed
2019/04/30 18:41:33 DEBUG : &{TV Shows/MasterClass- Gordon Ramsay Teaches Cooking/Season 1/MasterClass- Gordon Ramsay Teaches Cooking - S01E10 - Mastering Ingredients- Fish & Shellfish.mp4 (r)}: >Release: err=file already closed
2019/04/30 18:41:35 DEBUG : pacer: Rate limited, sleeping for 8.248286384s (4 consecutive low level retries)
2019/04/30 18:41:35 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2019/04/30 18:41:40 DEBUG : pacer: Resetting sleep to minimum 100ms on success
2019/04/30 18:41:40 DEBUG : 22nsspm63sdhbthmkerda60n1g/s6vk1li5cn6s6gggfi9cabtn5jpu2vm1rredlmnlhpchh7c7cu758m5its77bgmh015312r9sejv0: list: read 0 from source
2019/04/30 18:41:40 DEBUG : 22nsspm63sdhbthmkerda60n1g/s6vk1li5cn6s6gggfi9cabtn5jpu2vm1rredlmnlhpchh7c7cu758m5its77bgmh015312r9sejv0: list: source entries: []
2019/04/30 18:41:40 DEBUG : 22nsspm63sdhbthmkerda60n1g/s6vk1li5cn6s6gggfi9cabtn5jpu2vm1rredlmnlhpchh7c7cu758m5its77bgmh015312r9sejv0: list: cached directories: 0
2019/04/30 18:41:40 DEBUG : 22nsspm63sdhbthmkerda60n1g/s6vk1li5cn6s6gggfi9cabtn5jpu2vm1rredlmnlhpchh7c7cu758m5its77bgmh015312r9sejv0: list: cached dir: '22nsspm63sdhbthmkerda60n1g/s6vk1li5cn6s6gggfi9cabtn5jpu2vm1rredlmnlhpchh7c7cu758m5its77bgmh015312r9sejv0', cache ts: 2019-04-30 18:41:40.57403344 -0700 PDT m=+49.135994873

I have taken animosity22’s homescripts repo and adapted it slightly. First, I changed the folder path from /GD to /mnt/gdrive, and changed the config file to meet my needs (team drive, encryption, etc.). I also noticed that some changes had been made to the scripts upstream, so I have incorporated those where appropriate.

gmedia.service

[Unit]
Description=gdrive
After=network-online.target
Wants=network-online.target

[Service]
# The dummy program will exit
Type=oneshot
# Execute a dummy program
ExecStart=/bin/true
# This service shall be considered active after start
RemainAfterExit=yes

[Install]
# Components of this application should be started at boot time
WantedBy=multi-user.target

mnt-gdrive.mount

[Unit]
Description = /gdrive MergerFS mount
PartOf=gmedia.service
After=gmedia-rclone.service
RequiresMountsFor=/data

[Mount]
What = /gdrive
Where = /mnt/
Type = fuse.mergerfs
Options = defaults,sync_read,auto_cache,use_ino,allow_other,func.getattr=newest,category.action=all,category.create=ff

[Install]
WantedBy=gmedia.service

gmedia-rclone.service

[Unit]
Description=RClone Service
PartOf=gmedia.service

[Service]
Type=notify
Environment=RCLONE_CONFIG=/opt/rclone/rclone.conf

ExecStart=/usr/bin/rclone mount gmedia: /mnt/gdrive \
--allow-other \
--dir-cache-time 192h \
--drive-chunk-size 32M \
--log-level DEBUG \
--log-file /var/log/rclone.log \
--timeout 3h \
--umask 002 \
--rc

ExecStop=/bin/fusermount -u /mnt/gdrive
Restart=on-failure
User=user
Group=user

[Install]
WantedBy=gmedia.service

gmedia-find.service

[Unit]
Description=gmedia find
PartOf=gmedia.service
After=mnt-gdrive.mount

[Service]
Type=simple

ExecStart=/usr/bin/rclone rc vfs/refresh recursive=true
RemainAfterExit=yes
User=user
Group=user

[Install]
# Components of this application should be started at boot time
WantedBy=gmedia.service

rclone.conf

[gdrive]
type = drive
scope = drive
token = OMIT
#service_account_file = /opt/rclone/service-account.json
team_drive = OMIT


[gdrivegames]
type = drive
scope = drive
token = OMIT
#service_account_file = /opt/rclone/service-account.json
team_drive = OMIT

[gdrivesoftware]
type = drive
scope = drive
token = OMIT
#service_account_file = /opt/rclone/service-account.json
team_drive = OMIT

[gdrivemedia]
type = drive
scope = drive
token = OMIT
#service_account_file = /opt/rclone/service-account.json
team_drive = OMIT

[gcache]
type = cache
remote = gdrive:media
plex_url = https://plex.localnet:32400
plex_username = OMIT
plex_password = OMIT
chunk_size = 1G
chunk_total_size = 32G
info_age = 2d

[gmedia]
type = crypt
remote = gcache:
filename_encryption = standard
directory_name_encryption = true
password = OMIT
password2 = OMIT

I am not sure how to progress from here, or what I can do to prevent the pausing issues.

I also have some other issues, like mnt-gdrive.mount not wanting to cooperate due to its file name, failing with an Invalid Argument error:

Apr 30 18:40:46 plex kernel: [    3.715011] systemd[1]: mnt-gdrive.mount: Where= setting doesn't match unit name. Refusing.
Apr 30 18:40:46 plex kernel: [    3.740317] systemd[1]: mnt-gdrive.mount: Cannot add dependency job, ignoring: Unit mnt-gdrive.mount is not loaded properly: Invalid argument.
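For what it’s worth, systemd requires a mount unit’s file name to be the escaped form of its Where= path, so a unit named mnt-gdrive.mount must have Where=/mnt/gdrive; the unit above has Where=/mnt/, which is why systemd refuses to load it. A corrected [Mount] section might look like this (a sketch; What= is kept as posted, though for mergerfs it would normally be the colon-separated list of branch paths):

```ini
# mnt-gdrive.mount: file name must match the escaped Where= path
[Mount]
What = /gdrive
Where = /mnt/gdrive
Type = fuse.mergerfs
Options = defaults,sync_read,auto_cache,use_ino,allow_other,func.getattr=newest,category.action=all,category.create=ff
```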

But I would love any assistance anyone could provide.

I don’t use the cache backend, but a 1G cache chunk means it has to download a full 1 GB each time it reads anything. I would set that to something reasonable like 32M.

You are also getting 403 rate limits; I’m guessing you haven’t set up your own client ID/API key:

https://rclone.org/drive/#making-your-own-client-id

I’d also remove the plex integration as that just slows things down.

Let’s start there: fix the chunk size, get your own key set up, and see how that works.
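Applied to the [gcache] remote from the config posted above, those suggestions (smaller chunk size, Plex integration removed) would look something like this sketch:

```ini
[gcache]
type = cache
remote = gdrive:media
# 32M chunks instead of 1G, so a read only fetches 32 MB at a time
chunk_size = 32M
chunk_total_size = 32G
info_age = 2d
# plex_url / plex_username / plex_password lines removed
```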

Nevermind, I see the mistake I made there. I had “user” listed instead of my actual username.

When I add the client_id and client secret (getting rid of the service_account_file), I now see the following in rclone.log:

2019/04/30 21:42:56 INFO  : plex: stopped Plex watcher
2019/04/30 21:42:56 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/04/30 21:42:57 INFO  : gcache: Cache DB path: /home/user/.cache/rclone/cache-backend/gcache.db
2019/04/30 21:42:57 INFO  : gcache: Cache chunk path: /home/user/.cache/rclone/cache-backend/gcache
2019/04/30 21:42:57 INFO  : gcache: Chunk Memory: true
2019/04/30 21:42:57 INFO  : gcache: Chunk Size: 16M
2019/04/30 21:42:57 INFO  : gcache: Chunk Total Size: 32G
2019/04/30 21:42:57 INFO  : gcache: Chunk Clean Interval: 1m0s
2019/04/30 21:42:57 INFO  : gcache: Workers: 4
2019/04/30 21:42:57 INFO  : gcache: File Age: 2d
2019/04/30 21:42:57 INFO  : gcache: Cache DB path: /home/user/.cache/rclone/cache-backend/gcache.db
2019/04/30 21:42:57 INFO  : gcache: Cache chunk path: /home/user/.cache/rclone/cache-backend/gcache
2019/04/30 21:42:57 INFO  : gcache: Chunk Memory: true
2019/04/30 21:42:57 INFO  : gcache: Chunk Size: 16M
2019/04/30 21:42:57 INFO  : gcache: Chunk Total Size: 32G
2019/04/30 21:42:57 INFO  : gcache: Chunk Clean Interval: 1m0s
2019/04/30 21:42:57 INFO  : gcache: Workers: 4
2019/04/30 21:42:57 INFO  : gcache: File Age: 2d
2019/04/30 21:43:57 INFO  : Google drive root 'media': Change notify listener failure: googleapi: Error 403: The attempted action requires shared drive membership., teamDriveMembershipRequired
2019/04/30 21:43:57 INFO  : Google drive root 'media': Change notify listener failure: googleapi: Error 403: The attempted action requires shared drive membership., teamDriveMembershipRequired
2019/04/30 21:44:01 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:01 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:54 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:54 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:54 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:54 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:55 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:56 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:57 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:57 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:57 ERROR : 87hkgcvbcmaghht2oiaj8va3t8/qa0mi5rhdj2p7vd60afrf8etek/4n8l1g3pdlnps563tocir52g1c/mnr8dl15nbqi0drk7vvfde2lb89u430jqkc96vaoci38ee5r1cl0: error refreshing object in : in cache fs Google drive root 'media': object not found
2019/04/30 21:44:57 INFO  : Google drive root 'media': Change notify listener failure: googleapi: Error 403: The attempted action requires shared drive membership., teamDriveMembershipRequired

And I do have the clientID under: Manage API client access in Google Admin > Security > Advanced Settings > Manage Authentication

If I go and try to refresh the token associated with the rclone project I receive the following error:

If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offline
Log in and authorize rclone for access
Enter verification code> 
2019/04/30 22:05:20 Failed to configure token: failed to get token: oauth2: cannot fetch token: 401 Unauthorized
Response: {
  "error": "invalid_client",
  "error_description": "Unauthorized"
}

I think this is where these instructions are confusing to be honest:

  1. Log into the Google API Console with your Google account. It doesn’t matter what Google account you use. (It need not be the same account as the Google Drive you want to access)
  2. Select a project or create a new project.
  3. Under “ENABLE APIS AND SERVICES” search for “Drive”, and then enable the “Google Drive API”.
  4. Click “Credentials” in the left-side panel (not “Create credentials”, which opens the wizard), then “Create credentials”, then “OAuth client ID”. It will prompt you to set the OAuth consent screen product name, if you haven’t set one already.
  5. Choose an application type of “other”, and click “Create”. (the default name is fine)
  6. It will show you a client ID and client secret. Use these values in rclone config to add a new remote or edit an existing remote.
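For reference, the two values end up in the remote’s section of rclone.conf under the key names rclone expects, client_id and client_secret (a sketch with placeholder values; the exact key names matter):

```ini
[gdrive]
type = drive
scope = drive
# placeholder values - use the ID/secret shown in the Google API Console
client_id = 1234567890-example.apps.googleusercontent.com
client_secret = EXAMPLE-SECRET
team_drive = OMIT
token = OMIT
```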

If these are the instructions above:

  1. Sure, easy - I already have a project created (in this example it is called TEST)
  2. See above, TEST
  3. Enabled
  4. This is where it can be confusing, there are two “credentials locations”.
    [screenshot]

This is what happens if I immediately follow through from step three.

If I do what I think is being asked, you want me to get to this screen:

Which then prompts me to make an OAuth consent page (which I create with all Google Drive scopes, set to internal so I don’t have to wait for it to be published):

So now that the OAuth screen is done, I can create the other credential:
[screenshot]

Which then gets me the required credentials
[screenshot]

Which I put in the configuration file, follow the steps to re-auth my token, and get the error log above (And below)

If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offline
Log in and authorize rclone for access
Enter verification code> 
2019/04/30 22:05:20 Failed to configure token: failed to get token: oauth2: cannot fetch token: 401 Unauthorized
Response: {
  "error": "invalid_client",
  "error_description": "Unauthorized"
}

You need to run rclone config to add these in. If you add them in manually, you need to run rclone authorize.

That is exactly what I am doing in the above error log:

If I run through rclone config and edit my google drive instance:

--------------------
[gdrive]
type = drive
scope = drive
token = {"access_token":"omit}
client_id = omit
secret_key = omit
team_drive = omit
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
Remote config
Already have a token - refresh?
y) Yes
n) No
y/n> y
Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine
y) Yes
n) No
y/n> N
If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?access_type=offline
Log in and authorize rclone for access
Enter verification code> OMIT
2019/05/01 07:38:33 Failed to configure token: failed to get token: oauth2: cannot fetch token: 401 Unauthorized
Response: {
  "error": "invalid_client",
  "error_description": "Unauthorized"
}

If I run the rclone authorize command directly, it runs locally rather than offering headless functionality (since this machine is CLI/server based). I can’t seem to find a flag for headless mode.

$ rclone authorize drive --drive-acknowledge-abuse --drive-chunk-size 32M --drive-client-id OMIT --drive-client-secret OMIT  --drive-keep-revision-forever --drive-team-drive OMIT -n -P
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...

What happens when you punch in the URL into a browser and sign in with your account? You don’t get a code to paste in?

If I punch in the URL from the first preformatted text block, yes, I get the code and it comes back with what is shown.

I can’t do that with the second one, since it is only listening on the Plex server, which again has no UI - and that isn’t how ports work when listening on 127.0.0.1: only the local device (in this case the machine called plex, not any other machine) can reach a service bound to that address:

@plex:~$ sudo netstat -apn | grep -i 53682
tcp        0      0 127.0.0.1:53682         0.0.0.0:*               LISTEN      17619/rclone

ens160: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.2.8  netmask 255.255.255.0  broadcast 192.168.2.255
        ether 00:0c:29:d8:d5:9a  txqueuelen 1000  (Ethernet)
        RX packets 5922331  bytes 8306844949 (8.3 GB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 412172  bytes 92536380 (92.5 MB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

[screenshot]

I could try doing a proxy over PuTTY, but since the computer running this needs the data (in this case the Plex machine) and it is waiting to obtain a code automatically, I don’t think it will work.
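For what it’s worth, one common way to reach a loopback-only listener from another machine is an SSH local port forward, so a browser elsewhere can talk to the headless box’s 127.0.0.1:53682 (a sketch; the user and hostname are placeholders):

```shell
# Run on the machine that has a browser. Local port 53682 is
# forwarded to 127.0.0.1:53682 on the headless server where
# `rclone authorize` is listening:
ssh -L 53682:127.0.0.1:53682 user@plex.example
# Then browse to http://127.0.0.1:53682/auth on this machine.
```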

If you are not familiar with 127.0.0.1 vs 0.0.0.0 please read this.

When you select headless, all you need is a machine connected to the Internet (like the one you are posting to the forums from): copy and paste the whole URL into your browser, then copy and paste the code back in.

You don’t need to do anything other than that.

And that’s exactly what I am doing (for the rclone config command). That results in this error:

2019/05/01 07:38:33 Failed to configure token: failed to get token: oauth2: cannot fetch token: 401 Unauthorized
Response: {
  "error": "invalid_client",
  "error_description": "Unauthorized"
}

For rclone authorize command I cannot find a flag to run the command in headless mode.

That means that the account you are using doesn’t have access. When you enter in your account information, what screen do you get?

I tested the process by linking my home account (with a freshly created client/secret) to my GSuite account and I get a notification that my project “Home” has been given access to my GSuite.

So the process definitely works, but I am not sure which of the steps is getting mixed up.

Are you doing the above on a headless machine? It looks like you are using a machine with a user interface.



2019/05/01 08:28:23 Failed to configure token: failed to get token: oauth2: cannot fetch token: 401 Unauthorized
Response: {
  "error": "invalid_client",
  "error_description": "Unauthorized"
}

If I go into G Suite Admin console, I can see that my previous instances of the API requests have been installed and completed:

But not the current one rclone-domain.org

If I go and explicitly trust it in the API calls:

I still see this issue occur.

I am using:
rclone --version
rclone v1.46

  • os/arch: linux/amd64
  • go version: go1.11.5

I’m not sure what you are looking at, as the APIs are set up in the Google Cloud Console.

Nothing is needed in the GSuite admin.

I’m also not sure why you’d be seeing rclone.org if you are using your own key.

If you check your account and you have it properly authorized, you should see your project name in 3rd party apps:

“Home” was the new project I named/created and authorized into my GSuite account.

It seems like something malformed the content of the line when copying and pasting from GCP, over PuTTY, into nano.

After running that and attempting to reauth the token, it worked fine and I am no longer seeing any errors in the rclone.log file and seeing API requests in Cloud Console.

However, with that done, I am still unable to play 4K movies without stuttering, downscaling, or buffering. I’m not sure if that is more related to the Plex side or to the rclone cache buffer.

I set the cache side buffer to 16M instead of 32M.

Let’s perhaps take a step back and make sure we’re all on the same page.

At this point, the API keys and such are solved? Yes?

Can you share the final mount command that you are using?

For the playback, are you direct playing or transcoding? What’s the client that is playing it?

If you want to start the mount with -vv and grab a debug of when you play, we can validate rclone is doing the right thing.

I play pretty high bitrate 4K movies without a problem, but a few players have problems in general. The NVIDIA Shield has been awful for the last 4-5 months with direct play, and that’s a popular one; certain other players also tend to open/close files a lot.

My players of choice are Apple TVs; I use an ATV 4K, so that’s my setup.

At this point, the API keys and such are solved? Yes?

Yes, they are working.

Can you share the final mount command that you are using?

/usr/bin/rclone mount gmedia: /mnt/gdrive \
--allow-other \
--dir-cache-time 192h \
--drive-chunk-size 16M \
--log-level DEBUG \
--log-file /var/log/rclone.log \
--timeout 3h \
--umask 002 \
--rc

For the playback, are you direct playing or transcoding? What’s the client that is playing it?

I am trying to play a 4K feed on a 1080p monitor attached to my desktop (which is more than powerful enough to handle a 4K feed). When playing on my desktop, it unfortunately uses the transcoder, even though I have it set not to convert automatically. But I guess in this case, due to it being a 1080p screen, it tries to downscale.

However, I am also using an NVIDIA Shield hooked to a 4K TV. I prefer Android over Apple devices, and I wouldn’t go out and buy an Apple device purely to watch a 4K movie. When I play it on the TV, as far as I can tell from the Plex server logs and from running htop, it does not use the transcoder.

As for using -vv (I assume this is debug), I see this in the log:

2019/05/01 20:46:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:46:04 DEBUG : Google drive root ‘media’: Checking for changes on remote
2019/05/01 20:46:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:46:04 DEBUG : Google drive root ‘media’: Checking for changes on remote
2019/05/01 20:47:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:47:04 DEBUG : Google drive root ‘media’: Checking for changes on remote
2019/05/01 20:47:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:47:04 DEBUG : Google drive root ‘media’: Checking for changes on remote
2019/05/01 20:48:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:48:04 DEBUG : Google drive root ‘media’: Checking for changes on remote
2019/05/01 20:48:04 DEBUG : Cache remote gcache:: starting cleanup
2019/05/01 20:48:04 DEBUG : Google drive root ‘media’: Checking for changes on remote

While the video (in this case, The Predator in 4K) plays on my computer (about 1 hour and 30 minutes into the movie).

But nothing else.

In that rclone debug output, there are no signs of the file being played from the rclone mount; those are normal polling messages.

If your resolution doesn’t match, it has to transcode, so it would never be able to direct play 4K to 1080p.

The Shield has been particularly problematic for the last number of months, so nothing rclone does would solve any of those playback issues:

If you have a log of problematic playback (perhaps not on an NVIDIA Shield, unless you grab the beta), I’m happy to look at it.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.