Rclone/Gdrive scheduled tasks and performance through Plex

What is the problem you are having with rclone?

Hi, I'm new to rclone and need help understanding how Plex/rclone/Gdrive behaves when scanning or running scheduled tasks. I hear a lot about bans but I can't make sense of how to avoid them. Scanning has been fine so far, but I'm reluctant to perform a deep analysis (through scheduled tasks).

Also, performance through the mount/Plex seems slower than if I was streaming through Kodi or the Gdrive app. I appreciate that there is more involved in streaming through Plex, but sometimes it can take 30+ seconds to even begin, and it can struggle with moderately high bitrates (20+ Mbps).

What is your rclone version (output from rclone version)

v1.51.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 7, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive (G Suite)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount googlesuite1: X:
--allow-other
--dir-cache-time 72h
--drive-chunk-size 64M
--log-level INFO
--vfs-read-chunk-size 32M
--vfs-read-chunk-size-limit off
--config "C:\Users\Naz\.config\rclone\rclone.conf"
--vfs-cache-mode writes

I did follow a guide; can someone check whether these settings are still relevant?

The rclone config contents with secrets removed.

[googlesuite]
type = drive
scope = drive
token = {"access_token":"
root_folder_id = 0AMDQM8YeWkEoUk9PVA

[googlesuitecrypt]
type = crypt
remote = googlesuite:/media
filename_encryption = off
directory_name_encryption = false
password = 
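
(As an aside, a quick way to sanity-check that the crypt remote above resolves is to list it directly with rclone; the config path below is just the one from the mount command:)

rclone lsd googlesuitecrypt: --config "C:\Users\Naz\.config\rclone\rclone.conf"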

A log from the command with the -vv flag

Unsure what this is.
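
(For reference, a -vv log is normally captured by re-running the mount with debug logging and a log file added; something like the sketch below, where the log file path is just an example:)

rclone mount googlesuite1: X: --config "C:\Users\Naz\.config\rclone\rclone.conf" --vfs-cache-mode writes -vv --log-file "C:\rclone\mount.log"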

Deep Analysis is only used for bandwidth limiting in Plex. If you are not using that option, there is no reason to turn it on. It basically does a full scan of the file to get the bitrate information for that file.

There's a lot to unpack in that statement as it really depends on a number of factors. Kodi/Gdrive, I'd guess, tends to direct play more, but that is also dependent on your player and the content being played. If you are transcoding and have a lower-end server, or are trying to transcode 4K content, it's not going to go well overall. Many folks, myself included, stream high-bitrate 4K movies without issues, but I have players that direct play as I use ATVs, plus a higher-end server on a gigabit connection, so there aren't many bottlenecks for me as a whole.

My personal take on scheduled tasks is here:

Hi mate, thanks for your input,

Bandwidth limiting is the reason I'd want to use this feature. Plex always wants to reserve double the bandwidth until I perform a deep analysis. Saying that, I used to rely on it due to low upload speeds; would you say it's less of a priority coming from Google's servers? If Plex wants to reserve 30+ Mbps whilst the actual media spikes at 30 but generally uses around 15 Mbps, would the end user need a 30+ Mbps connection?

There's a lot to unpack in that statement as it really depends on a number of factors. Kodi/Gdrive, I'd guess, tends to direct play more, but that is also dependent on your player and the content being played. If you are transcoding and have a lower-end server, or are trying to transcode 4K content, it's not going to go well overall. Many folks, myself included, stream high-bitrate 4K movies without issues, but I have players that direct play as I use ATVs, plus a higher-end server on a gigabit connection, so there aren't many bottlenecks for me as a whole.

Oh wow, I wouldn't have thought Gdrive was capable of handling 4K, that's an eye opener. Do they have any bandwidth limits that you know of? Plex isn't transcoding; looking at my settings, does everything seem OK or would you advise I change anything for better performance?

Thanks

Hello and welcome to the forum,

--allow-other does nothing on Windows.
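
For example, the mount from the first post with --allow-other simply dropped (every other flag taken as-is) would look something like:

rclone mount googlesuite1: X: --dir-cache-time 72h --drive-chunk-size 64M --log-level INFO --vfs-read-chunk-size 32M --vfs-read-chunk-size-limit off --vfs-cache-mode writes --config "C:\Users\Naz\.config\rclone\rclone.conf"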


Bandwidth limiting is a tough one as it adds transcoding overhead. The reason Plex just doubles it is that it wants to ensure it doesn't go over an amount of bandwidth when the file hasn't been analyzed.

If you limit to 8 Mb/s on Plex and the file has spikes of say 10-12 Mb/s, it has to transcode the file to 8 Mb/s, which adds CPU overhead on the server. If you are truly limited by network bandwidth and want to set this up, it has a lot of overhead as it has to read every single file you have in full to work. If you have a smaller pool of users, you can just ask them to turn on certain bitrate limits on their players. I get that's annoying too, but it really depends on where you want to put the work and the size of your library.

Most of the time, if you get a spike of 30 Mb/s and they only have a 15 Mb/s connection, you get buffering on the player, as most players have very tiny buffers and don't handle it well. If they only have a 15 Mb/s connection, they should turn down the bitrate on their players to have a smooth experience.

Many folks do 4K streaming without an issue. Most of the defaults work out of the box. I don't use Windows so YMMV, but we do have a number of Windows users that stream without issue.

I can normally max out my gigabit connection with rclone and GDrive, so I'm sure a 50-60 Mb/s stream isn't a problem.


So in theory this 'reserved bandwidth' is irrelevant because Google has a lot more bandwidth to offer? If that's the case then I should be okay turning off deep analysis? Can you confirm this is correct? It would be nice if we could analyse the media manually...
Client side, speed generally isn't an issue as I keep my bitrate around 10 Mbps. My usual routine would be to run a scheduled task overnight, and it works very effectively for resolving the doubling of bitrates. Maybe we could designate a directory locally which could analyse the media before we sync to Gdrive; there has to be some workaround :confused:

Thanks for your help btw. I've posted the same thing on plex/datahoarders and was just blanked lol

Ahh okay, I'll get rid of that. Do the other settings seem legit to you?

cheers

Yeah, I can't imagine worrying about Google's capacity management vs your capacity management. I gotta believe you will always run out first :slight_smile:

If you have the capacity to stream at the rates for the clients, you should be fine. If you don't have the capacity to stream x streams at x bitrate, you'd want to limit it somewhere in the chain so you'd have to make that call.

All my new media stays local for a period so it gets analyzed (not deep analysis) and is uploaded after that. I wouldn't begin to try to explain that process on Windows as I'm not familiar enough to answer those questions; my system is Linux based.
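
For what it's worth, the rclone half of that idea works the same on any OS: a periodic rclone move with a minimum-age filter, so new files stay local long enough to be analyzed first. This is only a sketch; the staging path below is a placeholder:

rclone move "D:\staging\media" googlesuitecrypt: --min-age 14d -P --config "C:\Users\Naz\.config\rclone\rclone.conf"

--min-age 14d skips anything newer than two weeks, so recent files stay local until Plex has had time to analyze them.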


My mount commands are always very simple.

One mount I point my Emby at:
rclone mount gcrypt: x: --read-only

One mount if I want to copy files:
rclone mount gcrypt: y:

And I set Emby not to generate thumbnails and not to go to the internet for metadata.


Cheers mate. That clears that up; I'll opt for switching off deep analysis and see how it performs :+1:

Hi again, just wondering if someone can help me make sense of what's happening now. Plex seems to be working and local content is working, but when I try to select any mounted content through Plex it just loads endlessly and then crashes the app. Content can be accessed fine through Kodi / Google Drive / Explorer. Any ideas?
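
One thing that might help narrow it down, assuming the crypt remote and a made-up file path: pull a single file straight off the remote with rclone and watch the transfer speed. If that is fast while Plex still hangs on mounted content, the problem is more likely on the Plex side than the mount itself.

rclone copy "googlesuitecrypt:Movies/Example (2019)/Example.mkv" "C:\temp" -P --config "C:\Users\Naz\.config\rclone\rclone.conf"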

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.