Slow Down Query Requests To Google Drive?

I use Rclone mounts with Google Drive for my media that is fed into Plex. While scanning, Plex hits the Google API query limits because of the rate at which it reads. I'm not sure if there is an existing feature I'm misunderstanding, but it would be handy to have a flag we can set in the mounting options that slows down the rate at which queries are sent to Google Drive. For apps like Plex that do this auto-scanning, it would just slow down the process, much like scanning an HDD vs. an SSD, but it would be a worthwhile compromise to play nicely with the Google API. The query limits are listed on this page:

If this isn't currently a feature, I think it could be one that Google would definitely appreciate on their end. In my use case, I could set the flag until my initial Plex media scan is completed, then remove the flag so that the files can be accessed at full speed.

hi,
for rclone mount + gdrive + plex.

--- this has a set of optimized values that use the vfs file cache
https://github.com/animosity22/homescripts/blob/master/systemd/rclone-drive.service

--- this is another technique that does not use the vfs file cache, relying on the default of --vfs-cache-mode=off
https://forum.rclone.org/t/slow-scan-plex-tv-show/31492/16

--- about the API limits, notice that both rclone mount commands use the same tested, optimized settings:
--drive-pacer-min-sleep 10ms --drive-pacer-burst 200
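For context, here is a minimal sketch of how those two pacer flags might sit in a full mount command. The remote name gdrive:, the mount point, and every flag other than the two pacer flags are placeholders/assumptions for illustration, not values taken from the linked configs:

```shell
# Sketch only: remote name, mount point, and cache flags are assumptions.
# The two --drive-pacer-* flags are the tested settings quoted above:
# they let rclone send API calls in bursts of up to 200, then throttle
# to at most one call per 10ms, staying under Google's per-user quota.
rclone mount gdrive: /mnt/gdrive \
  --drive-pacer-min-sleep 10ms \
  --drive-pacer-burst 200 \
  --vfs-cache-mode full \
  --daemon
```

The pacer is rclone's built-in rate limiter for the Drive backend, which is why a separate "slow down" flag isn't needed.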

Rclone already takes into account Google's rate limiting and follows their guidance for it.

So what you are asking for, is already done.

Although, a local HDD or SSD is always going to be much faster than any cloud storage, as you aren't traversing the Internet to get your data.

hey, thanks,

based on your experience and settings,
do you think that rclone needs to tweak its default values?

If the quota numbers have changed globally to larger values, as it looks like they did in that post, I see no harm in changing the defaults, since the original defaults were based on the old quota values.

Opposite of what the OP is asking, but that's what I would do, and did in my own mount when I was using Google Drive.


Out of curiosity, what makes you think this is actually happening? What errors do you see in the log? How much data does Plex scan each time?


Oh I missed that.

Yeah, it's virtually impossible to run out of API quota for a day as the rate limiting wouldn't allow it.

Sorry to hijack a thread, but I think I have a related question.

Would hitting the API limits over a few weeks consistently be a reason for Google to remove an unlimited share drive or close an unlimited account?

I had an account closed on me citing some streaming reason. I was streaming my personal music from there using cloudplayer, so I thought that was the reason.

Recently I was given access to a shared folder on an unlimited drive and was told Google deleted it for TOS violations. The only thing I did was use rclone to upload a few TB of data and then sync it daily. After two weeks at the most, they removed the shared folder.

Has anyone successfully used unlimited Google drive to backup multiple TB of data for more than a year?

Best to make a new post instead, as all your information is missing.

No.

That's a content violation and nothing to do with API/uploads/etc.

Yes, lots of folks use rclone with Google Drive with many, many TB.

They could not see the content, as it was all encrypted.

As long as someone is getting to use it with many TB, then there is something else going on.

Thanks.

I think I may have solved my own problem without realizing it, but I'll try to give more context.

I'm actually trying to help a friend move his Plex server off of a VPS service to his own private machine. When he first started out with his Plex server he had a ton of stuff to scan in for metadata, and access to the media files that were linked through Rclone kept going down. He told me the scanner would periodically stop scanning around the same time, and when he contacted support for the VPS they told him it was the Google API limitation maxing out.

Since then I've helped him work on his server periodically. I'm kinda working off of his explanation of the problems he had as noted above, so I can't say I personally saw what Plex was doing at the time on the back end. Later on, long after his initial big metadata scan completed, I made some tweaks to his Rclone mounts and set them up through a Service account to make them run more efficiently.

Fast forward to the last few days, I've been helping him build a replacement server to run from home to eliminate the need for the VPS. I've held off replying to this thread for a couple days so I could recreate the problem he had, but everything seems to be scanning smoothly without losing file access this time. Whether it's because I'm using a service account or because of some other tweaks I made to mounting options, it doesn't seem to be a problem now. Sorry for this explanation of the problem being vague, but I'm kinda working off of a past problem I wasn't around to witness myself.