I use Rclone mounts with Google Drive for media that is fed into Plex. While scanning, Plex hits the Google API query limits because of how fast it reads. I'm not sure if there's an existing feature I'm misunderstanding, but it would be handy to have a mount option that slows down the rate at which queries are sent to Google Drive. For apps like Plex that do this automatic scanning, it would just slow the process down, much like scanning an HDD vs. an SSD, but that seems like a worthwhile compromise to play nicely with the Google API. The query limits are listed on this page:
If this isn't currently a feature, I think it could be one that Google would definitely appreciate on their end. In my use case, I could set the flag until my initial Plex media scan is completed, then remove the flag so that the files can be accessed at full speed.
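For what it's worth, rclone already ships a global throttle flag that does roughly this. A minimal sketch of a mount that caps API transactions during the initial scan (the remote name `gdrive:`, the mount point, and the numeric values are illustrative assumptions, not tested recommendations):

```shell
# Limit all API transactions to ~5 per second, allowing short bursts of 10,
# while the initial Plex scan runs. Remove the flags afterwards for full speed.
rclone mount gdrive: /mnt/media \
  --tpslimit 5 \
  --tpslimit-burst 10
```

Once the scan finishes, remounting without `--tpslimit` restores normal throughput, which matches the "set the flag until the scan completes, then remove it" workflow described above.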
--- About the API limits: notice that both of my rclone mount commands use the same tested, optimized settings: `--drive-pacer-min-sleep 10ms --drive-pacer-burst 200`
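In context, those pacer flags would sit in a mount command something like the following sketch (the remote name, mount point, and extra VFS flag are assumptions for illustration; the pacer values are the ones quoted above):

```shell
# --drive-pacer-min-sleep: minimum wait between Drive API calls
# --drive-pacer-burst: number of calls allowed to skip the sleep in a burst
rclone mount gdrive: /mnt/media \
  --drive-pacer-min-sleep 10ms \
  --drive-pacer-burst 200 \
  --vfs-cache-mode full
```

Raising `--drive-pacer-min-sleep` (or lowering `--drive-pacer-burst`) is the knob to turn in the other direction if you want to slow queries down rather than speed them up.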
If the quota numbers have changed globally to larger values, as it looks like they did in that post, I see no harm in changing the defaults, since the original defaults were based on the old quota values.
Opposite of what the OP is asking, but that's what I would do and what I did in my mount when I was using Google Drive.
Sorry to hijack a thread, but I think I have a related question.
Would hitting the API limits over a few weeks consistently be a reason for Google to remove an unlimited share drive or close an unlimited account?
I had an account closed on me citing some streaming reason. I was streaming my personal music from there using cloudplayer, so I thought that was the reason.
Recently I was given access to a shared folder on an unlimited drive and was told Google deleted it for TOS violations. The only thing I did was use rclone to upload a few TB of data and then sync it daily. After two weeks at most, they removed the shared folder.
Has anyone successfully used unlimited Google Drive to back up multiple TB of data for more than a year?
I think I may have solved my own problem without realizing it, but I'll try to give more context.
I'm actually trying to help a friend move his Plex server off of a VPS service to his own private machine. When he first started out with his Plex server he had a ton of stuff to scan in for metadata, and access to the media files that were linked through Rclone kept going down. He told me the scanner would periodically stop scanning around the same time, and when he contacted support for the VPS they told him it was the Google API limitation maxing out.
Since then I've helped him work on his server periodically. I'm mostly working off his explanation of the problems as noted above, so I can't say I personally saw what Plex was doing on the back end at the time. Later on, long after his initial big metadata scan completed, I made some tweaks to his Rclone mounts and set them up through a service account to make them run more efficiently.
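For anyone curious, pointing a Drive mount at a service account is a one-flag change; a sketch, where the remote name, mount point, and key path are placeholders:

```shell
# Authenticate the Drive backend with a service account JSON key
# instead of the normal OAuth token.
rclone mount gdrive: /mnt/media \
  --drive-service-account-file /path/to/service-account.json
```

A service account gets its own API quota bucket, which may be part of why the rescan behaved better this time, though I can't say that for certain.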
Fast forward to the last few days: I've been helping him build a replacement server to run from home and eliminate the need for the VPS. I held off replying to this thread for a couple of days so I could try to recreate the problem he had, but everything seems to be scanning smoothly without losing file access this time. Whether that's because I'm using a service account or because of other tweaks I made to the mount options, it doesn't seem to be a problem now. Sorry that this explanation is vague, but I'm working off a past problem I wasn't around to witness myself.