My problem is that I've been banned by Google for 24 hours for exceeding the download quota. I'm not sure what is wrong in my config, but I find it hard to believe I actually exceeded Google's download limit. These mounted Team Drives are used by Plex in different libraries.
- os/arch: linux/amd64
- go version: go1.1
- Ubuntu 20.04 LTS
- Google Drive
/usr/bin/rclone mount Anime_Series: /home/plexcloud/Anime_Series/ --log-level INFO --log-file=/home/ubuntu/rclonecache/Anime_Series.log --allow-other --fast-list --tpslimit=10 --buffer-size=128M --cache-dir=/home/ubuntu/rclonecache/ --dir-cache-time=48h --vfs-read-chunk-size=64M --vfs-read-chunk-size-limit=off --vfs-cache-max-age=45m --vfs-cache-mode=writes
The rclone config contents with secrets removed.
type = drive
scope = drive
token = XXX
team_drive = XXX
Any idea or suggestion?
The first step would be sharing the details needed from the help and support template that you deleted.
I suggest you create your own client id and secret.
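For reference, once you have created your own client id and secret in the Google Cloud console, they go into the remote's section of the rclone config. A minimal sketch, reusing the remote name from the OP's mount; the `client_id` and `client_secret` values are placeholders you would fill in yourself:

```
[Anime_Series]
type = drive
client_id = <your_client_id>.apps.googleusercontent.com
client_secret = <your_client_secret>
scope = drive
token = XXX
team_drive = XXX
```

After editing the config you need to reconnect the remote (`rclone config reconnect Anime_Series:`) so the token is reissued against your own client id.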
Other than that, you can try increasing the vfs cache time for all things, and pre-caching the directory structure before running the plex scan.
Add the following:
--rc --dir-cache-time 8760h --attr-timeout 8700h --poll-interval 20s --cache-dir="<your_cache_dir>" --vfs-cache-mode full --vfs-cache-max-age 336h --vfs-cache-max-size <SizeYouWant>
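Put together with the OP's existing mount line, that would look roughly like this; the paths are the OP's, and the `100G` cache size is just an example value you should adjust to your disk:

```shell
# Sketch of the OP's mount with the suggested long-cache flags added;
# <SizeYouWant> replaced here with an example 100G -- adjust to taste.
/usr/bin/rclone mount Anime_Series: /home/plexcloud/Anime_Series/ \
  --rc \
  --dir-cache-time 8760h \
  --attr-timeout 8700h \
  --poll-interval 20s \
  --cache-dir=/home/ubuntu/rclonecache/ \
  --vfs-cache-mode full \
  --vfs-cache-max-age 336h \
  --vfs-cache-max-size 100G \
  --allow-other \
  --log-level INFO --log-file=/home/ubuntu/rclonecache/Anime_Series.log
```

Note that `--vfs-cache-mode full` supersedes the original `--vfs-read-chunk-size` tuning for reads, since reads are now cached on disk.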
And run the following to precache the whole directory structure:
rclone rc vfs/refresh -v --fast-list recursive=true
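This only works if the mount was started with `--rc`, since the refresh is sent to the mount's remote-control endpoint (by default on localhost:5572). A sketch of how you might run it, including an optional cron entry so the directory cache is warm before scheduled Plex scans; the 4 a.m. schedule is just an example:

```shell
# One-off: walk the whole remote and fill the mount's directory cache.
rclone rc vfs/refresh recursive=true --fast-list -v

# Optional crontab entry: refresh nightly, assuming the mount is already up.
# 0 4 * * * /usr/bin/rclone rc vfs/refresh recursive=true --fast-list
```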
While I use the above config, I don't use Plex itself; I just browse the directory and play using Kodi. Still, the above will help reduce the API hits Plex makes.
Solid advice, but API usage is not the issue here: per the OP it's the upload/download quota, and those are two different things.
Yes, I agree with you. My problem is the download quota limit, not the API usage. Thanks for the reply, btw.
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.