Library scans on Jellyfin Media Server trigger rate limits very quickly

What is the problem you are having with rclone?

Getting download quota exceeded errors and rate limits when scanning my library mounted via rclone. The library isn't that large,
and this happens even with incremental scans. Once I trigger a scan, it takes ages to complete, and within an hour I'm rate limited for 24 hours.

I emphasize that I do not watch or download the media. I simply trigger a library scan and the rate limiting happens. On days I don't do any scans, I can watch media fine.

Run the command 'rclone version' and share the full output of the command.

rclone v1.58.0

Yes

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount cineflix: /home/jellyfin/media --allow-other --allow-non-empty --dir-perms 0777 --umask 0000

The rclone config contents with secrets removed.

[cineflix]
type = drive
client_id = xxxx.apps.googleusercontent.com
client_secret =.........redacted...........................
scope = drive
token = {"access_token":"ya29.x-xx-CDGK9lsTyTlb7SiJ-x>
team_drive = xx
root_folder_id =

A log from the command with the -vv flag

https://gist.github.com/ashipaek0/adc0f768325adab6782ab656a9d3ffc8

hi,
good that you redacted more information from your original post.

looks like you are using a shared/team drive.
based on other forum topics,
Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
is a common issue.

to reduce the amount of data that emby needs rclone to read,
for the library, i disable thumbnails, previews, etc.

Yes, it's a shared drive, but I'm the only one using it. No one else.

The download quota should be 750GB daily, so it's rather mystifying that a scan hits it.

Is there some flag to limit how often rclone reads the files, and to skip errors instead of retrying over and over?

I think these may be contributing to the problem.

i do not use gdrive but @VBB does, though i'm not sure if he uses shared drives.

It does not use shared/team drives :wink:

I think the key to not going over the quota limits is to make use of Rclone's dir cache (see refresh command below) after you mount the drive. Note that my mount is static, as in read-only, and I do not make changes through it. I upload separately via RcloneBrowser, and once that's done, I run the refresh command to get the mount up to date. Then I run my daily Plex scan.

My mount command:

rclone mount --attr-timeout 5000h --dir-cache-time 5000h --drive-pacer-burst 200 --drive-pacer-min-sleep 10ms --no-checksum --poll-interval 0 --rc --read-only --user-agent ******* -v

My refresh command:

rclone rc vfs/refresh recursive=true --drive-pacer-burst 200 --drive-pacer-min-sleep 10ms --timeout 30m --user-agent
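Put together, the mount-then-refresh routine described above could be scripted roughly like this. This is a dry-run sketch: the commands are echoed rather than executed, and the remote name and mount point are assumptions borrowed from the OP's setup (VBB's own remote and mount point are redacted above).

```shell
#!/bin/sh
# Dry-run sketch of the workflow above: mount read-only with a long dir
# cache, then prime the cache with vfs/refresh before the library scan.
# REMOTE and MOUNTPOINT are placeholders taken from the OP's post.
REMOTE="cineflix:"
MOUNTPOINT="/home/jellyfin/media"

MOUNT_CMD="rclone mount $REMOTE $MOUNTPOINT --read-only --attr-timeout 5000h --dir-cache-time 5000h --poll-interval 0 --rc --daemon"
REFRESH_CMD="rclone rc vfs/refresh recursive=true --timeout 30m"

# Echo instead of executing so the sketch is safe to run anywhere.
echo "$MOUNT_CMD"
echo "$REFRESH_CMD"
```

Running the refresh after each upload (rather than rescanning blind) is what keeps the media server reading from the dir cache instead of hammering the Drive API.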

Thanks.

Is there some list of all rclone flags and what they do that I can read up on?

I didn't see some of these flags in the docs.

Sure, see here:

Generally, you aren't going to configure a flag to get around hitting the quota; that tends to come down to other settings.

If you watch while Emby is doing its scan, how much are you downloading? Do you have media analysis turned on?

I don't have analysis on

And that's the problem: I'm not downloading anything, not even streaming. All I do is trigger a scan.

Depending on what flags are set though and if it's scanned before, that downloads a bunch of stuff per file to do analysis.

You'd want to watch and see what is being downloaded when the scan happens.

You do not need to trigger a full scan of the entire library every time.

let's say in the library you have a folder with tv shows,
and for a given tv show, you add a new season.
then you only need to trigger a scan for the root folder of that tv show.
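For Jellyfin specifically, a targeted rescan like that can be requested through its REST API rather than the web UI. A rough sketch follows; the request is only echoed, not sent, and the server URL, API key, and item id are all placeholders, with the endpoint shape assumed from Jellyfin's documented `POST /Items/{id}/Refresh` route:

```shell
#!/bin/sh
# Hedged sketch: ask Jellyfin to rescan just one item (e.g. a show's root
# folder) instead of the whole library. All three values are placeholders.
JF_URL="http://localhost:8096"
API_KEY="REDACTED"     # placeholder: API key from the Jellyfin dashboard
ITEM_ID="abc123"       # placeholder: id of the show's root folder

# Jellyfin accepts the API key in the X-Emby-Token header.
# Echoed rather than executed so the sketch has no side effects.
CMD="curl -X POST -H \"X-Emby-Token: $API_KEY\" \"$JF_URL/Items/$ITEM_ID/Refresh?Recursive=true\""
echo "$CMD"
```

Pointing the refresh at only the changed folder keeps the scan from re-reading every file in the mount.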

Quite honestly, full scan or partial scan doesn't really matter, as long as the dir cache is used. Plex scans my entire library once a day, and you guys know it's rather large :wink:

As @asdffdsa mentioned, you don't need to trigger a full scan each time. I believe @asdffdsa has helped me in the past too on something similar to this. I use Google Drive and have a lot of data. I was receiving rate limits as well with Plex. I actually turned off all of Plex's refresh schedules and just run a PowerShell script daily on a schedule that hits the web API to refresh only new items that have been added. That seems to work better than what Plex was doing. Here is my mount command that works for me. It took some trial and error before I found the right one.

rclone mount --buffer-size 32m --dir-cache-time 1000h --drive-chunk-size 128M --log-level=NOTICE --log-file=c:\rclonelogs\%VDATE%_rclone_mount_log.txt --poll-interval 10s --rc --rc-no-auth --vfs-cache-mode full --vfs-cache-max-size 150G --vfs-cache-max-age 2000h --vfs-cache-poll-interval 5m --bwlimit-file 32M --config "C:\Users\administrator\.rclone.conf" "gdrive:Plex Folder" W:
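The scheduled "refresh only new items" call described above might look roughly like this as a plain shell sketch (the poster uses PowerShell). The host, token, section id, and path are all placeholders, and using the `path` query parameter on the section refresh endpoint to scan a single folder is an assumption based on Plex's partial-scan behavior. The request is echoed, not sent:

```shell
#!/bin/sh
# Hedged sketch of a partial Plex scan: refresh only the folder that was
# just uploaded, not the whole section. All values are placeholders.
PLEX_URL="http://localhost:32400"
PLEX_TOKEN="REDACTED"                       # placeholder X-Plex-Token
SECTION_ID="1"                              # library section id
NEW_PATH="/media/TV/Some Show/Season 02"    # folder just added

# Percent-encode spaces in the path for the query string.
ENC_PATH=$(printf '%s' "$NEW_PATH" | sed 's/ /%20/g')
CMD="curl \"$PLEX_URL/library/sections/$SECTION_ID/refresh?path=$ENC_PATH&X-Plex-Token=$PLEX_TOKEN\""
echo "$CMD"   # dry run: echo instead of executing
```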

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.