Getting 'downloadQuotaExceeded' Errors

What is the problem you are having with rclone?

I keep getting downloadQuotaExceeded errors despite having my Plex server turned off for about 48 hours. It worked fine when I started it back up, but after a few hours it gave me the same error. Is there anything I can do, like making a new Google API key, that will reset my quotas? Are there any settings I might have that are hitting the API too aggressively?

What is your rclone version (output from rclone version)

rclone v1.56.2

Which cloud storage system are you using? (eg Google Drive)

Google Drive (Shared Team Drive)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount gcrypt: /mnt/gdrive \
# This is for allowing users other than the user running rclone to access the mount
--allow-other \
# Google Drive is a polling remote so this value can be set very high and any changes are detected via polling
--dir-cache-time 5000h \
# Log file location
--log-file /opt/rclone/logs/rclone.log \
# Set the log level
--log-level INFO \
# I reduce the poll interval to 15 seconds as this makes changes appear fast and the API quotas per day are huge
--poll-interval 15s \
# This sets the file permissions on the mount so that everyone has read/write access
--umask 000 \
--uid 1000 \
--gid 1000 \
# This sets up the remote control daemon so you can issue rc commands locally
--rc \
# This is the default port it runs on
--rc-addr :5572 \
# no-auth is used as no one else uses my server and it is not a shared seedbox
--rc-no-auth \
# This is used for caching files to local disk for streaming
--vfs-cache-mode full \
# This limits the cache size to the value below
--vfs-cache-max-size 100G \
# This limits how long files stay in the cache; if the size limit is reached, the oldest files are removed first
--vfs-cache-max-age 5000h \
# The cache poll interval is increased as there is plenty of cache space
--vfs-cache-poll-interval 5m \
# Add a read-ahead buffer to smooth over network latency
--vfs-read-ahead 2G

#--vfs-read-chunk-size 128M \
#--vfs-read-chunk-size-limit 2G \
#--buffer-size 64M

Mostly copied from Animosity022
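Since the mount enables the remote control daemon (`--rc` with `--rc-no-auth` on the default port 5572), you can query it while the mount is running. A sketch, assuming the mount above is active on the same machine (`rclone rc` talks to `http://localhost:5572/` by default):

```shell
# Report transfer counts, errors, and elapsed time from the running mount
rclone rc core/stats

# Prime the directory cache recursively in the background; useful with a
# long --dir-cache-time like 5000h so directory listings are warm
rclone rc vfs/refresh recursive=true --timeout 10m
```

The `vfs/refresh` call can take a while on a large library, hence the generous client-side `--timeout`.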

The rclone config contents with secrets removed.

[gcrypt]
type = crypt
remote = gdrive:
filename_encryption = standard
directory_name_encryption = true
password = ***REDACTED***
password2 = ***REDACTED***

[gdrive]
type = drive
client_id = ***REDACTED***
client_secret = ***REDACTED***
scope = drive
token = ***REDACTED***
team_drive = 0AL4_q-OEfKDtUk9PVA
root_folder_id =

A log from the command with the -vv flag

2021/12/08 19:14:54 ERROR : media/tv/Good Place, The (2016) [tvdb-311711]/Season 02/The Good Place (2016) - S02E01 - Everything Is Great! [Bluray-1080p][8bit][x264][DTS 5.1]-SHORTBREHD.mkv: ReadFileHandle.Read error: low level retry 1/10: couldn't reopen file with offset and limit: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded

API quotas are not related to download/upload quotas.

You can see your API quotas in the Google Cloud console, and hitting limits there just makes rclone slow down. It has nothing to do with upload/download limits.
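That said, if you do want to be gentler on the API, rclone has global `--tpslimit`/`--tpslimit-burst` flags that cap HTTP transactions per second. A sketch; note this throttles API calls only and will do nothing about a per-file download quota:

```shell
# Cap rclone at 10 HTTP transactions per second to the Drive API, allowing
# short bursts. This limits API usage only; it has no effect on Google's
# per-file download quota.
rclone mount gcrypt: /mnt/gdrive \
  --tpslimit 10 \
  --tpslimit-burst 10 \
  --allow-other \
  --vfs-cache-mode full
```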

You hit a download quota, and there isn't much to do other than wait. Since it looks like a shared drive, is someone else using it as well?

I'm using one of those school team drives with unlimited storage, but I don't think the other team drives count against my download quota.

I'd imagine school drives have more stringent quotas.

There's really not much to do other than wait, unfortunately, as Google Support won't tell you anything beyond the fact that you downloaded too much.
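One low-impact way to check whether the quota has reset, rather than restarting Plex and hitting the error again, is to read a small slice of an affected file. A sketch, using the path from the log above:

```shell
# Read only the first ~1 MB of a file that previously returned
# downloadQuotaExceeded; a clean exit suggests the quota has reset,
# while a 403 means you are still blocked.
rclone cat --head 1048576 -vv \
  "gcrypt:media/tv/Good Place, The (2016) [tvdb-311711]/Season 02/The Good Place (2016) - S02E01 - Everything Is Great! [Bluray-1080p][8bit][x264][DTS 5.1]-SHORTBREHD.mkv" \
  > /dev/null
```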

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.