Download quota exceeded, even with a new unique ID?

Hi guys.

Been having some problems with my server. I made a clean install after several reinstallations (had some major issues with a new NUC), and I'm getting:

Failed to copy: failed to open source object: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded

I haven't downloaded anything, and as I recall the download quota for Google Drive is what, 10 TB per day?

I even tried to change the unique ID, but the problem still persists.

Any ideas?

Thanks

hi,

is the file shared with others?

without any actionable info, it's kind of hard to help you.
when you posted there was a template of questions.....

Hi

Ahh, sorry, my bad.

No, the file and ID are not shared at all.

Simple Plex server. :slight_smile:

Ubuntu Desktop 20.04
Latest official rclone version.

The log file is empty due to a new installation.

I just want to copy a small file from the Google Drive to my server, and I saw that error.

really, can you please post the requested info.

Sure - I can try :slight_smile:

What is the problem you are having with rclone?

What is your rclone version (output from rclone version)

rclone v1.55.0

  • os/type: linux

  • os/arch: amd64

  • go/version: go1.16.2

  • go/linking: static

  • go/tags: cmount

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu Desktop 20.04 LTS

Which cloud storage system are you using? (eg Google Drive)

Google Drive (Workspace)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy Scripts: /home/plex/scripts/

The rclone config contents with secrets removed.

[Google]
type = drive
client_id = XXXXXXXXXXX.apps.googleusercontent.com
client_secret = XXXXXXXXX
scope = drive
token = {"access_token":"XXXXXXXX","expiry":"2021-04-10T21:28:20.970971205+02:00"}

[Googlecrypt]
type = crypt
remote = Google:Private
filename_encryption = standard
directory_name_encryption = true
password = XXXXX
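(Not from the original post, but for anyone reproducing this setup: remotes like the two above can be sanity-checked from the shell before digging into quota errors. The remote names come from the config above; the commands and flags are standard rclone, though which ones you run is just a suggestion.)

```shell
# Quick sanity checks for the remotes defined in the config above:
rclone lsd Google: --max-depth 1        # list top-level folders on the drive
rclone lsd Googlecrypt: --max-depth 1   # same, but through the crypt layer
rclone about Google:                    # show the drive's usage summary
```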

A log from the command with the -vv flag

Don't have a log yet (new install). The only error I see over SSH is that download quota exceeded with 403.

that command uses Scripts:, but that is not included in your config file?

run the command with -vv and post the output.

how are you using plex with rclone, as you would need a rclone mount command for that?

Ahh, correct, I only copied the mount and crypt remotes :confused:

I use systemd
```
[Unit]
Description=RClone Service
After=network-online.target
Wants=network-online.target

[Service]
Type=notify
ExecStart=/usr/bin/rclone mount --allow-other --dir-cache-time 1000h --log-level INFO --log-file /home/plex/logs/rclone.log --bwlimit-file 16M Googlecrypt: /home/plex/mnt
ExecStop=/bin/fusermount -uz /home/plex/mnt
Restart=on-abort
User=plex
Group=plex

[Install]
WantedBy=multi-user.target
```
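(Editor's sketch, not part of the original post: a unit like the one above is typically installed and managed with systemd's own tooling. The unit file name `rclone.service` is an assumption; the log path comes from the unit above.)

```shell
# Install the unit file, then reload systemd so it picks it up
sudo cp rclone.service /etc/systemd/system/rclone.service
sudo systemctl daemon-reload

# Enable at boot, start now, then check the service and the rclone log
sudo systemctl enable --now rclone.service
systemctl status rclone.service
tail -f /home/plex/logs/rclone.log
```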

My server is indexing at the moment.

Let me try with -vv :slight_smile: brb

it could be the indexing that is causing the problem.
might want to tweak that
https://github.com/animosity22/homescripts#plex-tweaks
and use this as a template for systemd
https://github.com/animosity22/homescripts/blob/master/systemd/rclone.service

Oh, but I've been using it for ages, no problems so far.

Well, here are the details :slight_smile:
```
[Scripts]
type = crypt
remote = Google:Scripts
filename_encryption = standard
directory_name_encryption = true
password = xxxxx
```

rclone copy -vv Scripts: /home/plex/scripts/

```
rclone copy -vv Scripts: /home/plex/scripts/

2021/04/10 21:17:58 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2021/04/10 21:17:58 DEBUG : rclone: Version "v1.55.0" starting with parameters ["rclone" "copy" "-vv" "Scripts:" "/home/plex/scripts/"]
2021/04/10 21:17:58 DEBUG : Creating backend with remote "Scripts:"
2021/04/10 21:17:58 DEBUG : Creating backend with remote "Google:Scripts"
2021/04/10 21:17:58 DEBUG : Google drive root 'Scripts': root_folder_id = "XXXXXXXX" - save this in the config to speed up startup
2021/04/10 21:17:59 DEBUG : Creating backend with remote "/home/plex/scripts/"
2021/04/10 21:18:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:02 DEBUG : pacer: Rate limited, increasing sleep to 1.239365217s
2021/04/10 21:18:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:02 DEBUG : pacer: Rate limited, increasing sleep to 2.793718262s
2021/04/10 21:18:02 DEBUG : pacer: Reducing sleep to 0s
2021/04/10 21:18:02 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:02 DEBUG : pacer: Rate limited, increasing sleep to 1.71281199s
2021/04/10 21:18:02 DEBUG : pacer: Reducing sleep to 0s
2021/04/10 21:18:02 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:02 DEBUG : pacer: Rate limited, increasing sleep to 1.62468263s
2021/04/10 21:18:03 DEBUG : pacer: Reducing sleep to 0s
2021/04/10 21:18:03 INFO : exclude-file.txt: Copied (new)
2021/04/10 21:18:03 INFO : .plexignore: Copied (new)
2021/04/10 21:18:04 INFO : mount.sh: Copied (new)
2021/04/10 21:18:04 INFO : kill_stream.py: Copied (new)
2021/04/10 21:18:04 INFO : demount.sh: Copied (new)
2021/04/10 21:18:04 INFO : gdrive2synology.sh: Copied (new)
2021/04/10 21:18:04 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:04 DEBUG : pacer: Rate limited, increasing sleep to 1.028830292s
2021/04/10 21:18:04 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXXXXX, userRateLimitExceeded)
2021/04/10 21:18:04 DEBUG : pacer: Rate limited, increasing sleep to 2.422275949s
2021/04/10 21:18:05 DEBUG : pacer: Reducing sleep to 0s
2021/04/10 21:18:05 INFO : rclone-sync.sh: Copied (new)
2021/04/10 21:18:05 INFO : plexrestore.sh: Copied (new)
2021/04/10 21:18:05 INFO : pihole.sh: Copied (new)
2021/04/10 21:18:05 INFO : purge-old.kernels-2.sh: Copied (new)
```
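(Editor's note: the "pacer: low level retry ... increasing sleep" lines in the log above show rclone's built-in retry behavior: each rate-limited 403 grows the delay before the next attempt, and a success drops it back to zero. A minimal sketch of that pattern, not rclone's actual implementation; `flaky_call` is a hypothetical stand-in for an API request that fails twice and then succeeds.)

```shell
tries=0
flaky_call() {
    tries=$((tries + 1))
    [ "$tries" -ge 3 ]   # fail the first two calls, succeed on the third
}

sleep_s=0
for attempt in 1 2 3 4 5; do
    if flaky_call; then
        echo "attempt $attempt: ok, reducing sleep to 0s"
        sleep_s=0
        break
    fi
    # double the delay on each rate-limited retry (rclone adds jitter and a cap)
    if [ "$sleep_s" -eq 0 ]; then sleep_s=1; else sleep_s=$((sleep_s * 2)); fi
    echo "attempt $attempt: rate limited, increasing sleep to ${sleep_s}s"
    sleep "$sleep_s"
done
```

Because the retries here are low-level (per request, not per transfer), the copy still completes, which is why the log above ends with successful `Copied (new)` lines despite the 403s.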

The tweaks are OK. I don't have that much online :slight_smile:

And I don't use the VFS cache :slight_smile: I've been running this setup for ages. I just had INSANE issues with the new Intel NUC 11 Pro: the i225 LAN caused serious problems with VMware ESXi 7.0, even though I used the new network Fling.
After several retries (and several rounds of indexing) I went back to the old setup, which was Ubuntu Desktop.

did not see that in the log you posted.
might want to login to the gdrive api page and see what is going on.

i do not use plex, i use jellyfin, and i do not use gdrive, but i have seen enough posts about it.

if you keep re-running the indexing over and over on the same files and have plex do things like 'Perform extensive media analysis during maintenance', then you can run into problems.

we have very different media servers.
right now, i am using a pi zero running ums - universal media server, pointing to a cheap seedbox.

I have "Perform extensive media analysis during maintenance" disabled.

ok. now that we have the needed info, we have plex/gdrive experts and i expect one will stop by soon.


Oh, the download quota error is GONE now :slight_smile:

Why? I don't have a clue.

Thanks for your effort to help mate - appreciated :slight_smile:

Learning every day :slight_smile:

I have this problem too, but I'm not even using the server a lot these days. At a certain point in a movie it just stops, and rclone starts spamming this error for half a day.
My Google quotas are okay, so this is probably something undocumented. If I try to download that file manually from Drive (web browser), it says "Google Drive Download Limit (Quota Exceeded)" for every file in that drive. Even though, as I said, I just wanted to watch ONE movie of ~4 GB. The drive is only mine and only used via my server.

My problem started after I upgraded rclone to the latest version. But I think it might be related to all the scans I made. Just a guess.

I've already disabled every scan I can think of, except one that starts when it sees something new. But that's just indexing, zero downloads.

This is indeed strange. Maybe rclone is downloading the same file many times in "data fragments" (like a streaming service: first one part, then the next, and so on), Google counts each of those as a whole download, and there's a really low per-file download limit?
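(Editor's sketch, not advice from the thread: if that guess is right, one mitigation is to have rclone read in larger chunks, so a single playback session issues fewer ranged download requests. `--vfs-read-chunk-size` and `--vfs-read-chunk-size-limit` are real rclone mount flags; the sizes and mount point here are only illustrative, reusing the paths from earlier in the thread.)

```shell
# Read in bigger chunks, doubling up to a cap, to cut down request count
rclone mount Googlecrypt: /home/plex/mnt \
  --vfs-read-chunk-size 64M \
  --vfs-read-chunk-size-limit 2G \
  --dir-cache-time 1000h
```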

PS: Even with another account I can't download anything from that drive, so this is probably a per-file quota on the Drive side.