downloadQuotaExceeded with different service accounts

What is the problem you are having with rclone?

I hit the downloadQuotaExceeded error because Plex scanned the whole library multiple times due to a bug in my Plex settings. Now that I've hit the download limit, I tried changing the service accounts (after adding them as members to the team drive), but the error persists.

Command used to change the service account:

rclone backend set gdrive: -o service_account_file=/path/to/rclone/accounts/1.json

That's on my new server; a week ago, on my old server, this worked fine with the same configuration and service accounts.
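
To double-check which credentials the remote is actually pointing at after the backend set command, the stored settings for the remote can be printed (this only dumps the remote's current config, nothing is changed):

rclone config show gdrive

If service_account_file there still points at the old file, the change may not have been written back to the config file.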

Run the command 'rclone version' and share the full output of the command.

rclone v1.57.0
- os/version: ubuntu 20.04 (64 bit)
- os/kernel: 5.11.0-1022-oracle (aarch64)
- os/type: linux
- os/arch: arm64
- go/version: go1.17.2
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

Anything that downloads from Google Drive fails (scanning files, for example; copying from the server to GDrive works, but not the other way around). Here's an example:

rclone copy -P gdrive:/files/somefile.txt /home/user/path/
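
The same test can also be run with a service account supplied on the command line, which should override whatever is stored in the config for that one run (2.json is just a placeholder for another of the service account files):

rclone copy -P gdrive:/files/somefile.txt /home/user/path/ --drive-service-account-file /path/to/rclone/accounts/2.json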

The rclone config contents with secrets removed.

[gdrive]
type = drive
client_id = XXXX
client_secret = XXXX
scope = drive
token = {"access_token":"XXXX","token_type":"Bearer","refresh_token":"XXXX","expiry":"2022-01-21T11:34:12.368147935Z"}
team_drive = XXXX
root_folder_id =
service_account_file = /user/.config/rclone/accounts/first_sa.json
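
As far as I understand, once service_account_file is set the OAuth token isn't used, so a remote configured only for the service account would look roughly like this (XXXX standing in for the real team drive ID):

[gdrive]
type = drive
scope = drive
service_account_file = /user/.config/rclone/accounts/first_sa.json
team_drive = XXXX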

A log from the command with the -vv flag

2022/01/21 10:39:30 DEBUG : rclone: Version "v1.57.0" starting with parameters ["rclone" "copy" "-P" "gdrive:/files/example_files/" "./example_files/" "-vv"]
2022/01/21 10:39:30 DEBUG : Creating backend with remote "gdrive:/files/example_files/"
2022/01/21 10:39:30 DEBUG : Using config file from "/user/.config/rclone/rclone.conf"
2022/01/21 10:39:31 DEBUG : fs cache: renaming cache item "gdrive:/files/example_files/" to be canonical "gdrive:files/example_files"
2022/01/21 10:39:31 DEBUG : Creating backend with remote "./example_files/"
2022/01/21 10:39:31 DEBUG : fs cache: renaming cache item "./example_files/" to be canonical "/user/.config/rclone/example_files"
2022-01-21 10:39:31 DEBUG : Local file system at /user/.config/rclone/example_files: Waiting for checks to finish
2022-01-21 10:39:31 DEBUG : Local file system at /user/.config/rclone/example_files: Waiting for transfers to finish
2022-01-21 10:39:35 DEBUG : Local file system at /user/.config/rclone/example_files: Waiting for transfers to finish
2022-01-21 10:39:35 ERROR : example.torrent: Failed to copy: failed to open source object: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
2022-01-21 10:39:35 ERROR : Attempt 3/3 failed with 1 errors and: failed to open source object: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded

You hit a daily quota.

Nothing to do but wait for it to reset.

I thought service accounts were a workaround for this issue?

We don't advise any method to avoid Google's generous daily quotas.
