GDrive Error 403 - Next Steps?

What is the problem you are having with rclone?

GoogleAPI Error 403 --- User Rate Limit Exceeded

What is your rclone version (output from rclone version)

Version 1.55.1 (On OSX and pasted into /usr/bin)

Which OS you are using and how many bits (eg Windows 7, 64 bit)

OSX 11.5.1, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive via Google Workspace Enterprise Plus

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size

The rclone config contents with secrets removed.

gdrive-media
gdrive-media-crypt
putio

A log from the command with the -vv flag

Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

Quick backstory. I have been using Put.io and rapidly approached 10 TB of storage. I used a guide and some searching around the forums to sign up for GDrive via Google Workspace Enterprise, set up my OAuth client and tokens, got encryption running, etc.

What I am trying to do is move all of my existing content from Put.io to GDrive. I started the copy (2.1 TB of data), but after about 13 hours I hit the 403 error. I am assuming this is the 750 GB per-day upload cap. Rclone has since automatically stopped transferring and is sitting idle.

My main question: (1) do I leave rclone running through the failures and let it keep retrying until the 2.1 TB transfer completes, or (2) do I cancel and re-run with --fast-list or --max-transfer set to around 600 GB per day? None of the files in this particular transfer are in sub-folders; the next batch of content will be.
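(For reference, option (2) as described would look roughly like the command below; this is only a sketch that re-uses the poster's own command and numbers, not a tested recommendation.)

# Sketch of option (2): cap a single run at roughly 600 GB of transferred data
rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size --fast-list --max-transfer 600G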

I appreciate any help you can provide. This forum has been a phenomenal resource, and I am sorry I could not get a definitive answer via search.

You're missing your rclone.conf details :frowning:

And with only a single line of a log, it's impossible to tell whether it's the API limit or an rclone.conf misconfiguration.

@Animosity022 Thanks for the quick reply. I'll have to yank the file from my home folder for you.

The command is still running and, after some successful transfers, spits that error over and over for every file. The error is below.

Would a screen capture help, while I grab the config?

Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

You need to share your rclone.conf.

Example of mine:

[GD]
type = drive
scope = drive
use_trash = false
client_id = redacted
client_secret = redacted
token = {"access_token":"redacted","token_type":"Bearer","refresh_token":"redacted","expiry":"2021-08-13T12:35:22.812636566-04:00"}
chunk_size = 1024M
root_folder_id = redacted

[gcrypt]
type = crypt
remote = GD:crypt
filename_encryption = standard
password = redacted
password2 = redacted
directory_name_encryption = true

For the command, you need to add -vv to it and grab the whole output.

Output in my example looks like:

rclone lsd -vv GD:
2021/08/13 12:36:12 DEBUG : Setting --config "/opt/rclone/rclone.conf" from environment variable RCLONE_CONFIG="/opt/rclone/rclone.conf"
2021/08/13 12:36:12 DEBUG : Setting --user-agent "animosityapp" from environment variable RCLONE_USER_AGENT="animosityapp"
2021/08/13 12:36:12 DEBUG : Setting --rc-user "felix" from environment variable RCLONE_RC_USER="felix"
2021/08/13 12:36:12 DEBUG : Setting --rc-pass "felix" from environment variable RCLONE_RC_PASS="felix"
2021/08/13 12:36:12 DEBUG : Setting default for drive-pacer-min-sleep="10ms" from environment variable RCLONE_DRIVE_PACER_MIN_SLEEP
2021/08/13 12:36:12 DEBUG : Setting default for drive-pacer-burst="1000" from environment variable RCLONE_DRIVE_PACER_BURST
2021/08/13 12:36:12 DEBUG : rclone: Version "v1.56.0" starting with parameters ["rclone" "lsd" "-vv" "GD:"]
2021/08/13 12:36:12 DEBUG : Creating backend with remote "GD:"
2021/08/13 12:36:12 DEBUG : Using config file from "/opt/rclone/rclone.conf"
2021/08/13 12:36:12 DEBUG : Setting drive_pacer_min_sleep="10ms" from environment variable RCLONE_DRIVE_PACER_MIN_SLEEP
2021/08/13 12:36:12 DEBUG : Setting drive_pacer_burst="1000" from environment variable RCLONE_DRIVE_PACER_BURST
2021/08/13 12:36:12 DEBUG : GD: detected overridden config - adding "{TKSWb}" suffix to name
2021/08/13 12:36:12 DEBUG : Setting drive_pacer_min_sleep="10ms" from environment variable RCLONE_DRIVE_PACER_MIN_SLEEP
2021/08/13 12:36:12 DEBUG : Setting drive_pacer_burst="1000" from environment variable RCLONE_DRIVE_PACER_BURST
2021/08/13 12:36:12 DEBUG : fs cache: renaming cache item "GD:" to be canonical "GD{TKSWb}:"
          -1 2020-12-27 15:33:11        -1 backups
          -1 2019-09-29 21:57:58        -1 crypt
          -1 2021-07-26 20:43:15        -1 test
2021/08/13 12:36:12 DEBUG : 4 go routines active

Stop the command, add -vv, share the log.
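(A minimal sketch of what that looks like with the poster's copy command; --log-file is optional but makes the full debug output easy to share.)

# Same copy command, with debug logging written to a file
rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size -vv --log-file rclone-copy.log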

It "sounds" like you are hitting the 750GB limit per day.

Just add:

https://rclone.org/drive/#drive-stop-on-upload-limit

and run it once per day.
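(In practice that means something like the command below; a sketch based on the flag from the linked docs and on the command the poster later confirms works.)

# --drive-stop-on-upload-limit makes rclone exit cleanly when Google returns the daily upload limit error
rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size --drive-stop-on-upload-limit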

hello and welcome to the forum,

not sure of your use-case...
one option is to use https://rclone.org/docs/#bwlimit-bandwidth-spec set to a value like 8.5M.
at that speed, rclone will never hit the 750GB limit and will keep running until the transfer is complete.
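(Roughly like this; a sketch using the poster's command, where 8.5M means 8.5 MiB/s in rclone's bandwidth syntax.)

# Throttle the transfer so a full day's upload stays around the daily cap
rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size --bwlimit 8.5M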

@Animosity022

rclone config here for you

[putio]
type = putio
token = {"access_token":"redacted","expiry":"0001-01-01T00:00:00Z"}

[gdrive-media]
type = drive
client_id = redacted
client_secret = redacted
scope = drive
root_folder_id = redacted
token = {"access_token":"redacted","token_type":"Bearer","refresh_token":"reacted","expiry":"2021-08-13T12:44:50.884014-04:00"}

[gdrive-media-crypt]
type = crypt
remote = gdrive-media:/media
filename_encryption = standard
directory_name_encryption = true
password = redacted

Should I re-run my original command with -vv? (I usually run -vv on my mount commands to make sure rclone is running as a daemon.) I would think I would have to start over and delete all the current files? I might just add

--drive-stop-on-upload-limit

to the end there to see what happens, after deleting everything from GDrive.

Thanks again for your patience and help.

If I were you, I'd just add:

--drive-stop-on-upload-limit

and run once per day. I don't mess with bwlimit or anything else.
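(One way to handle the "run once per day" part is a scheduler; the crontab entry below is purely hypothetical and not from the thread, using the /usr/bin/rclone path mentioned earlier.)

# Hypothetical cron entry: start the copy at 03:00 daily; --drive-stop-on-upload-limit ends each run at the cap
0 3 * * * /usr/bin/rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --ignore-size --drive-stop-on-upload-limit --log-file /tmp/rclone-daily.log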

When I run

rclone ls gdrive-media-crypt:

I get a nice list of about 50 or so files that appear to be nested in rclone/media/Folder/Subfolder

I think it's best to wipe the slate clean and add

--drive-stop-on-upload-limit

to the end of the command and run it until I get my 2.1 TB / ~512 files into Drive.
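(For the "wipe the slate clean" step, one option, an assumption on my part rather than advice from the thread, is rclone purge, which removes a path and everything under it; running it with --dry-run first only reports what would be deleted.)

rclone purge gdrive-media-crypt:Plex/Movies --dry-run -v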

Regarding the underpinnings of rclone, how does it know which files are "done" versus partial? I've always found that interesting. Thank you so much @Animosity022. Good to know I can migrate away from Put.io and get the unlimited storage I need!
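(Background, not from the thread: by default rclone decides whether a file is already "done" by comparing size and modification time between source and destination, so re-running the same copy only transfers what is missing or different. A --dry-run shows exactly what a re-run would still copy.)

# --ignore-size kept from the original command, so only modification time (and checksum where available) is compared
rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --ignore-size --dry-run -v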

All,

Been slowly but surely moving things over from Put.io. This command worked!

rclone copy putio:Plex/Movies gdrive-media-crypt:Plex/Movies --progress --ignore-size --fast-list --drive-stop-on-upload-limit

I am going to have to play with some other things once the transfer finishes, since I only ever mounted my Put.io drive on Mac OSX as read-only and managed the files via the web interface. I can't do that in this case, since the file names, folder names, etc. are garbled due to encryption. So I'd either need to mount read/write or do

rclone delete
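(A sketch of that command-line route; delete is rclone's command for removing the files under a path, and --dry-run previews what would go, here using the same source path as the original copy.)

rclone delete putio:Plex/Movies --dry-run -v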
