User rate limit exceeded - clarification

Hey, could I get some clarification on the user rate limit exceeded message. Here is the message:

Failed to copy: googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded

Couple of questions:

  1. Which remote have I exceeded the rate limit on? I was copying from gdrive to gdrive, so could it be either?
  2. I assume we’re talking about (as the message says) the user’s rate limit here (i.e. ‘Queries per 100 seconds per user’)? But I didn’t set up my own API credentials and used rclone’s instead for the destination remote, so what does that mean in this situation?
  3. Will this automatically trigger a 24hr ban, or is it a case of rclone slowing things down to keep copying?
  4. If I were to use my own API credentials and hit the rate limit on that, should I apply for a higher quota for ‘Queries per 100 seconds per user’, ‘Queries per 100 seconds’, or both?

Thanks

It could be either, unfortunately, but most likely it will be the one you are copying data to.

Those messages are a normal part of the rate limiting that Google does. Normally they don’t become failures though, as rclone retries them. You can see exactly what is happening with the -vv flag.
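For example, a copy run with debug logging might look like this (the remote names here are placeholders for whatever you have in your rclone.conf):

```shell
# Copy between two drive remotes with full debug output.
# "src:" and "dst:" are hypothetical remote names -- substitute your own.
rclone copy src:folder dst:folder -vv
```

With -vv each 403 shows up in the log as a low-level retry with a pacer back-off, rather than as a hard failure.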

Probably the latter.

I recommend you make your own credentials, as rclone’s are very busy and Google refused to give me any more quota :frowning:
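Once you have created a client ID and secret in the Google API console, you can point your remote at them in rclone.conf. A sketch of the relevant config section (all values below are placeholders, not real credentials):

```shell
# rclone.conf fragment -- "gdrive" is a placeholder remote name.
# client_id / client_secret come from your own Google API console project.
# [gdrive]
# type = drive
# client_id = 1234567890-example.apps.googleusercontent.com
# client_secret = your-client-secret-here
```

After setting these, re-run "rclone config" to reconnect the remote so the token is issued against your own project’s quota.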

You just ask for more quota and Google will give you something; you don’t get a lot of choice!

Excellent, thank you for your reply ncw. Appreciated.

@ncw Looking at these images, am I right in understanding that I haven’t hit the quota on this credential? As a test, I used the same credentials for both remotes, and this is the result in the API console. The thing is, I’m currently getting pretty much non-stop 403s… Am I right that the issue is not with the number of calls per day or per 100 seconds per user, but potentially with other quota-related issues per type of files I’m moving etc? Or am I just reading these charts wrong? :face_with_raised_eyebrow::grinning:

Have you transferred more than 750GB in 24hrs? That will cause 403s for 12-24hrs.
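If the 403s do come from the ~750GB/day upload quota rather than per-request rate limits, one common workaround (my own suggestion, not something stated in this thread) is to cap bandwidth so a long-running copy never crosses the daily limit: 750GB over 86400 seconds is roughly 8.7MB/s, so 8.5M leaves a little headroom.

```shell
# Cap sustained throughput just below the ~750GB/day Drive upload quota.
# 750 GB / 86400 s is about 8.7 MB/s, so 8.5M stays safely under it.
# "gdrive:" and the paths are placeholders.
rclone copy /local/data gdrive:backup --bwlimit 8.5M -vv
```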

hi~

I am just ~3 months into the rclone community.

I just want to let others know how I seem to have worked around the “user rate limit exceeded” problem – by using the default parameters from rclone browser.

I want to upload 0.6 million small files (1TB total) to Google Drive, as described in this thread:

As told by ncw, I made my own Google client ID.

When I ran “rclone check” on it (without extra parameters), it simply gave the above rate limit error, and the command stopped after 2 errors.

From the link in the error, I checked my quota:
queries per day: 1,000,000,000.
queries per 100 seconds: 10,000; queries per 100 seconds per user: 1,000.
The queries I used that day were only 2xx,xxx, way lower than the quota.

I emailed Google to increase the queries-per-100-seconds-per-user quota, but they refused.

Then I went back to the default parameters used by rclone-browser:

/snap/bin/rclone --config /home/xxxx/grnprogs/rclone/rclone-cur.conf check -c --verbose --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 600s /media/xxxx/asdvd005-cp2r1-cfs/20180524-004349—asdvd005-cp2r1-iat011 yyyyy:base-a002/20180527-01/20180524-004349—asdvd005-cp2r1-iat011

After 11 hours it had checked 4xxxxx out of the 6xxxxx files.

I think it’s working.

So other newcomers like me who have no idea about the command-line options could use rclone browser’s defaults as a reference.
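Another knob worth mentioning (my own suggestion, not part of rclone browser’s defaults) is --tpslimit, which caps the number of API queries per second so a long check or copy stays under the 1,000-per-100-seconds-per-user quota (i.e. ~10 queries/second):

```shell
# Limit rclone to ~8 queries/second, under the 1000-per-100s-per-user quota.
# "yyyyy:" follows the remote name in the post above; paths are placeholders.
rclone check -c --tpslimit 8 /local/path yyyyy:base-a002/somedir
```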

thank you.