Google Drive User rate limit exceeded

What is the problem you are having with rclone?

Rclone sync and copy worked yesterday, but no longer do.

What is your rclone version (output from rclone version)

rclone v1.49.5

  • os/arch: windows/amd64
  • go version: go1.12.10

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows 10 - 64

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone -v sync drive:\examples D:\examples --max-transfer 650G --transfers=2 --checkers=4 -vv

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

2019/10/14 21:00:53 DEBUG : myfile /myfile.bak: Modification times differ by -721ms: 2019-10-12 05:29:10.721 +0000 UTC, 2019-10-12 00:29:10 -0500 CDT
2019/10/14 21:00:53 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=###########, userRateLimitExceeded)
2019/10/14 21:00:53 DEBUG : pacer: Rate limited, increasing sleep to 1.611034857s

So this worked yesterday, but today it keeps telling me this. Also, there seem to be some updates to the API instructions for Google Drive located here: Use case - Google Apps/G-suite account and individual Drive. These instructions seem to need updating. I'm not sure I got my API set up correctly. Is there a good tutorial for this?

Firstly - if you see these errors but your transfers seem to keep working normally, then you don't really need to worry. It is perfectly normal to have some rate-limit errors. This is just the Google server telling rclone to slow the pace down a bit (and rclone handles this gracefully). You will never be able to remove these errors entirely - and there is no reason to. The requests are simply retried a short while later and will go through after a couple of tries at worst.

But if you see nothing but rate-limit errors and nothing seems to be transferring, then you have hit the 750GB/day upload limit. This uses the same error code, so we can't really tell the difference from the error alone. You probably know yourself whether that is likely to have happened in the last 24hrs. If so, you just have to wait a bit - but downloads should function normally.
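If you regularly bump into that daily cap, one common workaround is to throttle upload bandwidth so a continuously running sync can never exceed 750GB in 24 hours. A rough sketch (the remote name, paths, and exact limit here are assumptions, not from this thread):

```shell
# ~750 GB/day works out to roughly 8.9 MB/s sustained, so a limit a
# little below that (e.g. 8.5M) keeps a 24/7 sync under the quota.
# "drive:" and the local path are placeholders - substitute your own.
rclone sync D:\examples drive:examples --bwlimit 8.5M -v
```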

The section of the documentation you reference is specifically for using service accounts and running your own domain. Chances are you don't use service accounts unless you are an advanced user. The default and most common way to authorize is via OAuth (where you get the web-browser pop-up to allow rclone to access your Gdrive).

If you haven't already done so, you should probably instead look into making your own API client ID (rather than using the shared default), because that will actually help you run into fewer rate-limit errors. It won't help against the 750GB upload quota, obviously, but you'll get your own API quota to use rather than sharing one big quota with lots of other rclone users. You can find the documentation for that on the same page:
https://rclone.org/drive/#making-your-own-client-id
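For reference, wiring your own client ID into an existing remote is just a matter of re-running the config wizard - a minimal sketch (the remote name "drive" is a placeholder for whatever you called yours):

```shell
# Re-run the interactive config and edit the existing remote:
rclone config
# Choose "e" (edit existing remote), pick "drive", then paste the
# client_id and client_secret you created in the Google API Console
# when prompted. Re-authorize in the browser at the end.
```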


Ok, so I followed the link and was able to put my credentials into rclone. So that's one step down. However, I am still having the user-limit-exceeded issue. I figured I went over my quota, but it has been 2 days now. Here is what it is saying today.

2019/10/16 19:54:45 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=My project, userRateLimitExceeded)

2019/10/16 19:54:45 DEBUG : pacer: Rate limited, increasing sleep to 1.90321942s

It does have my project number in it, but it still says I'm over. Does anyone know how long that limit lasts?

To be very specific here - are all your uploads stalling completely? Or are they working but you just see some of these errors? As I said, the latter is not a problem and is normal, especially the more --transfers you choose to use. I can't really tell which of the two is the case from just this one error line, because the same error code is used for both scenarios.

If uploads are stalling completely, it is almost certainly the 750GB upload limit. This should reset at a given point every 24hrs, but that time can vary depending on exactly which server you are connecting to.

If it's intermittent but things otherwise look to work normally, then you are just occasionally spiking above 1,000 API calls per 100 seconds, which, as I said, can happen under normal operation. Rclone deals with this gracefully by slowing down the pace of requests a little until it stops seeing the error.
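If you'd rather avoid the bursts entirely, rclone can also cap the request rate itself. A sketch based on the command from this thread (the --tpslimit values are assumptions, not something the original poster used):

```shell
# ~10 transactions/second keeps you at roughly 1,000 API calls per
# 100 seconds, at the cost of slightly slower listing.
rclone sync drive:examples D:\examples --tpslimit 10 --tpslimit-burst 10 -v
```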

THANK YOU FOR ALL OF YOUR HELP @thestigma !!!!!!

So yesterday I ran it for about a minute and it kept sending the error on everything. I forgot to let it run longer, so I posted too quickly. Today it is working properly and I am now uploading well.

Great 🙂 Glad you got it solved.

I notice it is not so unusual to get a few more rate-limit errors at the start of a large transfer, because it is not only doing transfers but also a lot of list requests (and both count against the API limit). So it tends to burst pretty heavily at the start.

On Gdrive you can use the --fast-list flag to significantly cut down both the time needed to list and the API load. Recommended for most use-cases.
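Applied to the command from this thread, that would look something like the following sketch (flags other than --fast-list are carried over from the original command):

```shell
# --fast-list trades a little extra memory for far fewer list requests,
# which helps both speed and the API-calls-per-100s quota.
rclone sync drive:examples D:\examples --fast-list --transfers=2 --checkers=4 -vv
```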

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.