GDrive 403s without hitting limit

Hi,

Following some 403s a couple of days ago, I disabled my cron jobs and waited for the quota reset. I was able to upload ~8-10GB until I started receiving 403s again.

2019/05/17 13:57:53 DEBUG : pacer: Rate limited, sleeping for 1.448741745s (1 consecutive low level retries)
2019/05/17 13:57:53 DEBUG : pacer: low level retry 1/1 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019/05/17 13:57:53 DEBUG : test: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10
2019/05/17 13:57:53 DEBUG : pacer: Resetting sleep to minimum 10ms on success

The Drive web UI is also unresponsive for new files. The API console says I've used 6k out of 1B requests, and the peak requests per 100s today is 561 out of 1,000.

Support is asking for exact request and response data, and I can also use their OAuth playground to replicate the issue. Is there any way to get this out of rclone? (I've tried -vv, which got me the above.)

I thought this was related to the recent 403/429 issue, but the user-agent workaround doesn't help.

403s usually come from:

  • Not using your own client ID / API key (see the config sketch after this list)

https://rclone.org/drive/#making-your-own-client-id

  • Uploading more than the limits in a 24-hour period. Last I've heard/noticed (since it isn't documented anywhere), that's 750GB uploaded per day and 100GB of server-side copies.
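For reference, a minimal drive remote using your own ID looks roughly like this in rclone.conf (just a sketch - the remote name and all the values here are placeholders):

[gdrive]
type = drive
client_id = 123456789-example.apps.googleusercontent.com
client_secret = your-client-secret
scope = drive
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}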

What's the actual command you're running, which version, and what is the full -vv output?

You can use --dump bodies to get at it.
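Something along these lines should capture one complete failing request/response that you can hand to support (IIRC auth headers are redacted unless you also dump auth; the log file name is just an example):

rclone copy test gdrive: -vv --dump bodies --retries 1 --low-level-retries 1 --log-file rclone-dump.log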

But Animosity is correct.

I've been using my own client ID and token for at least 2 years without issue. I also made a new one today, just in case (no change).

I did upload 7.3-10GB in the morning, but it quickly hit 403 errors.

Actual run:
pmow@dockerhost:/mnt/media$ rclone -vvvv copy test gdrive:
2019/05/17 14:30:05 DEBUG : rclone: Version "v1.45" starting with parameters ["rclone" "-vvvv" "copy" "test" "gdrive:"]
2019/05/17 14:30:05 DEBUG : Using config file from "/home/pmow/.config/rclone/rclone.conf"
2019/05/17 14:30:05 DEBUG : test: Couldn't find file - need to transfer
2019/05/17 14:30:05 DEBUG : pacer: Rate limited, sleeping for 1.924465159s (1 consecutive low level retries)
2019/05/17 14:30:05 DEBUG : pacer: low level retry 1/1 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019/05/17 14:30:05 DEBUG : test: Received error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded - low level retry 1/10

This is v1.45; test is a 194-byte plaintext file with some pasted text.

In anticipation of "upgrade to the current version", I also just tried 1.47 - same thing.

Do you see the traffic for the new client ID in the developer console? And 1.45 is really old - stick with 1.47.

Also, do you have multiple rclones syncing and exceeding the per-second limits?
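If you might be, you can cap rclone below the default 1,000-requests-per-100-seconds per-user quota with something like this (the numbers are just a conservative guess - tune them to your project's limits):

rclone copy test gdrive: -vv --tpslimit 10 --tpslimit-burst 10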

I do see both IDs in the charts, yeah. I checked, and I'm only running rclone from one box, and I had turned that off - the only transfers would be my morning test. You can see in the charts that there's nothing until 7am or so.

(screenshot: gdrive1)

What's the full log of that? I'm missing whether it keeps retrying or finally copies.

Queries per 100 seconds is a good one to look at as well - that gives a good idea.

Forgot to include the credentials chart. The first block is when it was reset (3am Eastern for me), the second when I started manually transferring this morning. Then you see the pink 403s after a little bit. I think the 200s from yesterday were listings by me on the official web UI, because it certainly didn't work yesterday either.

It keeps retrying and resets the counter from 10 back to 1. Without debug on, I can tell you it never finishes. I run my crons at 2 or 3 in the morning; by the time I notice, it's been at least 5 hours. It's not like I'm transferring much data - maybe 20GB or so, as not much changes.

Is it shared with anyone else, or is anything else possibly hitting your 750GB limit? If it never finishes, it does seem quota-related rather than anything else.

The only sharing I've done is a handful of files via the official web app. This is my real, personal account and I haven't shared the credentials (it'd show up in the API charts anyway).

Are you within your storage limits for your account?

43T of unlimited

The only other thing you can check to try to figure out what's going on is the user's audit log.

It's in the G Suite Admin console under Reports -> Drive; that might give you a clue, as you can filter for uploads.

You can try to get Google to check your quota, although I've not been too successful with that myself. You can always wait ~24 hours and see if it subsides, as the quotas normally reset after that period.

Yeah, I noticed this issue 2 days ago and waited 24h for it to go away. I feel like I'm taking crazy pills.

Reports don't show much; the highest peak is 163k, for scale. The audit log says two (2) items were created today.

The number of files is a bit tough to go on, as they could be a bunch of big files. I wasn't sure if there was a way in the admin tool to see the size of files uploaded.

Do you think rclone should be distinguishing between 403 and 429 errors? At the moment it uses the body of the error, which is a JSON blob - these are the same for both errors. I only recently noticed that there were 403 errors too. Perhaps the 403 errors should be treated as fatal and stop the sync?

Google's errors seem a little bit mysterious, at least to me...
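For reference, the JSON blob in the body is roughly this shape (from memory, so the exact wording may differ) - the reason string is about all there is to key off:

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User rate limit exceeded."
      }
    ],
    "code": 403,
    "message": "User rate limit exceeded."
  }
}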

I am not sure, but I think there might be some different 403 errors.

When using rclone copy with -vv, if I don't set a tps limit I see these:

error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=redacted, userRateLimitExceeded

But in my case these do not seem to stop the transfer.

As I had not used the -vv flag in a while, these may have been happening for some time.

I am using my own API keys, and the quotas appear well under the limits when I look in console.developers.google.com.

I 'think' mine are related to the directory listings.
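If it is the listings, it might be worth trying --fast-list, which trades some memory for fewer list calls on drive (just a guess on my part that it would help here; the source path is a placeholder):

rclone copy /path/to/src gdrive: -vv --fast-list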

Sadly, that wouldn't be a good thing, as a 403 could just mean you're spamming the API too much, and it backs off and continues. In that case, it retries with exponential backoff and works like a charm.

The 403 when it hits the quota, though, is a bit more fatal, as you can't transfer any more (within reason - you can move files until you hit the limit, but maybe not big ones).

Since they use the same error text for both, I'm not sure how to know which is which.