Sending data TO Gdrive: 403: Rate Limit Exceeded

For the past few days I’ve been getting the error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded when sending data from my server to Google Drive.

Normally this only crops up when sending data FROM Google Drive, which is really strange!

I am able to upload files via the web UI to the Gdrive account with no problem, and I can also sync from it to a different Gdrive account - so sending data out of this account is fine.

It seems I may have been temporarily banned from uploading files to the account via rclone - possibly due to too many API requests?

I am using v1.36 with these flags: -c --no-traverse --transfers=10 --checkers=300 --stats=5s -v
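
For reference, the full move command looks roughly like this (the remote name and paths are placeholders, not my real ones):

```
rclone move /local/upload gdrive:backup \
  -c --no-traverse --transfers=10 --checkers=300 --stats=5s -v
```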

I heard that another user was having the same issue yesterday - has anyone else run into this problem?

I’ll be upgrading to v1.37 and testing out --tpslimit tomorrow to see if I still run into the same issues.
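
Roughly what I plan to run on v1.37 - same flags, just with --tpslimit added (the value of 10 is only a first guess, not a recommendation):

```
rclone move /local/upload gdrive:backup \
  -c --no-traverse --transfers=10 --checkers=300 --stats=5s -v --tpslimit 10
```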

Here’s a debug log for the curious. :slight_smile:

https://pastebin.com/XsCyy9CX


Some 403 errors are normal - rclone will just wait a bit and retry what it was doing.

Is it causing errors in the sync?

There is no data being transferred at all when I run the move command to send data from my server to my Gdrive - even after leaving it for hours.

I can sync from the Gdrive account to another one with no problem, so the issue is only with sending data to this Gdrive account.

Oh, I see. This must be something up with drive, maybe an upload ban like you said.

I am experiencing similar issues with sending files to gDrive with v1.36.

What is interesting is that this appears to be happening only on specific sync jobs. I run 3 sets of nightly backup scripts; 2 of the 3 work flawlessly, while the 3rd spits out the same error as yours.

One possible reason is that, in my case, the sync (#3) that is failing has thousands of nested subdirectories with lots of content in each. If it is doing a read of each directory first, it could be maxing out the API connection limit and preventing uploads.

I think this might be the reason, although I’m not sure why a sync from Gdrive A to Gdrive B does not have this problem.

Does sync have parameters to avoid API bans from directory scanning, @ncw, whereas move or copy do not?

I get banned when doing a move from server > Gdrive, yet no bans when syncing Gdrive A to Gdrive B.

@rooker156 what is your command that gets you banned?

Tried deleting the empty dirs in my upload folder - got banned after ~60GB upload, compared to ~600GB the day before.

Sync, copy and move are the same code, with a few different flags, so no.


After doing some testing today with multiple users who are running into this problem, it appears that Error 403: Rate Limit Exceeded, rateLimitExceeded starts happening after 600-900GB of content has been uploaded from a server.

It’s not related to the API quota, as my limit has been increased to 20,000 requests per 100 seconds and the most I’ve reached is 750.

Looking back through my upload logs, it first started on the 4th of this month and has been happening daily since.

I am also getting the same issue when doing Google Drive > Google Drive backups: after 300-900GB of transfers, 403 errors. (Previously this would only be limited by the 10TB/day download cap.)

I’ve been in contact with G Suite support a couple of times and they claim to know nothing about any limitations (including the 10TB/day download limit), and they have pointed me towards https://developers.google.com/drive/v3/web/support

A quick way for someone to verify these limits is simply to try to upload 1TB of data to Google Drive or to transfer data from one drive to another.

@exceeded I think it is out of our hands, as Google appear to have enforced limits. They do not publish these limits and there is no way of seeing how close or far you are from hitting them (other than the API quotas).

You can type vnstat -d to view data used per day (daily mode), vnstat -m for monthly, or vnstat -h for hourly. This is all server traffic and not just Google Drive/rclone.
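
A minimal sketch of the vnstat invocations, assuming vnstat is installed and has been collecting stats on the server (the interface name below is just an example):

```
vnstat -d           # per-day totals
vnstat -m           # per-month totals
vnstat -h           # per-hour totals
vnstat -i eth0 -d   # limit the output to one interface, e.g. eth0
```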

The quota reset can happen after 3-6 hours, or sometimes 12+. I’m not really sure how long it takes.

Hi Qwatuz,

Thanks for your feedback !

Does it mean we are blocked for only one OAuth client ID, or are all the other client IDs affected by the same limitation (like all the accounts from the domain name)?

Is the limit per IP address or per client ID, or something more general?

Any idea about the storage space? Do you know if somebody has already reached more than 105TB, or something around 200-300TB, etc.?

It will be linked to your Drive account, the same way the 403 download limits are. You can still upload through the web UI, but the speed is very, very slow and not worth it.

Does that mean that if I try to upload with another OAuth client ID from another platform, it won’t work?

It will not, but feel free to try it.

Yo. I’ll test this tonight. I’ve got about 1TB of data I need to upload and move. It’ll take me less than 8 hours to upload it. I <3 1Gbit upload speed.

I too have been experiencing these issues. I push about 2-3TB to Google Drive per day and noticed this morning that just a single file transfer was taking over an hour to copy. I have 440TB on Google Drive right now.

Cool, @mundus - that means there is some margin.

Since this morning it works again, but the speed is around 50MB/s where before it was 100MB/s, so I guess yes, something has been changed…

We will see how many TB I can upload before I get the 403 error(s) again.

Please let me know if you get any more news about it.

For Qwatuz: You’re right, it’s linked to the domain used, not only one OAuth client ID :frowning:
But it’s strange, I thought the requests-per-100-seconds quota applied to only one OAuth client ID…

Is it linked to one user only? For example, if you have more than one user…

New stats for today:

GCE Gdrive > Gdrive transfer got banned after 951GB - the average was around 300MB/s for a large part of it.

Server > Gdrive transfer got banned after 795GB - speeds were 50-70MB/s on average.

I seem to have been banned after 85GB uploaded. I got a single encrypted ISO up, and then I got a 403 error.

Same issues here. Banned after 800GB with --tpslimit 8, around 60MB/s on average. My API calls are almost insignificant. It seems that Google are making changes, but I don’t know where yet.

I have the same issue.

Yesterday I tried to upload a large backup of 6TB to Gdrive. It stopped on 2 accounts at ~2TB with error 403.

Today I tried to resume the backup; it stopped again on both accounts at ~750GB.
I limited the bandwidth on one account to 20M and on the other I pushed at the full 3Gbit.
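
Roughly what the two jobs looked like (remote names and paths are placeholders, not my real setup):

```
# Account 1: capped at 20 MByte/s with --bwlimit
rclone move /backup remote1:backup --bwlimit 20M -v

# Account 2: no bandwidth limit, pushed at the full 3Gbit line speed
rclone move /backup remote2:backup -v
```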