Sending data TO Gdrive: 403: Rate Limit Exceeded

You didn’t get banned; if you look at your timer, it’s past the 24 hours each time. After each 24 hours (Google time) your limit gets reset, and as long as you don’t transfer more than 750GB it will keep going.

Yup… it’s all about finding the sweet spot, I guess… lol… but man, the transfer is going to take forever now.

Yes, and that’s our new bottleneck. Before, the bottleneck was not passing 10TB per day; now it’s this one, and there is no way around it.

This sucks a lot for those who want to make backups of their data on multiple Gdrives, like me… I just got one of my eBay Gdrives banned, so I’m trying to move my content to make sure I have backups. Unlike for a lot of people, internet where I live is very expensive, so all I have is a 30/10 connection. So if I lose all my current content that is in Gdrive, at 10Mbit/s it will take a very long time to put it all back…
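As a back-of-the-envelope check on "a very long time", here is a small calculation of re-upload time over a 10 Mbit/s upstream; the 10 TB library size is a hypothetical figure, not one stated in the thread:

```python
# Rough estimate of how long re-uploading a Drive library takes
# over a fixed upstream rate (decimal units: 1 TB = 1e12 bytes).
def upload_days(size_tb: float, mbit_per_s: float) -> float:
    size_bits = size_tb * 1e12 * 8           # TB -> bits
    seconds = size_bits / (mbit_per_s * 1e6) # bits / (bit/s)
    return seconds / 86400                   # seconds -> days

print(round(upload_days(10, 10), 1))  # ~92.6 days for 10 TB at 10 Mbit/s
```

So even ignoring the 750GB/day quota, a full restore of a multi-TB library over a 10 Mbit/s line runs to months.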

I wish there were other alternatives to Gdrive and ACD.

Try Google Cloud Compute; it won’t use your internet connection, and it’s free for 1 year :slight_smile:

Already have that. It’s good for moving from Google Drive to Google Drive, but it won’t help with moving from my local storage up to Google Drive.

Ok, guys. I just had a call with support and was told the following:

Support is aware that we’re having problems with hitting the limit; they’ve been contacted and had the same problem reported several times.

They’re quite sure the limit has something to do with the requests sent by RClone. He told me the backend has recently changed somehow, and that the limit is enforced because all RClone users issue the same requests.

I told him that I had also tried my own client ID and it made no difference. He then said he is quite sure it has something to do with the software I’m using, and suggested I try using my own client ID together with my own software with exponential backoff implemented (at the moment I don’t have time for that). If I still get the error with my own application, I should contact him again, and he will forward everything to the Drive team so they can have a look at it.

The last thing he suggested is to motivate the RClone developer(s) to get in contact with Google support to see what’s wrong with the RClone software together with Google Drive.
As far as I can see, this is our best chance for fast results.

@ncw, it would be very kind if you could take some time to contact Google support and at least give it a try.

Does a transfer between two Google Drives on GCC have the 750GB/day quota limit?

We could try to use Google’s File Stream, but I don’t think it matters. I’m in the beta, so I will give it a go tonight.

It would also be nice if @ncw would contact Google to establish that rclone is not the issue.

It’s not an Rclone problem with requests; I hit the rate limit with Synology CloudSync and MultCloud, too. I didn’t get this before, so they’ve configured a new limit in the API.

Getting this too. I have uploaded several TBs without any problems or limitations over the last few weeks (at max speed) and was trying to upload a 1TB folder last night and got this after around 750GB. Following…

I know this, but it seems like Google support doesn’t want to hear it.
If we want to convince them, we either need to create our own software (with exponential backoff, which should then also get the error), or someone with deep knowledge of RClone (probably just @ncw) has to discuss it with them.

RClone already has exponential backoff implemented. And as others mentioned, other tools also exhibit this problem.
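For reference, the exponential backoff Google keeps recommending is just "retry on rate-limit errors with doubling delays plus random jitter". Here is a minimal sketch of the technique in Python; it is not rclone's actual implementation, and `RateLimitError` and the `call` argument are placeholders, not real Drive API names:

```python
import random
import time

class RateLimitError(Exception):
    """Placeholder for a 403 userRateLimitExceeded response."""

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` with exponential backoff plus proportional jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error
            # delays of ~1s, ~2s, ~4s, ... with random jitter
            time.sleep(base_delay * (2 ** attempt + random.random()))
```

Note that backoff only helps with transient per-second rate limits; against a hard daily byte quota, every retry after 750GB will fail until the quota resets.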

I can confirm it’s not just Rclone; Insync appears to hit the limit as well.

I agree that it is not rclone. But if we can reproduce the ban with Google's own Drive File Stream, then they can't blame the developer.

I also use Mountain Duck; however, I don’t see any issues with it after the limit has been reached with rclone.

Another thing I noticed: when my upload ban is lifted, rclone dedupe is still blocked. Perhaps each API call is rate-limited separately?

Can you upload files with Mountain Duck after rclone is blocked? That is strange. When my account is blocked, it hits everything: the iOS Gdrive app, web upload, ExpanDrive, Google Drive File Stream, rclone, etc.

I can browse and download, but every attempt to upload files is denied.

I got it confirmed today.

Hi XXX,

Thank you for taking my call.

As mentioned on our call, I can confirm that new separate per-user limits are being introduced. Unfortunately, Google does not publicize all the Drive limits, to prevent abuse of the API and as they are subject to change. At this time, I am aware of new limits being implemented which are causing the error you described and do not reflect in your project’s statistics.

As mentioned, I am unable to confirm exact numbers as this is subject to change, however at this time it seems to be 750 GB/Day. This quota is on a per user basis.

From looking into this further, it would seem that blob type files in particular trigger this error.

When this limit is reached, it should be reset after 24 hours.

In order to lighten the load of your application, I would advise that you implement the recommended best practices: https://developers.google.com/drive/v3/web/practices In addition to this, I would also suggest following the suggested actions highlighted for ‘403: User Rate Limit Exceeded’ in the following article: https://developers.google.com/drive/v3/web/handle-errors#403_user_rate_limit_exceeded
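The handle-errors article linked above distinguishes several 403 reasons in the JSON error body, so it is worth checking which reason you actually got before deciding to back off. A sketch of that check; the payload below is illustrative, in the shape the Drive v3 API documents, not a captured response from this incident:

```python
import json

# Illustrative 403 body in the documented Drive API error shape.
body = json.loads("""{
  "error": {
    "errors": [{"domain": "usageLimits",
                "reason": "userRateLimitExceeded",
                "message": "User Rate Limit Exceeded"}],
    "code": 403,
    "message": "User Rate Limit Exceeded"
  }
}""")

def rate_limit_reason(error_body):
    """Return the first usageLimits reason in a Drive API error, if any."""
    for err in error_body.get("error", {}).get("errors", []):
        if err.get("domain") == "usageLimits":
            return err.get("reason")
    return None

print(rate_limit_reason(body))  # userRateLimitExceeded
```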

Should you have any questions, please do not hesitate to let me know.

Sincerely,

xxxx
Google Cloud Support


Thanks for sharing. I can’t understand why they say “max file upload size is 5TB” when uploads are limited to 750GB per day :roll_eyes:

It was rather easy to figure out once I ran a test. I made 5 really large random files (1TB each) and started uploading them all at the same time. The transfer is currently at 2TB (out of 5) and still going.
It appears that hitting the threshold won’t break active sessions; they are left to finish.
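Given that observed behaviour (in-flight uploads finish, new ones are blocked), one client-side workaround is to budget each day's uploads and stop short of the quota. A sketch of such a budgeter; the 750GB figure comes from the support reply above, everything else here is hypothetical:

```python
GB = 10 ** 9
DAILY_QUOTA = 750 * GB  # per-user daily limit quoted by Google support

class UploadBudget:
    """Track bytes queued in a 24h window and refuse to start a file
    that would push past the quota; already-started files still finish."""
    def __init__(self, quota=DAILY_QUOTA):
        self.quota = quota
        self.used = 0

    def try_start(self, size):
        if self.used + size > self.quota:
            return False  # defer this file to the next 24h window
        self.used += size
        return True

budget = UploadBudget()
print(budget.try_start(500 * GB))  # True  - fits in today's window
print(budget.try_start(300 * GB))  # False - 800 GB would exceed the quota
print(budget.try_start(200 * GB))  # True  - 700 GB total, still under
```

A real version would reset `used` every 24 hours (Google time) and persist the counter across runs.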