Sending data TO Gdrive: 403: Rate Limit Exceeded

How is the bandwidth limiting going? I could live with that.

My guess is there is no point in bandwidth limiting - the limits are per day, so it doesn't matter whether you go at 30 MB/s or 200 MB/s; once you hit the cap, that's it until reset time.

750 GB/day ≈ 8.7 MB/s

If "voluntarily gentle" comes across as preferable to "full forced stop", one idea is to stay below this cap at all times.
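To double-check that figure (assuming decimal units, i.e. 1 GB = 1000 MB):

```
# 750 GB/day spread evenly over 86400 seconds
echo "scale=2; 750 * 1000 / 86400" | bc   # prints 8.68, i.e. ~8.7 MB/s
```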

I have begun to simply do all my Google Drive batch work from behind a 100 Mbit/s interface; that has kept me safe so far.

Otherwise the rclone built-in bandwidth limiting might be helpful, e.g. --bwlimit "06:00,8M 00:00,8M".
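For example (a sketch only; the remote name and path are made up, and 8M is simply a value chosen to stay under the ~8.7 MB/s figure above, not anything Google or rclone recommends):

```
# Cap transfer bandwidth at roughly 8 MByte/s at all times, which keeps a
# full day's upload under the reported 750 GB/day limit.
rclone copy /local/media gdrive:backup --bwlimit "06:00,8M 00:00,8M"
```

Since both timetable entries are 8M this is effectively a flat limit, so a plain --bwlimit 8M would behave the same.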

After the API reset today I still can't upload; download works fine.

I'm having the same issues… I uploaded around 2.3 TB before being rate limited…

And I got download-banned after downloading just ~2.5 TB, so I think they have new download limits too.

I got cut off uploading at 750 GB to a Google Apps for Education account. Looks like a 24-hour ban, because it is working now.

I'm also getting it from an Arq backup on a regular paid Google Drive, and I have been having all kinds of problems uploading to a team drive via rclone.

The official Google Sync client is also running at a crawl on a different Google Apps for Education account, and the web interface just timed out when trying to access a file and when uploading to the team drive.

Whatever they are doing seems to be fairly indiscriminate.

I am also getting banned with "User rate limit exceeded" (userRateLimitExceeded). Upload is then blocked, but browsing and downloading are still permitted.

It started yesterday.

Lars.

I have not uploaded a single byte in weeks, yet I keep getting "… IO error: bad response: 403: 403 Forbidden"

I suspect this was Plex/Sonarr/Radarr's fault, as I previously had no folder cache layer between me and Google Drive (I'm now using plexdrive), but I have been getting this error for the past week. It was only last night, roughly 14-15 hours ago, that I found a post about plexdrive.

@kites Yes, this is an API ban because you are not limiting API requests. Plexdrive fixes this issue; use it with FUSE. Join us on Slack:

https://plexdrive.slack.com/shared_invite/MTg1NTg5NzY2Njc4LTE0OTUwNDU3NzAtMjJjNWRiMTAxMg

@Qwatuz do you have an active invite to that? Says it’s no longer active for me.

There's a link to it on plexdrive's GitHub page https://github.com/dweidenfeld/plexdrive that should work.

I am still experiencing these issues across two accounts. Neither lets me upload more than 1 TB/day.

The limit seems to be 1 TB per day with a 12-hour ban. I began to upload right after the ban lifted and got blocked again after 475 GB.

Had a chat with a Google support agent.

First they said that there is a global limit for Gmail and Google Drive and sent me this link: https://support.google.com/a/answer/1071518?hl=en

After some time they said that there is an issue with Google Drive: "When was this issue started. Because we actually have Google Drive Internal Server error".

Later they said: "The outage actually started since yesterday."

So it seems there is an issue, but my request got redirected to the Google Drive department and now I'm waiting for a final answer.

Hi,

Maybe Google didn't size their infrastructure correctly (design, services and offers), and now they don't know how to answer the demand (around the world) without more investment… simply put.

I wonder if it's a ratio/quota by country/region 🙂

750 MB per hour?

That's less than one file!

No issues for Google Drive it seems?

https://www.google.com/appsstatus

A mixup with the Google Cloud Storage service perhaps?

https://status.cloud.google.com/

Google Cloud Storage Incident #17003
GCS triggers not fired when objects are updated
Incident began at 2017-08-08 11:44 and ended at 2017-08-08 13:48 (all times are US/Pacific).

Had a second chat with one of the support agents from Google.

"XYZ 11:12 AM
Thanks for holding. Here are the limits, along with supported file types https://support.google.com/drive/answer/37603
XYZ 11:13 AM
This seems to be an issue when using rclone
XYZ 11:13 AM
Reviewing this article https://developers.google.com/drive/v3/web/handle-errors#403_rate_limit_exceeded. You can batch the requests https://developers.google.com/drive/v3/web/batch or implement exponential backoff
XYZ 11:14 AM
In June 2017 a quota for creating blobs of 100 GB/day was established. It's possible to create files of bigger size, but after this quota is exceeded all subsequent blob creation operations will fail
XYZ 11:15 AM
At this moment, I highly recommend using the Drive sync tool if it is possible, to avoid this issue. Here is the link for the tool: https://support.google.com/a/answer/2490101?hl=en
XYZ 11:16 AM
Rclone has information regarding Drive limitations https://rclone.org/drive/. There also seems to be information regarding this on GitHub https://github.com/ncw/rclone/issues/76"

I asked him if there is a limit of XXX GB of uploads per 24 hours. This is the answer: "I have double checked and there are no limitations regarding that, as per this article: https://support.google.com/drive/answer/37603?visit_id=1-636381266873867728-2759172849&rd=1"

@ncw what do you think?
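For what it's worth, the "exponential backoff" the support agent points at just means retrying a failed request after increasingly long waits. rclone already retries rate-limit errors itself, but a rough outside-the-tool sketch (the remote name, path, delays and retry count are all made up) would look like this:

```
# Illustration only: retry an upload with exponentially growing waits.
delay=30
for attempt in 1 2 3 4 5; do
    rclone copy /local/media gdrive:backup --bwlimit 8M && break
    echo "attempt $attempt failed, waiting ${delay}s before trying again"
    sleep "$delay"
    delay=$((delay * 2))   # double the wait after every failure
done
```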

I've also been chatting with support a few times over the past couple of days; the support staff are either in the dark about the limitations or are instructed not to share that information. They could not even confirm the 10 TB/day download limit (which has been in place for months).

Yep, same here… It seems that they aren't allowed to share anything about this. Multiple chats, and every time I got different "info"… I don't see any issue on the rclone side.

But I’m wondering why only a few users see this issue or talk about it.
