Sending data TO Gdrive: 403: Rate Limit Exceeded

Yes @calisro - 9.4 MB/s sustained for 24 hours will hit the cap. You need to use --bwlimit.
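As a rough sanity check (back-of-the-envelope, not an official figure from Google), 750 GB spread over a day works out to just under 9 MB/s, which is why a sustained 9.4 MB/s trips the cap and why 8M is the usual bwlimit suggestion:

```shell
# Google Drive's 750 GB/day upload quota expressed as a sustained rate.
# 750 * 1024 MB / 86400 s ≈ 8.88 MB/s; integer math floors this to 8,
# so --bwlimit=8M keeps a round-the-clock upload just under the quota.
GB_PER_DAY=750
SECONDS_PER_DAY=86400
MBPS=$(( GB_PER_DAY * 1024 / SECONDS_PER_DAY ))
echo "$MBPS"
```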

oppiz2’s reply from Google indicates a separate, more hidden, limit:

There may be a special quota for write operations, so it is worth a try.

The limit is something like 2 file creations/modifications/deletions per second.
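If the cap really is on file operations per second rather than on bytes, rclone's --tpslimit flag (which throttles API transactions per second) might be worth testing; the remote names and paths below are placeholders, and this is a sketch rather than a confirmed workaround:

```shell
# Placeholder remotes; throttle rclone to ~2 API transactions per second
# to stay under a suspected 2-ops/sec create/modify/delete quota:
rclone copy --tpslimit=2 --transfers=2 source:Path/to/Source dest:Path/to/dest
```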

Further development from Google:

The support team is unaware of any limitations. They are waiting for an answer from the Drive Team and the Drive API Team, which should arrive in 1-2 days; they'll call me when it comes.

The support representative is also considering that this is either a bug (:face_with_raised_eyebrow:) or large-scale maintenance, since they have no documentation on it.

Furthermore, he mentioned that the limitation is user-based, not credentials-based. I tested this and found it to be true: I got rate-limited, switched to fresh OAuth credentials, and was still rate-limited.


Lol, let’s hope for the best… even if I can’t imagine it’s a bug.

I’ve also tested and confirmed that this is an account-based limit, not an OAuth-based one. In my experience the ban is lifted within 24 hours.

A bug would be nice, but I don’t see it.

I think even if Google really does limit it to 750 GB per day, that’s still very good! 99.9% of the users here with such high data consumption will only hoard porn or warez content.

I have closed my support case at Google. They wanted me to document a lot of things regarding development. I told them that I am only an end user, but that lots of people are experiencing the same issues.

"I understand you are not the developer of the third-party tools but just a user, and as such, you cannot provide the information requested to consult with our product engineers. Very often, third-party tools fail to optimize their code to prevent these limitations, and that is why we do not provide support for third-party tools.

Our internal limitations documented are the ones I shared with you yesterday. To dig deeper into the problem, it is strongly recommended that the developer of the tool address the issue on Stack Overflow or get in contact with us.

Since there is nothing else we can do about it, I will proceed to close the case for administrative purposes. Please note that if you were to require more help with your G Suite account within the next 30 days, feel free to reply to this message and I will follow up with you."

Way to make assumptions. Normally I don’t hit 750 GB, but I am right now because I’m moving a section of my stuff to a new Gdrive account. So what should be a 2-3 day process will now be 2-3 weeks.

Well, for me, when I tried to move my entire 20 TB to a new Gdrive for backup with transfers=15 and bwlimit=8M, I hit the limit after 2 or 3 hours.

Transferred: 974.571 GBytes (7.989 MBytes/s)
Errors: 2
Checks: 0
Transferred: 1685
Elapsed time: 34h42m2s

I decided to do one sub-dir at a time, starting with the biggest one. I changed transfers to 2 and left bwlimit at 8M.

So far no ban… but this transfer will take forever…
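That matches a quick back-of-the-envelope check (my arithmetic, not an official number): at a sustained 8 MB/s, a full day of uploading stays under the 750 GB quota, so the ban never triggers:

```shell
# Total GB uploaded in 24 hours at a sustained 8 MB/s:
# 8 MB/s * 86400 s / 1024 = 675 GB/day, under the 750 GB/day cap.
GB_UPLOADED=$(( 8 * 86400 / 1024 ))
echo "$GB_UPLOADED"
```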


Uploaded at about 40-50 MB/s and hit the limit at ~760 GB. Rclone committed 3 GB of memory, and I saw a steady API rate of 0.1/sec. This is clearly a bandwidth limit. I am not surprised, but it was worth a try.


I have found no way to make this work for me so far. It appears the transfer speed is simply limited up front, so that the 750 GB/day threshold cannot be crossed.

Maybe someone else had better luck than me?

Not judging your usage, but I feel this is the reason Google did this anyway. Google provides unlimited storage, and with that "unlimited", they need to make sure that someone doesn’t come along and upload/clone petabytes of data all at once. If they limit the total per day, they can easily monitor and scale based on a maximum growth rate, and limit the use of these virtual Google compute platforms for cloning massive amounts of data in short periods of time. I would be in the same boat as you if/when I have to move things around…

Still no ban

Transferred: 1368.782 GBytes (7.986 MBytes/s)
Errors: 3
Checks: 0
Transferred: 2604
Elapsed time: 48h45m2s


What exactly are you doing? I could try it now.

rclone copy --fast-list -v --size-only --transfers=2 --bwlimit=8M --min-size=100M source:Path/to/Source dest:Path/to/dest

I think that is the command I used… It will take a long time, since I think that folder is 17 TB.

2017/08/17 16:27:52 INFO :
Transferred: 1415.118 GBytes (7.986 MBytes/s)
Errors: 3
Checks: 0
Transferred: 2753
Elapsed time: 50h24m2s

In my case the total data storage isn’t changing. I’m just pulling it off one gdrive account and putting it in another one. So total data stored is staying the same.

I’m transferring from one Gdrive to another.
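For moving data between two Drive accounts, both can be configured as rclone remotes (the remote names and paths below are hypothetical). Note that the copy still flows through the local machine (download from one, upload to the other), so the destination account's daily upload quota applies just the same:

```shell
# Hypothetical remote names; rclone downloads from the old account and
# re-uploads to the new one, so --bwlimit still matters for the 750 GB/day cap:
rclone copy --transfers=2 --bwlimit=8M olddrive:Path/to/data newdrive:Path/to/data
```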