Sending data TO Gdrive: 403: Rate Limit Exceeded

Not everyone uploads >700GB/day, and a lot of people just read and don't contribute - this topic has 1.4k views!

It would be great if people shared any information regarding the bans so we can try to narrow down the reasons, e.g. is it purely data-related? How long does the ban last? etc.

Come on guys, share your info…

Got my 5-user G Suite account banned after ~700-800 GB in 24 hours.

I have had the same issue, seemingly blocked by 403: User rate limit exceeded, userRateLimitExceeded for ~24hrs after 700-900GB of upload.

450-500ish here. Haven't been able to upload anything else, even with my own API credentials.

I was wrong. The ban is not 12h - it seems to be reset at a fixed time, around midnight local time.

When I can upload again I will try to use only one checker, one upload at a time, and limit the tps rate to 1/sec.
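For reference, a command along those lines might look like this (the remote name and path are just placeholders, not my actual setup):

    $ rclone copy /local/data gdrive:data --checkers=1 --transfers=1 --tpslimit 1 -v

Note that --tpslimit caps API transactions per second, not bandwidth, so this mainly spaces out the requests rather than slowing the upload itself.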

It seems very inconsistent. Sometimes I get to upload 700GB; other times I get banned after 450GB.

Has anyone tried limiting bandwidth to see if that changes the amount of data you can upload before getting banned?
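For anyone who wants to try, rclone's --bwlimit flag caps the transfer rate; a hypothetical run at 8M (remote and path are placeholders) would look like:

    $ rclone copy /local/data gdrive:data --bwlimit 8M -v

8 MBytes/s works out to just under 700GB per day, so in theory it should stay below a 750GB/day cap.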

Same problem for me. After ~600-700 GB over 20 connections I got the error. I haven't been able to upload for 4 hours now, but only from the system where I got banned. With Rclone Browser on my other (local) machine it's working. I don't understand why.

It's not only the IP. I changed it for the instance but I'm still banned.

Normal single-account user here. Same issues for me.

Has anyone tried uploading a larger amount of data through "Backup & Sync" or via the web interface? It would be interesting to know whether they also ban you for that.

My internet is not fast enough at the moment to try it out.

I tried it via the web uploader and I couldn't upload at all.


After or before being banned?

The web UI has no data limits, but they throttle it like crazy, so it's not worth using.

After. I'm still banned and it's been close to 20 hours.

750-800GB and banned for 24h. Tested with --tpslimit 8 (~60MB/s) and without it (~340MB/s). Same results.

That's a lot of data to be sending in a day; the max I do is about 100GB. At 1TB+ a day, are you backing up a data center? Geez. I've not had a problem.

A shy “me too” from here.

Just for the stats.
Rclone/Linux/command line isn't my world and most of the time I'm not really sure what I'm doing - so I'd better keep quiet and just read.
I'm using rclone with the Rclone Browser GUI, and there it simply gets stuck - no 403.

Due to my slow internet I won't normally run into this. I was reorganizing my data (encrypted on our paid G Suite and unencrypted on a Google education account from eBay) via a VPS when I noticed this issue.

It seems like Duplicati (with its own API) is still uploading normally - but I'm not sure. I can't see the speed or the size of the data that Duplicati wants to push to Google Drive.

I REALLY appreciate your investigation and your sharing of information, thoughts, and results.
Thanks for that to all of you!

Do you guys think Google enforced this limit just for rclone users?

What do real companies using G Suite upload their stuff with? Maybe Google found a way to limit just rclone, to get rid of all "non-company" customers?

It does not appear to be rclone-specific; I can hit exactly the same limits with GoodSync and other tools…

I just got API-banned again after uploading 500GB.

Transferred: 622.636 GBytes (6.056 MBytes/s)
Errors: 2
Checks: 135806
Transferred: 1841
Elapsed time: 29h14m35.7s

No Ban … (also no tpslimit set) (Transfer from Google unlimited to Amazon unlimited)
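For context, a drive-to-drive copy like that (remote names here are placeholders, not the actual config) would be something along the lines of:

    $ rclone copy gdrive: acd:backup --transfers=4 --checkers=8 -v 2>&1 | tee -a gdrive-to-acd.log

The data still passes through the machine running rclone on its way from Google to Amazon, so the elapsed time above mostly reflects that machine's bandwidth.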

Odd development, but it's working for me now with --tpslimit 6. Knock on wood.

If you have a little spare bandwidth available for test purposes, here is a little test you can do for some insights of your own. You will need to be able to upload faster than about 10MB/s to trigger the ban (750GB spread evenly over 24 hours works out to roughly 9MB/s).

  1. Create a new user for the test at https://admin.google.com/.
    Otherwise, make sure the Google Drive account you are going to use for the test has not been uploaded to for at least 24 hours.

  2. Create a new rclone config for the test, name it ban-size-test (blank client_id/client_secret).

    $ rclone config
    
  3. Transfer some bytes to the new drive - 15TB in 1GB sparse files should give us some testing headroom.
    The sparse files will occupy only ~100MB of local disk space (ext4).

    $ date >> ban-size-test.log
    $ mkdir -p ban-size-test
    $ for x in $(seq -w 1 15000); do dd if=/dev/zero of=ban-size-test/file-$x.GB bs=1k seek=1M count=1; done 2>&1 | tee -a ban-size-test.log
    $ rclone copy ban-size-test ban-size-test:ban-size-test --transfers=30 --stats 10s -vvv  2>&1 | tee -a ban-size-test.log
    

    Feel free to add e.g. --tpslimit 6 to the command line if you are testing with the newest rclone (v1.37). The end result will be the same.

  4. Wait for the 403 error to happen in the output.

  5. Kill the rclone process with ^C.

  6. Inspect the test result: the ban size.

    $ rclone lsl ban-size-test:ban-size-test -vvv 2>&1 | tee -a ban-size-test.log
    $ rclone size ban-size-test:ban-size-test -vvv 2>&1 | tee -a ban-size-test.log
    
  7. Restart the rclone copy process and observe how the ban is being lifted and re-applied over the next days.

    $ rclone copy ban-size-test ban-size-test:ban-size-test --transfers=30 --stats 10s -vvv  2>&1 | tee -a ban-size-test.log
    
  8. Redo 5) and 6) once your patience runs out.

  9. Inspect the ban-size-test.log log file. The timestamps of the transferred files should give you
    a good idea of how many GBs were transferred when.

    $ less ban-size-test.log
    
  10. Clean up.

    $ rclone delete ban-size-test:ban-size-test -vvv 2>&1 | tee -a ban-size-test.log
    $ rclone rmdirs ban-size-test:ban-size-test -vvv 2>&1 | tee -a ban-size-test.log
    $ rm -rf ban-size-test
    
  11. Delete the test user again (at https://admin.google.com/).

  12. Consider pasting your findings here.

My expectations are that you will find the same as I keep finding.

At exactly 750GB transferred, no new file uploads will be accepted. The error we are discussing here will now start showing in the output:

DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded, userRateLimitExceeded)

At this point, only the transfers that are already in progress will be allowed to finish. As I am using --transfers=30, the last file that will be transferred is file-00779.GB: the 30 transfers already in flight when the ban hits are allowed to complete, which is why the final count ends up above 750. All the remaining transfers will stall and be reported as 0% done, 0 Bytes/s, until the ban is lifted:

*                                  file-00780.GB:  0% done, 0 Bytes/s, ETA: -
*                                  file-00781.GB:  0% done, 0 Bytes/s, ETA: -
*                                  file-00782.GB:  0% done, 0 Bytes/s, ETA: -
...

$ rclone size ban-size-test:ban-size-test
Total objects: 779
Total size: 779.001 GBytes (836445678592 Bytes)

If you are willing to accept the ban now and then, maybe you can use this to your advantage. Simply get all those huge Linux ISOs in line and start as many simultaneous transfers as possible. E.g. if they are well above 50GB each and there are 200 concurrent transfers going when the ban hits, that should leave you able to upload another ~10TB - if that is what is left to do for the transfers already in progress :slight_smile: The maximum file size for Google Drive is 5TB, so for backup purposes this strategy could be elaborated quite a bit (rough sketch below).
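A rough sketch of that idea, assuming a hypothetical local directory big-isos/ full of 50GB+ files and reusing the ban-size-test remote name from above:

    $ rclone copy big-isos ban-size-test:big-isos --transfers=200 --checkers=8 --stats 10s -v 2>&1 | tee -a big-isos.log

Whether 200 concurrent uploads are actually practical depends on your RAM and upstream bandwidth, and each in-flight transfer only helps if it started before the 750GB mark was reached.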

So, back to the test conclusion. On the one hand you could be experiencing an exact 750GB ban, and on the other hand another user might tell you "no way that can be true! I just uploaded XX terabytes today - and I'm still not banned, my transfers are still running?!" And you would both be right :slight_smile:
