Totally agree. I didn’t run those tests myself, but I suspected as much already.
Hey thanks for the group post @steffenpoulsen! One thing I have noticed is once you hit the limit, any transfers that are still in progress get severely throttled - 200MB has an ETA of 1 day for example. Do you not have this issue?
I can confirm 750 GB. I checked usage in admin console and I could upload 752.8 GB before getting banned.
My ban is lifted at 00:30 GMT+1 (plus daylight saving)
I just started running into this problem yesterday :(
Where exactly in G Suite can you see the day’s data upload total?
This is not possible to my knowledge. But yes, it would be very helpful if the GUI and the API had a “remaining upload quota for today” counter available.
In the test I upload all files to a dedicated directory and then check the size of just that dir, using rclone size. So, if nothing else was uploaded to the drive in the meantime, that gives the day’s upload total.
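As a rough sketch of what I mean (the remote name gdrive: and the directory upload-test are just placeholders for your own setup):

```shell
# Upload the test files into a dedicated directory on the remote.
rclone copy /path/to/testfiles gdrive:upload-test

# Then sum up just that directory; if nothing else was written to the
# drive in the meantime, this is the day's upload total.
rclone size gdrive:upload-test
```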
You cannot see upload rates etc., but you can see usage in gigabytes in Google’s admin console. Check before and after the ban and you can see how much was transferred.
Thanks, you’re welcome! And you are right, what a good find! I can see that this is indeed the case for me as well; I just double-checked the logs.
Following the trace of one of the transfers after the ban kicks in, it looks like this - you are indeed right, it is severely throttled. So much for the just-keep-uploading-a-lot-of-5TB-files scenario, I guess
No ban yet:
2017/08/13 20:59:49 INFO : * file-00776.GB: 6% done, 3.435 MBytes/s, ETA: 4m39s
Ban just kicked in:
2017/08/13 21:00:09 INFO : * file-00776.GB: 10% done, 3.001 MBytes/s, ETA: 5m3s
2017/08/13 21:00:19 INFO : * file-00776.GB: 12% done, 2.316 MBytes/s, ETA: 6m26s
2017/08/13 21:00:39 INFO : * file-00776.GB: 16% done, 2.069 MBytes/s, ETA: 6m53s
2017/08/13 21:01:09 INFO : * file-00776.GB: 22% done, 1.923 MBytes/s, ETA: 6m51s
2017/08/13 21:01:39 INFO : * file-00776.GB: 28% done, 1.839 MBytes/s, ETA: 6m40s
2017/08/13 21:02:39 INFO : * file-00776.GB: 42% done, 1.642 MBytes/s, ETA: 6m0s
2017/08/13 21:04:29 INFO : * file-00776.GB: 64% done, 1.482 MBytes/s, ETA: 4m2s
2017/08/13 21:04:59 INFO : * file-00776.GB: 67% done, 1.322 MBytes/s, ETA: 4m8s
2017/08/13 21:05:59 INFO : * file-00776.GB: 79% done, 1.477 MBytes/s, ETA: 2m20s
2017/08/13 21:06:09 INFO : * file-00776.GB: 79% done, 776.140 kBytes/s, ETA: 4m34s
2017/08/13 21:06:29 INFO : * file-00776.GB: 79% done, 204.487 kBytes/s, ETA: 17m21s
2017/08/13 21:06:49 INFO : * file-00776.GB: 79% done, 53.875 kBytes/s, ETA: 1h5m53s
2017/08/13 21:07:09 INFO : * file-00776.GB: 79% done, 14.194 kBytes/s, ETA: 4h10m5s
2017/08/13 21:07:19 INFO : * file-00776.GB: 79% done, 7.285 kBytes/s, ETA: 8h7m13s
2017/08/13 21:07:39 INFO : * file-00776.GB: 79% done, 1.919 kBytes/s, ETA: 30h49m17s
2017/08/13 21:07:49 INFO : * file-00776.GB: 79% done, 1008 Bytes/s, ETA: 60h2m49s
2017/08/13 21:07:59 INFO : * file-00776.GB: 82% done, 1.660 MBytes/s, ETA: 1m46s
2017/08/13 21:08:10 INFO : * file-00776.GB: 85% done, 2.166 MBytes/s, ETA: 1m10s
2017/08/13 21:08:19 INFO : * file-00776.GB: 86% done, 1.940 MBytes/s, ETA: 1m10s
2017/08/13 21:08:29 INFO : * file-00776.GB: 89% done, 2.146 MBytes/s, ETA: 52s
2017/08/13 21:08:39 INFO : * file-00776.GB: 91% done, 2.336 MBytes/s, ETA: 37s
2017/08/13 21:08:49 INFO : * file-00776.GB: 92% done, 1.523 MBytes/s, ETA: 52s
2017/08/13 21:08:59 INFO : * file-00776.GB: 92% done, 800.397 kBytes/s, ETA: 1m42s
2017/08/13 21:09:09 INFO : * file-00776.GB: 92% done, 410.836 kBytes/s, ETA: 3m19s
2017/08/13 21:09:19 INFO : * file-00776.GB: 92% done, 210.878 kBytes/s, ETA: 6m28s
2017/08/13 21:09:29 INFO : * file-00776.GB: 92% done, 108.241 kBytes/s, ETA: 12m36s
2017/08/13 21:09:39 INFO : * file-00776.GB: 92% done, 55.559 kBytes/s, ETA: 24m34s
2017/08/13 21:09:49 INFO : * file-00776.GB: 92% done, 28.518 kBytes/s, ETA: 47m52s
2017/08/13 21:09:59 INFO : * file-00776.GB: 92% done, 14.638 kBytes/s, ETA: 1h33m16s
2017/08/13 21:10:09 INFO : * file-00776.GB: 92% done, 7.513 kBytes/s, ETA: 3h1m43s
2017/08/13 21:10:19 INFO : * file-00776.GB: 92% done, 3.856 kBytes/s, ETA: 5h54m1s
2017/08/13 21:10:29 INFO : * file-00776.GB: 92% done, 464.408 kBytes/s, ETA: 2m38s
2017/08/13 21:10:39 INFO : * file-00776.GB: 93% done, 592.417 kBytes/s, ETA: 1m50s
2017/08/13 21:10:49 INFO : * file-00776.GB: 93% done, 304.082 kBytes/s, ETA: 3m35s
2017/08/13 21:10:59 INFO : * file-00776.GB: 93% done, 156.082 kBytes/s, ETA: 6m59s
2017/08/13 21:11:09 INFO : * file-00776.GB: 93% done, 80.115 kBytes/s, ETA: 13m38s
2017/08/13 21:11:19 INFO : * file-00776.GB: 93% done, 41.122 kBytes/s, ETA: 26m33s
2017/08/13 21:11:29 INFO : * file-00776.GB: 94% done, 399.763 kBytes/s, ETA: 2m23s
2017/08/13 21:11:39 INFO : * file-00776.GB: 94% done, 205.194 kBytes/s, ETA: 4m39s
2017/08/13 21:11:49 INFO : * file-00776.GB: 94% done, 105.324 kBytes/s, ETA: 9m4s
2017/08/13 21:11:59 INFO : * file-00776.GB: 94% done, 54.062 kBytes/s, ETA: 17m40s
2017/08/13 21:12:09 INFO : * file-00776.GB: 94% done, 27.749 kBytes/s, ETA: 34m26s
2017/08/13 21:12:19 INFO : * file-00776.GB: 94% done, 14.243 kBytes/s, ETA: 1h7m6s
2017/08/13 21:12:29 INFO : * file-00776.GB: 94% done, 7.311 kBytes/s, ETA: 2h10m43s
2017/08/13 21:12:39 INFO : * file-00776.GB: 94% done, 3.752 kBytes/s, ETA: 4h14m41s
2017/08/13 21:12:49 INFO : * file-00776.GB: 94% done, 1.926 kBytes/s, ETA: 8h16m10s
2017/08/13 21:12:59 INFO : * file-00776.GB: 95% done, 290.988 kBytes/s, ETA: 2m48s
2017/08/13 21:13:09 INFO : * file-00776.GB: 95% done, 149.361 kBytes/s, ETA: 5m29s
2017/08/13 21:13:19 INFO : * file-00776.GB: 95% done, 76.665 kBytes/s, ETA: 10m41s
2017/08/13 21:13:29 INFO : * file-00776.GB: 96% done, 472.040 kBytes/s, ETA: 1m26s
2017/08/13 21:13:39 INFO : * file-00776.GB: 97% done, 1.003 MBytes/s, ETA: 23s
2017/08/13 21:13:49 INFO : * file-00776.GB: 99% done, 1.283 MBytes/s, ETA: 6s
2017/08/13 21:13:59 INFO : * file-00776.GB: 99% done, 984.411 kBytes/s, ETA: 0s
2017/08/13 21:14:09 INFO : * file-00776.GB: 99% done, 505.288 kBytes/s, ETA: 0s
2017/08/13 21:14:19 INFO : * file-00776.GB: 99% done, 259.359 kBytes/s, ETA: 0s
2017/08/13 21:14:29 INFO : * file-00776.GB: 99% done, 133.126 kBytes/s, ETA: 0s
2017/08/13 21:14:39 INFO : * file-00776.GB: 99% done, 68.332 kBytes/s, ETA: 0s
2017/08/13 21:14:49 INFO : * file-00776.GB: 99% done, 35.074 kBytes/s, ETA: 0s
2017/08/13 21:14:59 INFO : * file-00776.GB: 99% done, 18.003 kBytes/s, ETA: 0s
2017/08/13 21:15:09 INFO : * file-00776.GB: 99% done, 9.240 kBytes/s, ETA: 0s
2017/08/13 21:15:19 INFO : * file-00776.GB: 99% done, 4.743 kBytes/s, ETA: 0s
2017/08/13 21:15:29 INFO : * file-00776.GB: 99% done, 2.435 kBytes/s, ETA: 0s
2017/08/13 21:15:39 INFO : * file-00776.GB: 99% done, 1.249 kBytes/s, ETA: 0s
2017/08/13 21:15:49 INFO : * file-00776.GB: 99% done, 656 Bytes/s, ETA: 1s
2017/08/13 21:15:59 INFO : * file-00776.GB: 100% done, 378 Bytes/s, ETA: 0s
2017/08/13 21:16:09 INFO : * file-00776.GB: 100% done, 194 Bytes/s, ETA: 0s
Running into a similar situation on my end. I tried different parameters to alter the number of API calls, but no change: banned after about a TB/day. I have since set --bwlimit=10M and haven’t had my uploads stop yet. My 23TB data backup is going to take quite a while longer now, though.
Transferred: 1020.280 GBytes (9.996 MBytes/s)
Elapsed time: 29h2m1.4s
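Back-of-the-envelope math on that (my own figures, not anything official from Google): rclone’s “M” unit is MiB/s, so a fixed --bwlimit translates into a daily volume and an overall ETA like this:

```python
# Rough arithmetic for uploading at a fixed --bwlimit.
# All numbers are back-of-the-envelope, not official Google limits.
SECONDS_PER_DAY = 86_400

def gib_per_day(bwlimit_mib_per_s: float) -> float:
    """GiB uploaded per day at a constant rate in MiB/s (rclone's 'M' unit)."""
    return bwlimit_mib_per_s * SECONDS_PER_DAY / 1024

def days_for_tib(total_tib: float, bwlimit_mib_per_s: float) -> float:
    """Days needed to move total_tib TiB at the given constant rate."""
    return total_tib * 1024 / gib_per_day(bwlimit_mib_per_s)

print(gib_per_day(10))       # 843.75 GiB/day at --bwlimit=10M
print(days_for_tib(23, 10))  # ~27.9 days for a 23 TiB backup
```

That matches the stats above pretty well: ~29 hours at ~10 MBytes/s is about 1020 GiB.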
Does anyone have a G Suite Enterprise account? It would be interesting to know if these are limited as well.
I have a G Suite account and can confirm that it happens there too.
Google banned the gcollection.co domain. All accounts are disabled, I think.
Still no answer from Google?
It seems I am still banned, after ~2 days.
[nathan@server rclone]$ ./rclone copy /stuff/ gcrypt:/ --transfers=7 --stats 1s -vv --tpslimit 1 --bwlimit 9M
2017/08/14 12:56:12 INFO : Starting bandwidth limiter at 9MBytes/s
2017/08/14 12:56:12 INFO : Starting HTTP transaction limiter: max 1 transactions/s with burst 1
2017/08/14 12:56:12 DEBUG : rclone: Version "v1.37" starting with parameters ["./rclone" "copy" "/stuff" "gcrypt:/" "--transfers=7" "--stats" "1s" "-vv" "--tpslimit" "1" "--bwlimit" "9M"]
2017/08/14 12:56:12 DEBUG : gdrive: Saved new token in config file
2017/08/14 12:56:15 DEBUG : pacer: Rate limited, sleeping for 1.567401321s (1 consecutive low level retries)
2017/08/14 12:56:15 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2017/08/14 12:56:17 DEBUG : pacer: Resetting sleep to minimum 10ms on success
and then … pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
I can upload via the web API, but not via Rclone right now. I don’t do many TPS and I haven’t seen this before.
Actually, after fiddling around with it and waiting a bit longer, I was able to upload again. Chugging along at ~35MB/s. I usually don’t upload much, but I have some new backups I need to transfer.
Did you let it go on for 5 minutes? Some of those errors are normal.
I contacted them yesterday. The employee only told me stuff I already knew (the 1TB limit for < 5 users and the like). He then forwarded me to an “Admin Console Specialist”.
The specialist contacted me today and asked for a huge amount of information (about how I’m using the API and when the “error” appears). I’ll try to answer, but I’m not sure about everything, as I’m not familiar with the Drive API and just use rclone for my stuff.
It would maybe be helpful if someone with more knowledge could contact them and answer the questions. Then maybe all this would become a bit clearer.
- First and foremost, if you’re attempting to follow some documentation, please include it as a link or attachment.
- What is your OAuth 2.0 project’s client ID?
- Any relevant screenshots of the results you’re receiving when using the API you’ve configured.
- Have you verified that the project isn’t exceeding either the daily or per-user quota, by following the steps at https://support.google.com/cloud/answer/6158858? Provide a full browser screenshot of the Usage (for the last day) and Quotas tabs.
- Does the connection fail on the first attempt to connect, or does it fail during the communication process?
- Have you implemented an exponential backoff approach, as explained at https://developers.google.com/admin-sdk/directory/v1/limits#backoff, when you receive an error?
- What programming language and version of the client library are you using? For example: GData or API Client libraries.
- When did you encounter the issue? (exact date & time).
- What API scopes are you using?
- The actual URL that was posted to our server in this call.
- What is the HTTP Request you’re executing?
- Does the issue affect all accounts or just a specific one?
- If a limited number of accounts, specify the account email address(es), and one unaffected user.
- Relevant code extract.
- Output from the request.
- Provide HTTP Request and Response headers or full HTTP logs.
- For GDATA APIS, see http://code.google.com/apis/gdata/articles/debugging_client_libs.html for details on capturing HTTP traffic.
- For all other APIs, see the relevant code library for details on capturing HTTP traffic.
- Please use the following steps to see if you can duplicate the issue via the OAuth Playground. Any entry contained within quotes should be entered without the quotes:
- Navigate to https://developers.google.com/oauthplayground/.
- If listed, select the relevant API scope. Otherwise, enter the scope in ‘Input your own scopes.’
- Click ‘Authorize APIs.’
- You’ll be prompted to permit API access to your domain, so click ‘Allow.’
- You’ll see the response ‘HTTP/1.1 302 Found.’ Click ‘Exchange authorization code for tokens.’
- You should see ‘HTTP/1.1 200 OK.’
- Enter the request URI in ‘Request URI.’
- Append any relevant parameters to the ‘Request URI.’
- Click ‘Send the request.’
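On their exponential-backoff question: the pattern Google's docs describe is to retry on 403 rate-limit errors with a doubling wait plus random jitter. A minimal generic sketch (this is not rclone’s actual pacer code; `do_request` is a hypothetical callable standing in for one API call):

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for a 403 rateLimitExceeded response."""

def call_with_backoff(do_request, max_retries=10, base_delay=1.0):
    """Retry do_request() with exponential backoff plus jitter.

    do_request is a hypothetical zero-argument callable that raises
    RateLimitError when the server answers 403 rateLimitExceeded.
    """
    for attempt in range(max_retries):
        try:
            return do_request()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after max_retries attempts
            # Wait 2^attempt time units, plus up to one unit of random jitter.
            time.sleep((2 ** attempt + random.random()) * base_delay)
```

rclone’s “pacer: Rate limited, sleeping …” lines in the logs above are doing essentially this, which is why a handful of those retries is normal.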
Is this what they want? Seems like a lot of work just for a simple error
Why bother? It seems Google is enforcing a 750GB-per-day limit. Not much to be done, and support either isn’t aware of it or is instructed not to talk about it.
It is clearly not an API quota.
750? Where the hell are you getting that number? There’s no concrete evidence that’s even the right number as indicated in this thread.
Yes, same questions for me.
If they have that limit, I would like to know about it, told by Google itself. I can’t imagine a reason for not telling us. I’m paying for this service. Limits are OK, especially given what they offer for just $10, but I want to know what I get for my money. And since it seems they silently enforced this limit, for me that’s a no-go. If they had told us beforehand, or told us anything at all, I would be OK with it, but enforcing it and then keeping everything secret is, in my opinion, not right.
Right. For me it was also around 750GB, but never exactly that value, and this figure is still something we got just by trying it out.
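Assuming the ~750 GB/day people here measured really is the ceiling (decimal GB, as the admin console reports it), the largest whole-MiB/s --bwlimit that stays under it works out to about 8M. My arithmetic, nothing official:

```python
# If the daily ceiling is ~750 GB (decimal), find the largest whole-MiB/s
# --bwlimit that stays under it. These are community-measured numbers,
# not an official Google limit.
DAILY_LIMIT_BYTES = 750 * 10**9
SECONDS_PER_DAY = 86_400
MIB = 2**20

max_rate_mib = DAILY_LIMIT_BYTES / SECONDS_PER_DAY / MIB
print(round(max_rate_mib, 2))  # ~8.28 MiB/s, so --bwlimit 8M keeps a margin

daily_at_8m_gb = 8 * MIB * SECONDS_PER_DAY / 10**9
print(round(daily_at_8m_gb, 1))  # ~724.8 GB/day at --bwlimit 8M
```

So --bwlimit 8M would leave roughly 25 GB of headroom per day, while 10M (as reported above) slightly exceeds the suspected limit.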