Sending data TO Gdrive: 403: Rate Limit Exceeded

If you still need to transfer more than 750GB/24h to a single account, there might still be one possible strategy left for G Suite users.

The target account needs to be an administrator account, so it has the “Transfer ownership” option.

Now, if you need to upload 75TB/24h, you need 100 temporary “transfer accounts”.

Each account will still only be able to upload 750GB/24h - but when the transfers are done, the ownership of the uploaded files can be transferred to the administrator account and the temporary accounts can be deleted again.
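For illustration, here is a minimal sketch of the upload fan-out from the rclone side. This assumes remotes named xfer1 through xfer100 have already been configured (one per temporary account) and the source data has been pre-split into matching shard directories; all of these names are made up, and the ownership transfer itself still happens in the Admin console afterwards:

#!/bin/bash
# Hypothetical fan-out: each temporary account uploads its own shard,
# so no single account exceeds the 750GB/24h quota.
for i in $(seq 1 100); do
  rclone copy "/data/shard${i}" "xfer${i}:backup/shard${i}" \
    --bwlimit 8M &   # ~8 MiB/s ≈ 725GB/day, just under the per-account cap
done
wait   # block until every per-account transfer has finished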


https://support.google.com/a/answer/1247799?hl=en

Transfer ownership
Sign in to your Google Admin console.
Sign in using an administrator account.

From the Admin console dashboard, go to Apps > G Suite > Drive and Docs.

Click Transfer ownership.
Note: You need the Drive service privilege and the Data Transfer privileges to see this option.

In the From field, enter the current owner’s username and select their domain.

In the To field, enter the new owner’s username and select their domain.

Click Transfer Files.
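For what it’s worth, the same transfer can apparently also be triggered programmatically through the Admin SDK Data Transfer API instead of clicking through the console. A hedged curl sketch only: $TOKEN (an OAuth2 access token with the datatransfer scope), the numeric user IDs (these come from the Directory API, not the email addresses), and the Drive application ID (look it up via the applications endpoint first) are all placeholders:

# List the transferable applications to find the numeric ID of "Drive and Docs":
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://www.googleapis.com/admin/datatransfer/v1/applications?customerId=my_customer"

# Request the ownership transfer (all IDs below are placeholders):
curl -s -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "oldOwnerUserId": "OLD_USER_NUMERIC_ID",
        "newOwnerUserId": "NEW_USER_NUMERIC_ID",
        "applicationDataTransfers": [
          {
            "applicationId": DRIVE_APPLICATION_ID,
            "applicationTransferParams": [
              { "key": "PRIVACY_LEVEL", "value": ["PRIVATE", "SHARED"] }
            ]
          }
        ]
      }' \
  "https://www.googleapis.com/admin/datatransfer/v1/transfers"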

Yeah, but it just doesn’t work. I’ve tried this multiple times - 95% of the time I get “transfer failed” back.

Oh, sounds like I might have just been lucky :slight_smile: - have you filed a bug report about this yet?

I’m trying to copy data from one G Suite Edu account to another on the same domain. The reason being, I need a space to store 400TB worth of data (yes, a datacenter backup) and apply some scripts to rename files, which I cannot do with the original data. A single file is easily 160GB+. I’ve tried sharing a folder and then “Add to My Drive”, but this doesn’t allow me to make changes to the files without affecting the originals. We do have local copies of the files, but we can’t do anything destructive to them.

So I’m using a shared folder on the second account and running rclone copy between the two folders; the log shows server-side copies for the files.

DATE=$(date +%Y-%m-%d_%H_%M)   # timestamp used in the log file name
rclone copy --fast-list --size-only -vv --transfers 2 --checkers 2 --drive-chunk-size=16384k --drive-upload-cutoff=16384k --tpslimit 5 --timeout 30s --stats 1m --log-file /home//backup-scripts/logs/rclone_copy_${DATE}.log gdrive:folder1/ gdrive:folder2/ & disown -a

I am hitting the 403 errors but it still keeps transferring, though at some point rclone just quits and the API key becomes unauthorized, most likely a ban. There is no use trying to change API keys, since it just continues to return Error 403 due to the user ban.

2017/09/29 01:56:45 INFO  :
Transferred:      0 Bytes (0 Bytes/s)
Errors:                 0
Checks:                 9
Transferred:            0
Elapsed time:     9m33.3s
Transferring:
 * 901pcok84t35t8k2vtdjsg49ns/0dqd8vvu3plh3d9u3d42s0a0js

2017/09/29 01:56:48 DEBUG : pacer: Rate limited, sleeping for 16.230841597s (39 consecutive low level retries)
2017/09/29 01:56:48 DEBUG : pacer: low level retry 9/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)

I think I’ll just wait it out until we can do damage on our last-resort backup servers. It doesn’t help that our institution doesn’t have enough funds for cloud storage, the closest cloud storage provider can only commit to 100+TB, and the link between us is just 1Gbps.

Use this link and follow the rclone, rclone-union, and move script. We got it figured out. The bandwidth limit needed to stay within the 750GB/day quota is properly implemented there. Hope it helps you out.
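For reference, the arithmetic behind such a cap: 750GB spread over 86,400 seconds is roughly 8.7MB/s, so limiting rclone to 8M (8 MiB/s ≈ 725GB/day) keeps an always-on transfer just under the quota. A minimal sketch, with made-up paths and remote name:

# 750GB / 86400s ≈ 8.68MB/s, so --bwlimit 8M stays safely under the
# daily quota even if the transfer runs around the clock.
rclone move /local/staging gdrive:backup --bwlimit 8M --transfers 2 -v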

I’ve recently been able to do ~930GB before it gives up.

Why do you need to reboot the server?

Any news about that? Setting bwlimit is quite worthless, as the limit is still 750GB; it’s the same if you just burst all the data, get banned, and continue when unbanned.

I made a script to check when the upload limit is reset; on my account it’s daily at 6am, regardless of when I get banned.

22.10.2017 04:02:41 ERROR: gdrivecrypt upload locked
22.10.2017 06:00:49 OK: gdrivecrypt upload unlocked
24.10.2017 01:31:45 ERROR: gdrivecrypt upload locked
24.10.2017 06:01:06 OK: gdrivecrypt upload unlocked
25.10.2017 04:31:50 ERROR: gdrivecrypt upload locked
25.10.2017 06:00:41 OK: gdrivecrypt upload unlocked
25.10.2017 15:02:00 ERROR: gdrivecrypt upload locked
26.10.2017 06:01:00 OK: gdrivecrypt upload unlocked

So at the moment my solution is to disable the upload and sync scripts while the drive is locked.
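In case it helps anyone, here is a minimal sketch of such a lock check, assuming a remote named gdrivecrypt: as in the log above; the probe file name and log path are made up:

#!/bin/bash
# Probe whether the 750GB/24h upload quota is currently exhausted by
# attempting a tiny upload and checking the output for a 403 error.
LOG=/var/log/gdrive_quota.log
echo probe > /tmp/quota_probe.txt
if rclone copyto /tmp/quota_probe.txt gdrivecrypt:quota_probe.txt \
     --retries 1 --low-level-retries 2 2>&1 | grep -q "403"; then
  echo "$(date '+%d.%m.%Y %H:%M:%S') ERROR: gdrivecrypt upload locked" >> "$LOG"
else
  echo "$(date '+%d.%m.%Y %H:%M:%S') OK: gdrivecrypt upload unlocked" >> "$LOG"
  rclone deletefile gdrivecrypt:quota_probe.txt   # needs a newer rclone; older versions can use "rclone delete"
fi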

I really hope Google changes this limit, since 750GB is quite shitty for any serious usage.
I can’t even imagine having to do a fresh sync of my current 100TB library to another drive.
The old limit was 10TB, i.e. 10 days for a full sync, and that was quite OK; now it would take 133 days.

P.S. I hear good things about Dropbox and its unlimited plans; what are your experiences with it?

Dropbox unlimited needs a minimum of 3 users at $20/mo each though :frowning:

Is this 750GB daily limit for the entire Google account or for each individual user account (e.g. if you have 5 users under your Google administration, does each user get a 750GB daily limit)?

I think it’s per API key. So if every user has different API credentials, then it’s 750GB/user.

It’s per user account.

After being greeted by the infamous “403 Rate Limit Exceeded” warning, I created my own Google API client_id following the suggestions in the docs.

Benchmark by syncing 35,500 files

  • Using the rclone.org Google API client_id: 2 min 53 sec
  • Using my own: 33 sec

About five times faster. Definitely worth creating your own Google API client_id.
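In case anyone wants to reproduce a benchmark like this, a minimal sketch: time a sync of an already up-to-date folder, once against a remote configured with the default client_id and once against one with your own (both remote names here are made up). Since nothing needs to be copied, the timing mostly measures the listing/stat requests, which is exactly what the client_id affects:

# Both syncs are no-ops on identical data; the runtime difference is
# almost entirely API request throughput.
time rclone sync /local/testdata gdrive_default:testdata
time rclone sync /local/testdata gdrive_own:testdata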

Let me understand this right: this only speeds up the API requests but doesn’t stop the 403 rate limit exceeded issue, right?

Update:
I have rclone 1.36, but I cannot add the client ID and client secret - or when I do add them (manually, or with rclone config), this happens:

2017/12/12 21:47:10 Failed to create file system for "gdrive:Plex/blabla": couldn't read info about Drive: Get https://www.googleapis.com/drive/v2/about?alt=json: oauth2: cannot fetch token: 401 Unauthorized
Response: {
"error" : "unauthorized_client"
}

Both, in fact. It speeds up the “sync” process (lots of requests to stat the files on the remote drive) and reduces the rate-limit issue by a large margin. For the “copy” operation it doesn’t increase the upload speed, but it reduces the number of pacer sleep-and-retry cycles.

As for your “unauthorized_client” error, check your credentials. I just followed the instructions from the docs and used rclone config to create a new drive. Just use your Google client ID and client secret; the token will be fetched automatically when the script asks for your authorization in your browser. Here are the options I used:

Storage> 9
Google Application Client Id - leave blank normally.
client_id> xxxxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.apps.googleusercontent.com
Google Application Client Secret - leave blank normally.
client_secret> xxxxxxxxxxxxxxxxxxxxxxxxx
Remote config
Make sure your Redirect URL is set to "urn:ietf:wg:oauth:2.0:oob" in your custom config.
Use auto config?
Say Y if not sure
Say N if you are working on a remote or headless machine or Y didn’t work
y) Yes
n) No
y/n> Y
If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code…
Got code
Configure this as a team drive?
y) Yes
n) No
y/n> N
[Google_test]
client_id = xxxxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.apps.googleusercontent.com
client_secret = xxxxxxxxxxxxxxxxxxxxxxxx
token = {"access_token":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","token_type":"Bearer","refresh_token":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","expiry":"2017-12-14T20:55:34.772606468+01:00"}
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d>
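If the config completes, the new remote can be sanity-checked with a cheap listing call before pointing any sync jobs at it:

# Lists the top-level directories of the new remote; a 401/403 here
# means the client_id/secret pair is wrong or not yet authorized.
rclone lsd Google_test: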

Wouldn’t this require you to add users at $10 each? Or does G Suite have a 30-day “grace period”?

You are charged per user per day.

So if you delete the user later the same day, you will pay for one day of service only ($10/30 ~ $0.33).

In other words, $0.33 per 750GB extra upload :slight_smile:


https://support.google.com/a/answer/1247362

If you add or remove users during any month, we prorate your payments. If you add a user on April 1 and delete them on April 15, we charge you for only half a month of service.

Awesome, it worked. +500GB uploaded and transferred to the main account.
Also, no changes to my billing subscription YET.
Google usually charges things instantly, so I hope there’s a policy that users added for less than a day won’t be charged.

I wonder if they’ll crack down on that. It seems Google could throw the ToS at people creating and deleting multiple accounts to get around per-user limits?

Ha ha - I think you may not be the only one who wonders about that, tdaniels :slight_smile: … also, the more obvious one: when might Google start enforcing the 1TB-per-user limit when you have fewer than 5 users in your G Suite?

Only time will tell, I guess … in the meantime, I prefer to only entrust the D with data I am prepared to lose at any moment :slight_smile: